Universität Bielefeld - Technische Fakultät - AG Wissensbasierte Systeme
Annual Overview 2005
Publications of the year 2005, including all available abstracts
Becker, C., Nakasone, A., Prendinger, H., Ishizuka, M. & Wachsmuth, I.
Physiologically interactive gaming with the 3D agent Max
International Workshop on Conversational Informatics, in conj. with JSAI-05
Kitakyushu, Japan (pp. 37-42), 2005.
Abstract:
Physiologically interactive (or affective) gaming refers to research on the evocation
and detection of emotion during game play [21]. In this paper, we first describe the two building
blocks of our approach to affective gaming. The building blocks correspond to two independently
conducted research strands on affective human-computer interaction: one on an emotion simulation
system for an expressive 3D humanoid agent called Max, which was designed at the University of
Bielefeld [13, 2]; the other one on a real-time system for empathic (agent) feedback that is based on
human emotional states derived from physiological information, and developed at the University
of Tokyo and the National Institute of Informatics [19]. Then, the integration of both systems is
motivated in the setting of a cards game called Skip-Bo that is played by a human game partner
and Max. Physiological user information is used to enable empathic feedback through non-verbal
behaviors of the humanoid agent Max. With regard to the new area of Conversational Informatics
we discuss the measurement of human physiological activity in game interactions and non-verbal
agent behavior.
Becker, C., Prendinger, H., Ishizuka, M. & Wachsmuth, I.
Empathy for Max (preliminary project report)
The 2005 International Conference on Active Media Technology (AMT-05)
Takamatsu, Kagawa, Japan (pp. 541-545), 2005.
Abstract:
This paper first describes two independently conducted
research strands on affective human-computer interaction:
one on an emotion simulation system for an expressive 3D
humanoid agent called Max, which was designed at the
University of Bielefeld [8, 2]; the other one on a real-time
system for empathic (agent) feedback that is based on human
emotional states derived from physiological information,
and developed at the University of Tokyo and the National
Institute of Informatics [15]. Then, the integration
of both systems is suggested for the purpose of realizing a
highly believable agent with empathic qualities.
Becker, C., Prendinger, H., Ishizuka, M. & Wachsmuth, I.
Evaluating affective feedback of the 3D agent Max in a competitive cards game
The First International Conference on Affective Computing and Intelligent Interaction (ACII-05),
Beijing, China (pp. 466-473). Berlin: Springer (LNCS 3784), 2005.
Abstract:
Within the field of Embodied Conversational Agents (ECAs),
the simulation of emotions has been suggested as a means to enhance
the believability of ECAs and also to effectively contribute to the goal
of more intuitive human-computer interfaces. Although various emotion
models have been proposed, results demonstrating the appropriateness
of displaying particular emotions within ECA applications are scarce or
even inconsistent. Worse, questionnaire methods often seem insufficient
to evaluate the impact of emotions expressed by ECAs on users. Therefore
we propose to analyze non-conscious physiological feedback (bio-signals)
of users within a clearly arranged dynamic interaction scenario
where various emotional reactions are likely to be evoked. In addition
to its diagnostic purpose, physiological user information is also analyzed
online to trigger empathic reactions of the ECA during game play, thus
increasing the level of social engagement. To evaluate the appropriateness
of different types of affective and empathic feedback, we implemented a
cards game called Skip-Bo, where the user plays against an expressive
3D humanoid agent called Max, which was designed at the University
of Bielefeld [6] and is based on the emotion simulation system of [2].
Work performed at the University of Tokyo and NII provided a real-time
system for empathic (agent) feedback that allows one to derive user
emotions from skin conductance and electromyography [13]. The findings
of our study indicate that within a competitive gaming scenario,
the absence of negative agent emotions is conceived as stress-inducing
and irritating, and that the integration of empathic feedback supports
the acceptance of Max as a co-equal humanoid opponent.
Heumer, G., Schilling, M. & Latoschik, M. E.
Automatic data exchange and synchronization for knowledge-based intelligent virtual environments
Proceedings of the IEEE VR2005,
Bonn, Germany, March 2005 (pp. 43-50).
Kopp, S.
The spatial specificity of iconic gestures.
In Klaus Opwis and Iris-Katharina Penner (eds.): Proceedings of KogWis05,
The German Cognitive Science Conference (pp. 112-117). Basel: Schwabe, 2005.
Abstract:
Humans use spontaneous gestures when communicating. But
what these gestures convey is still an open question and
several findings indicate that they fall short of communicating
semantic information. This paper presents a study in which
naive observers had to draw images of what they saw in
isolated iconic gestures. The detailed analyses of these
drawings showed that observers were able to reliably extract
visuospatial information from the gestures, with different
hand shapes, movements, or hand orientations being
differently salient and interpretable. In contrast to previous
findings, these results suggest that iconic gestures can reach a
level of specificity that makes them an expedient means of
conveying visuospatial information.
Kopp, S., Gesellensetter, L., Krämer, N., & Wachsmuth, I.
A conversational agent as museum guide -- design and evaluation of a real-world application
In Panayiotopoulos et al. (eds.): Intelligent Virtual Agents (pp. 329-343).
Berlin: Springer-Verlag (LNAI 3661), 2005.
Abstract:
This paper describes an application of the conversational agent Max
in a real-world setting. The agent is employed as guide in a public computer
museum, where he engages with visitors in natural face-to-face communication,
provides them with information about the museum or the exhibition, and conducts
natural small talk conversations. The design of the system is described
with a focus on how the conversational behavior is achieved. Logfiles from interactions
between Max and museum visitors were analyzed for the kinds of
dialogue people are willing to have with Max. Results indicate that Max engages
people in interactions where they are likely to use human-like communication
strategies, suggesting the attribution of sociality to the agent.
Kranstedt, A., & Wachsmuth, I.
Incremental Generation of Multimodal Deixis Referring to Objects
Proceedings of the 10th European Workshop on Natural Language Generation (ENLG 2005),
Aberdeen, UK, August 2005.
Abstract:
This paper describes an approach for the generation
of multimodal deixis to be uttered by an anthropomorphic
agent in virtual reality. The proposed
algorithm integrates pointing and definite description.
In doing so, the context-dependent discriminatory
power of the gesture determines the content selection
for the verbal constituent. The concept
of a pointing cone is used to model the region singled
out by a pointing gesture and to distinguish
two referential functions called object-pointing and
region-pointing.
Latoschik, M. E.
A user interface framework for multimodal VR interactions
Proceedings of the IEEE Seventh International Conference on Multimodal Interfaces (ICMI 2005),
Trento, Italy, October 2005 (pp. 76-83).
Latoschik, M. E., Biermann, P. & Wachsmuth, I.
Knowledge in the loop: Semantics representation for multimodal simulative environments
Proceedings of the 5th International Symposium on Smart Graphics 2005 (pp. 25-39).
Berlin: Springer (LNCS 3638), 2005.
Latoschik, M. E., Biermann, P. & Wachsmuth, I.
High-level semantics representation for intelligent simulative environments
Proceedings of the IEEE VR2005,
Bonn, Germany, March 2005 (pp. 283-284).
Leßmann, N. & Kopp, S.
Engagement in collaborative construction tasks with Max.
In AAMAS 2005 Workshop Proceedings: Creating Bonds with ECAs,
Utrecht, The Netherlands (pp. 76-83).
Abstract:
Max is a human-size conversational agent that employs synthetic speech, gesture, gaze, and facial display to act in cooperative construction tasks taking place in immersive virtual reality. In the mixed-initiative dialogs involved in our research scenario, turn-taking abilities and dialog competences play a crucial role for Max to appear as a convincing multimodal communication partner. How these abilities rely on Max's perception of the user and, in particular, how turn-taking signals are handled in the agent's cognitive architecture is the focus of this paper.
Sowa, T. & Wachsmuth, I.
A model for the representation and processing of shape in coverbal iconic gestures
In K. Opwis & I.-K. Penner (Eds.): Proc. of KogWis05 (pp. 183-188).
Basel: Schwabe Verlag, 2005.
Wachsmuth, I.
Studying situated communication with an embodied agent
Proc. of the Twenty-Seventh Annual Conference of the Cognitive Science Society,
Stresa, Italy, July 2005 (p. 44).
Wachsmuth, I.
Multimodale Interaktion in der Mensch-Maschine-Kommunikation
In L. Urbas & Ch. Steffens (Eds.):
Zustandserkennung und Systemgestaltung - 6. Berliner Werkstatt Mensch-Maschine-Systeme
(ZMMS Spektrum, Band 19, pp. 1-6). Düsseldorf: VDI, 2005.
Wachsmuth, I.
"Ich, Max"- Kommunikation mit künstlicher Intelligenz
In Ch. Herrmann, M. Pauen, J. Rieger & S. Schicktanz (Eds.):
Bewusstsein: Philosophie, Neurowissenschaften, Ethik (pp. 329-354).
München: Wilhelm Fink Verlag (UTB), 2005.
Wachsmuth, I.
Kommunikation und Körper (Embodied Communication)
In G. Graumann (Ed.): Beiträge zum Mathematikunterricht 2005 (pp. 41-47).
Hildesheim: Franzbecker Verlag, 2005.
Wachsmuth, I.
Computersimulation in der mathematikdidaktischen Grundlagenforschung
In Ch. Kaune, I. Schwank & J. Sjuts (Eds.):
Mathematikdidaktik im Wissenschaftsgefüge (Festschrift für Elmar Cohors-Fresenborg).
Osnabrück: Forschungsinstitut für Mathematikdidaktik (in press).
Wachsmuth, I. & Knoblich, G.
Embodied communication in humans and machines
AI Magazine 26(2): 85-86, 2005.
Wachsmuth, I. & Knoblich, G.
Embodied communication in humans and machines - a research agenda
Artificial Intelligence Review 24(3-4): 517-522, 2005.
Abstract:
The challenge to develop an integrated perspective of embodiment in communication
has been taken up by an international research group hosted by Bielefeld
University's Center for Interdisciplinary Research (ZiF) from October 2005 through
September 2006. An international conference was held there on 12-15 January 2005
to define a research agenda that will explicitly address Embodied Communication in
Humans and Machines.