PASION @ UNIBI
In January 2006, 17 institutions from 8 European countries started a four-year EU-funded project to explore novel forms of social group interaction. Their goal is to discover how current and future technical tools can be used to support communication between spatially distributed user groups. The PASION project (Psychologically Augmented Social Interaction Over Networks) emphasizes psychological indicators and models of the social structures created during such interactions. Particularly important is the question of how novel technical methods can extend the range of possible individual and group interactions beyond speech as a means of communication.
An existing example of such technology-enabled forms of interaction is the emoticon. These short character sequences act as icons that represent the individual affective state of a communication partner. Such interactions could only emerge thanks to the wide availability of text-based communication technologies, e.g., internet tools or mobile devices. Similar developments in the area of group interaction can be seen in online chat or conferencing systems, which, for example, indicate the virtual attendance or presence of entire groups using individual symbols on a communication partner's display.
Interactive Social Displays (ISDs)
The PASION project follows this general direction and develops novel forms of interaction for tomorrow's means of communication. At Bielefeld University's Artificial Intelligence and Virtual Reality Lab (AI & VR Lab), these forms of interaction are realized using intelligent virtual rooms. Here, the lab's scientists have developed so-called Interactive Social Displays (ISDs) as a first prototype. The ISD concept combines Artificial Intelligence and Virtual Reality techniques, enabling users to interact with each other while surrounded by realistic, three-dimensional, computer-generated scenes. Already available forms of audio/video communication thereby become tangible in three dimensions. Using a multimodal speech-and-gesture interface, users can map real-time user data to the ISDs and configure and position the ISDs in their environment (see figures 2 and 3). Compatibility with existing technologies is provided by mobile and desktop-based systems developed alongside the VR-based prototype. Figure 1 illustrates design alternatives explored during prototype development.
User data relevant for the derivation of social indicators is particularly important in PASION. If a communication partner agrees to transmit such data, e.g., captured by biosensors, AI technologies can analyze his/her emotional state or mood and map this information to the appropriate ISDs as an additional source of information about individual users or user groups. Such techniques would be useful in many situations, e.g., for conflict management and prevention or to enhance interaction quality. An example of the latter is a teaching or lecture situation, where a speaker could react instantly to feedback such as decreasing listener attention.
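The pipeline described above — biosensor readings, an automatic estimate of the wearer's state, and a label forwarded to a display — could be sketched roughly as follows. This is a hypothetical illustration only, not the project's actual method: the names `BiosensorSample` and `estimate_mood` and all threshold values are invented placeholders, and a real system would use validated psychophysiological models rather than fixed cutoffs.

```python
from dataclasses import dataclass


@dataclass
class BiosensorSample:
    """One reading from a (hypothetical) wearable biosensor."""
    heart_rate_bpm: float        # beats per minute
    skin_conductance_us: float   # electrodermal activity in microsiemens


def estimate_mood(sample: BiosensorSample) -> str:
    """Map raw readings to a coarse mood label.

    The thresholds below are illustrative placeholders, not
    empirically validated values.
    """
    aroused = sample.heart_rate_bpm > 90 or sample.skin_conductance_us > 10.0
    calm = sample.heart_rate_bpm < 70 and sample.skin_conductance_us < 5.0
    if aroused:
        return "aroused"
    if calm:
        return "calm"
    return "neutral"


# A display client could subscribe to such labels and update the
# corresponding ISD, e.g. by tinting its frame or annotating an avatar.
print(estimate_mood(BiosensorSample(heart_rate_bpm=95, skin_conductance_us=12.0)))
# prints: aroused
```

In practice the classification step would run continuously on a stream of samples, with the resulting labels pushed to whichever ISDs the user has authorized to show them, respecting the consent requirement noted above.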
A further promising possibility is the presentation and rendering of such information about communication partners using so-called avatars (see figure 1, left). Their anthropomorphic, humanoid appearance, and the resulting capability to directly display emotional states and behaviors, can be exploited in an intercultural context. In admittedly fictional scenarios, one goal is to use avatar capabilities for the intercultural translation of emotions and moods between members of different social cultures: an ideal scenario for an international project like PASION.
Project team @ UNIBI
Project leaders
Deputy project leader
Assistants
Student assistants
- Daniel Basa
- Joachim Dinkelmann
- Benjamin Dosch
- Ferdinand Eisenkeil
- Jan Hammerschmidt
- Conrad Lee
- Nikita Mattar
- Marc Paffen
- Juliane Reich
- Dennis Wiebusch
Bibliography
- Pfeiffer, T., Latoschik, M. E. & Wachsmuth, I. (2009). Evaluation of Binocular Eye Trackers and Algorithms for 3D Gaze Interaction in Virtual Reality Environments. Journal of Virtual Reality and Broadcasting, 5(16), December.
- Pfeiffer, T., Latoschik, M. E. & Wachsmuth, I. (2008). Conversational Pointing Gestures for Virtual Reality Interaction: Implications from an Empirical Study. In Proceedings of the IEEE Virtual Reality 2008, 281-282, March 8-12. Reno, Nevada.
- Pfeiffer, T. & Wachsmuth, I. (2008). Social Presence: The Role of Interpersonal Distances in Affective Computer-Mediated Communication. In Proceedings of the 11th International Workshop on Presence, 275-279, October. Padova, Italy: CLEUP Cooperativa Libraria Universitaria Padova.
- Pfeiffer, T., Donner, M., Latoschik, M. E. & Wachsmuth, I. (2007). Blickfixationstiefe in stereoskopischen VR-Umgebungen: Eine vergleichende Studie [Gaze fixation depth in stereoscopic VR environments: A comparative study]. In M. E. Latoschik & B. Fröhlich (eds.): Vierter Workshop Virtuelle und Erweiterte Realität der GI-Fachgruppe VR/AR, 113-124. Shaker, Aachen. ISBN 978-3-8322-6367-6. Awarded 3rd place for best paper and presentation.
- Pfeiffer, T. & Latoschik, M. E. (2007). Interactive Social Displays. Poster presentation at IPT-EGVE 2007.
- Pfeiffer, T. & Latoschik, M. E. (2007). Interactive Social Displays. Poster presentation at 3D UI 2007.