- Florian Grond, Thomas Hermann, Vincent Verfaille, Marcelo Wanderley (2009).
Towards methods for effective ancillary gesture sonification of clarinetists.
In Stefan Kopp and Ipke Wachsmuth (Ed.), Proc. 8th Int. Gesture Workshop, Springer Verlag, Berlin, Heidelberg, (submitted)
- Gerold Baier, Thomas Hermann (2009).
Sonification: Listen to Brain Activity
In Haas, Roland and Brandes, Vera (Ed.), Music that works - Contributions of biology, neurophysiology, psychology, sociology, medicine and musicology, Springer
- Thomas Hermann (2009).
Sonifikation hochdimensionaler Daten - Funktionaler Klang zum Erkenntnisgewinn
In Georg Spehr (Ed.), Funktionale Klänge, transcript Verlag, p. 67-85, no. 2, Sound Studies
- Thomas Hermann (2009).
Sonification and Sonic Interaction Design for the Broadband Society.
In B. Sapio and L. Haddon and E. Mante-Meijer and L. Fortunati and T. Turk and E. Loos (Ed.), The good, the bad and the challenging: The user and the future of information and communication technologies: Conference proceedings, p. 887-892, COST 298, ABS-Center, d.o.o. Koper, Slovenia
Abstract: Imagine a huge dataset of a public census - or medical data - or the worldwide Internet traffic. What do you hear? Obviously, we are not very familiar with using our listening capabilities to investigate large amounts of information! The typical data analyst is indeed confronted with large visual displays in front of a computer that, as far as information value is concerned, remains rather silent. This is remarkable, since sound plays a highly important role in most real-world contexts, e.g. to monitor complex processes, to analyze complex systems, to selectively direct our attention, and to allow us to gain insight into systems beyond their surface. Sonification, the auditory display of information, makes arbitrary data accessible to our listening skills and addresses complementary modes of understanding that bring dynamic rather than static features to the fore and that connect well to interaction. Sonification can play an important role for the broadband society, e.g. to increase awareness of network behavior and our virtual neighborhood, and to let us feel connected without being bound to a visual display. The paper introduces, demonstrates and discusses the utility of sonification in sonic interaction design, ranging from monitoring and analysis tasks and interactive biofeedback to interfaces for the visually impaired, introduces the concept of Sonic Overloading, and furthermore relates sonification to expected trends in the broadband society.
- Angelika Dierker, Till Bovermann, Marc Hanheide, Thomas Hermann, Gerhard Sagerer (2009).
A Multimodal Augmented Reality System for Alignment Research.
Proceedings of the 13th International Conference on Human-Computer Interaction, Springer, New York, Heidelberg
Abstract: In this paper we present the Augmented Reality-based Interception Interface (ARbInI), a multimodal Augmented Reality (AR) system to investigate effects and structures of human-human interaction in collaborative tasks. It is introduced as a novel methodology to monitor, record, and simultaneously manipulate multimodal perception and to measure so-called alignment signals. The linguistic term 'alignment' here refers to automatic and unconscious processes during the communication of interactants. As a consequence of these processes, the structures of communication of the two interactants conform to each other (for example, they use the same terms, gestures, etc.). Alignment is a debated model for communication in the community [1], and here we strive to provide novel means to study it by instrumenting the interaction channels of human interactants [2]. AR allows for a very close coupling between the user and a technical system. ARbInI adds to this a mechanism that decouples two interacting users from the outside world and from each other via cameras and head-mounted displays (and microphones and headphones, respectively). This makes it possible to monitor and record the exact auditory and visual stimuli each user perceives, and it gives full control over the audiovisual input of the subjects: with the help of visual and auditory AR techniques, the stimuli can be manipulated, for instance by selectively changing the size, color, shape or level of detail of virtual objects that are augmented in the views of cooperating users. Using ARToolKit markers attached to physical objects, we make full use of augmented reality to realize physical interaction with virtual objects in our studies. In this paper, we also present VideoDB as a scenario. It focuses on the task of collaboratively organizing and arranging multimodal video snippets. We discuss its potential regarding the recording and investigation of alignment.
- Christian Mertes, Angelika Dierker, Thomas Hermann, Marc Hanheide, Gerhard Sagerer (2009).
Enhancing Human Cooperation with Multimodal Augmented Reality.
Proceedings of the 13th International Conference on Human-Computer Interaction, Springer, New York, Heidelberg
Abstract: Humans use an impressive variety of ways to communicate. However, technology has advanced to the point where it becomes interesting to think about complementing these natural communication skills with artificial ones. Augmented reality provides a good means to do this. In this work, we support two users in joint task situations by displaying to each user their partner's visual attention focus on video see-through head-mounted displays. This can be done by displaying the gaze direction directly or by highlighting the objects being viewed. These data can also be sonified (i.e. displayed via sound). We have implemented four distinct modes of data presentation that can be used either simultaneously or separately: (a) We highlight virtual objects by changing their color when they are in the partner's view. This uses a configurable temporal envelope to enable the display of a recent attention history. (b) We display the partner's field of view on the interaction surface. Using an optional surrounding color gradient, the user's gaze direction can be intuitively guided towards his partner's gaze, just as if there were spotlights emanating from his partner's eyes. (c) Furthermore, we implemented an event-based sonification of objects leaving and entering the partner's view, and (d) we implemented a continuous sonification, mapping the horizontal position of the center of focus, its height, its proximity to one's own center of focus and the speed of the partner's visual movement to different parameters of real-time synthesized sound. We investigated modes (a), highlighting, and (c), event sonification, using an object-choice task. We found that 94% of the subjects rated the visualization as helpful, while the simple sonification data presentation was not perceived to be helpful. A full account of the approaches and the results of the preliminary study will be given in the paper.
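The continuous mode (d) is essentially a parameter mapping from gaze features to synthesis parameters. The following is a loose sketch of such a mapping, not the paper's implementation; all names and ranges below are invented for illustration:

```python
def gaze_to_sound(x, y, proximity, speed):
    """Map four gaze features of the partner to parameters of a
    synthesized tone, in the spirit of continuous mode (d).
    All parameter ranges here are illustrative assumptions."""
    pan = 2.0 * x - 1.0                      # horizontal position -> stereo pan
    freq = 220.0 * 2.0 ** (2.0 * y)          # height -> pitch over two octaves
    amp = max(0.0, 1.0 - proximity)          # closer to own focus -> louder
    mod_rate = 1.0 + 20.0 * min(speed, 1.0)  # movement speed -> tremolo rate
    return {"pan": pan, "freq": freq, "amp": amp, "mod_rate": mod_rate}

# e.g. partner looks slightly right of center, mid-height, near own focus
print(gaze_to_sound(x=0.7, y=0.5, proximity=0.2, speed=0.3))
```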
- Angelika Dierker, Christian Mertes, Thomas Hermann, Marc Hanheide, Gerhard Sagerer (2009).
Mediated Attention with Multimodal Augmented Reality.
Proceedings of the International Conference on Multimodal Interfaces (ICMI), Workshop on Machine Learning for Multi-modal Interaction, Cambridge, Massachusetts, USA, (submitted)
- Tobias Grosshauser, Thomas Hermann (2009).
The Sonified Music Stand - An Interactive Sonification System for Musicians.
In Fabien Gouyon, Álvaro Barbosa, Xavier Serra (Ed.), SMC 2009 - Proceedings of the 6th Sound and Music Computing Conference, p. 233-238, Porto, Portugal
Abstract: This paper presents the sonified music stand, a novel interface that provides real-time auditory feedback for professional musicians by means of interactive sonification. Sonifications convey information using non-speech sound and are a promising means for musicians since they (a) leave the visual sense unoccupied, (b) address the sense of hearing, which is already in use and thereby further trained, and (c) allow feedback information to be related in the same acoustic medium as the musical output, so that dependencies between action and reaction can be better understood. This paper presents a prototype system together with demonstrations of applications that support violinists during musical instrument learning. For that, a pair of portable active loudspeakers has been designed for the music stand, and a small motion sensor box has been developed to be attached to the bow, hand or wrist. The data are sonified in real time according to different training objectives. We sketch several sonification ideas with sound examples and give a qualitative description of using the system.
- Tobias Grosshauser, Thomas Hermann (2009).
Augmented Haptics - An Interactive Feedback System for Musicians.
In Altinsoy, E. and Jekosch, U. and Brewster, S. (Ed.), Haptic and Audio Interaction Design, Fourth International Workshop, HAID 2009, Dresden, Germany, September 11-12, 2009, Proceedings, Dresden, Germany
Abstract: This paper presents integrated tactiles (or vibrotactiles), a novel interface for movement and posture tuition that provides real-time feedback in tactile form by means of interactive haptic feedback. Tactile feedback conveys information in a non-auditory, non-visual form and is a promising means for movements in 3D space. In this paper we demonstrate haptic augmentation in applications for musicians, since it (a) doesn't affect the visual sense, which is occupied by reading music and communication, (b) doesn't disturb in sound-sensitive situations like concerts, and (c) allows feedback information to be related in the same tactile medium as the output of the musical instrument, so that an important feedback channel for musical instrument playing is extended and supportively trained. Moreover, instructions from the teacher and the computer can be transmitted directly and unobtrusively in this channel. This paper presents a prototype system together with demonstrations of applications that support violinists during musical instrument learning.
- Jan Anlauff, Thomas Hermann, Tobias Grosshauser, Jeremy Cooperstock (2009).
Modular tacTiles for Sonic Interactions with Smart Environments.
Haptic and Audio Interaction Design, Fourth International Workshop, HAID 2009, Dresden, Germany, September 11-12, 2009, Proceedings, Dresden, Germany, (submitted)
- S. Camille Peres, Virginia Best, Derek Brock, Christopher Frauenberger, Thomas Hermann, John G. Neuhoff, Louise Valgerður Nickerson, Barbara Shinn-Cunningham, Tony Stockman (2008).
Auditory Interfaces
ch. 5, Morgan Kaufmann
- Thomas Hermann (2008).
Auditory Interfaces: Technology of the Interface
ch. 5.2, Morgan Kaufmann
- Gerold Baier, Thomas Hermann, Ulrich Stephani (2008).
Sonification of Complex Biomedical Data
Journal of Biological Physics, (submitted, to appear 2008, Special Issue "Complexity in Neurology and Psychiatry")
- Gerold Baier, Thomas Hermann (2008).
Temporal Perspective from Auditory Perception
In Vrobel, Susie and Rössler, Otto E. and Marks-Tarlow, Terry (Ed.), Simultaneity: Temporal Structures and Observer Perspectives, World Scientific, p. 348--363
Abstract: Dynamically complex diseases with distributed and multi-scale interacting physiological rhythms require a more refined temporal perspective of the scientific observer than is currently provided by visual displays of physiological data. We argue that sonification, the auditory inspection of experimental data, provides a unique approach to the representation of the temporal aspects of the data as it addresses the human sense of listening. The ear's capacity to detect temporal patterns of sameness and differences, of coincidence and coordination - widely exploited in listening to music and spoken language - creates a new temporal perspective in the scientific observer. We briefly describe some examples of sonifications of biomedical data and discuss their value in recovering the temporality of complex physiological processes. Auditory Gestalt formation can be exploited for the classification and differentiation of diseases. Finally, we stress the complementarity of auditory and visual representations and argue for combined audio-visual displays in order to adequately deal with complex phenomena, as in the case of dynamical diseases.
- Thomas Hermann, John Williamson, Roderick Murray-Smith, Yon Visell, Eoin Brazil (2008).
Sonification for Sonic Interaction Design.
In Rocchesso, Davide (Ed.), Proc. of the CHI 2008 Workshop on Sonic Interaction Design (SID), CHI, Florence
Abstract: This paper advocates a closer connection between the emerging field of sonic interaction design and that of sonification. We firstly discuss the issue of information conveyance by sound in everyday interactions, including HCI and product interaction design. Existing sonification techniques are examined, to identify principles for displaying information by sound during interaction, focusing particularly on Model-Based Sonification. We present two implementations: the Data Solids Sonification Model for exploratory data analysis, and the Shoogle system for mobile phone interactions. Both exemplify aspects of sonic interactions that connect well to the users' intuitions. Finally, the Sonic Interaction Atlas is introduced, a prototype community application that allows for the archival and organization of information in existing sonic interaction design cases, and for the generation of new scenarios during early-stage design research by aiding exploration of the suitability of different sonic interaction models. It is hoped that the Atlas may be useful for revealing the possibilities of physically-based sonic interaction methods that may connect well to users' intuition and innate capacities.
- Till Bovermann, Risto Koiva, Thomas Hermann, Helge Ritter (2008).
TUImod: Modular Objects for Tangible User Interfaces.
Proceedings of the 2008 Conference on Pervasive Computing, Sydney, Australia
Abstract: This paper describes the design and construction of TUImod, a modular system of basic elements produced by rapid-prototyping techniques that can be combined in various ways into human-distinguishable and computer-trackable physical objects with specific physical properties. The system is used in our tangible desk environment for data exploration applications.
- Thomas Hermann, Gerold Baier, Ulrich Stephani, Helge Ritter (2008).
Kernel Regression Mapping for Vocal EEG Sonification.
In Katz, Brian (Ed.), Proc. Int. Conf. Auditory Display (ICAD 2008), ICAD, Paris, France
Abstract: This paper introduces kernel regression mapping sonification (KRMS) for optimized mappings between data features and the parameter space of Parameter Mapping Sonification. Kernel regression makes it possible to map data spaces to high-dimensional parameter spaces such that specific locations in data space with pre-determined extent are represented by selected acoustic parameter vectors. Thereby, specifically chosen correlated settings of parameters may be selected to create perceptual fingerprints, such as a particular timbre or vowel. With KRMS, the perceptual fingerprints become clearly audible and separable. Furthermore, kernel regression defines meaningful interpolations for any point in between. We present and discuss the basic approach, exemplified by our previously introduced vocal EEG sonification, report new sonifications, and generalize the approach towards automatic parameter mapping generators using unsupervised learning approaches.
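As a rough sketch of the underlying idea, not the paper's actual implementation, a Nadaraya-Watson-style kernel regression from data space to acoustic parameter space could look as follows; the anchors, formant values and kernel width are invented for illustration:

```python
import numpy as np

def kernel_regression_map(x, anchors, params, sigma=1.0):
    """Map a data point x to an acoustic parameter vector.

    anchors : (n, d) array of selected locations in data space
    params  : (n, m) array of parameter vectors (e.g. formant
              frequencies for vowel timbres) assigned to the anchors
    A Gaussian kernel interpolates smoothly between the anchors."""
    d2 = np.sum((anchors - x) ** 2, axis=1)   # squared distances to anchors
    w = np.exp(-d2 / (2.0 * sigma ** 2))      # Gaussian kernel weights
    w /= w.sum()                              # normalize
    return w @ params                         # weighted parameter blend

# Hypothetical example: two anchor regions mapped to /a/- and /i/-like
# formant pairs (F1, F2 in Hz); points in between get interpolated vowels.
anchors = np.array([[0.0, 0.0], [1.0, 1.0]])
params = np.array([[700.0, 1200.0], [300.0, 2300.0]])
print(kernel_regression_map(np.array([0.5, 0.5]), anchors, params))
```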
- Thomas Hermann (2008).
Taxonomy and Definitions for Sonification and Auditory Display.
In Katz, Brian (Ed.), Proc. 14th Int. Conf. Auditory Display (ICAD 2008), ICAD, Paris, France
Abstract: Sonification is still a young research field, and many terms such as sonification, auditory display, auralization and audification have been used without a precise definition. Recent developments such as the introduction of Model-Based Sonification, the establishment of interactive sonification and the increased interest in sonification from the arts have raised the issue of revisiting the definitions towards a clearer terminology. This paper introduces a new definition for sonification and auditory display that emphasizes necessary and sufficient conditions for organized sound to be called sonification. It furthermore suggests a taxonomy and discusses the relation between visualization and sonification. A hierarchy of closed-loop interactions is furthermore introduced. This paper aims at initiating vivid discussions towards establishing a deeper theory of sonification and auditory display.
- Till Bovermann, Christof Elbrechter, Thomas Hermann, Helge Ritter (2008).
AudioDB: Get in Touch with Sounds.
Proc. ICAD 2008, ICAD, Paris, France
Abstract: Digital audio in its various appearances is ubiquitous in our everyday life. Searching and sorting sounds collected in extensive databases, e.g. sampling libraries for music production or seismographic surveys, is difficult and often bound by the tight restrictions of the standard human-computer interface technique of keyboard and mouse. The common technique of tagging sounds and other media files also has the drawback that it needs descriptive words, which is a difficulty not to be underestimated for sounds. We therefore created AudioDB, an intuitive human-computer interface for interactively exploring sounds by representing them as physical artifacts (grains) on a tabletop surface. The system is capable of sonic sorting, grouping and selecting of sounds represented as physical artifacts, and can therefore serve as a basis for discussions on audio-related tasks in working teams. AudioDB, however, is not a special-purpose solution for problems in a dedicated field of work, but is rather designed as an easy-to-use multi-purpose tool for audio-based information. As a side effect, AudioDB can be used for grounding work on how humans handle digital information that is projected onto physical artifacts.
- Thomas Hermann (2008).
Daten hören - Sonifikation zur explorativen Datenanalyse
In Schulze, Holger (Ed.), Sound Studies: Traditionen - Methoden - Desiderate, transcript Verlag, p. 209-228, no. 1, Sound Studies
Abstract: Imagine a large dataset, for example the data of a census or stock market data. What do you hear?
This question is unusual. Evidently, accessing data by listening is not yet part of our everyday experience. This essay probes the reasons for this; motivates why using our sense of hearing to investigate complex data is eminently sensible; describes various methods for presenting data as sound; and shows how we can even interact with data to make them sound.
- Thomas Hermann, Risto Koiva (2008).
tacTiles for Ambient Intelligence and Interactive Sonification.
In Pirhonen, Antti and Brewster, Stephen A. (Ed.), Haptic and Audio Interaction Design, Third International Workshop, HAID 2008, Jyväskylä, Finland, September 15-16, 2008, Proceedings, p. 91-101, Springer, Berlin, Heidelberg
Abstract: In this paper we introduce tacTiles, a novel wireless, modular, tactile-sensitive surface element attached to a deformable textile, designed as a lay-on for surfaces such as chairs, sofas, the floor or other furniture. tacTiles can be used as an interface for human-computer interaction or for ambient information systems. We give a full account of the hardware and show applications that demonstrate real-time sonification for process monitoring and biofeedback. Finally, we sketch ideas for using tacTiles paired with sonification for interaction games.
- Louise Valgerður Nickerson, Thomas Hermann (2008).
Interactive Sonification of Grid-based Games.
Proceedings of the Audio Mostly Conference, p. 27--34, Piteå, Sweden
Abstract: This paper presents novel designs for the sonification (auditory representation) of data from grid-based games such as Connect Four, Sudoku and others, motivated by the search for effective auditory representations that are useful for visually impaired users and that also support overviews in cases where the visual sense is otherwise allocated. Grid-based games are ideal for developing sonification strategies since they provide an excellent test environment in which designs can be evaluated by measuring details of the interaction, learning, performance of the users, etc. We present in detail two new playable sonification-based audio games, and finally discuss how the approaches might generalise to grid-based interactive exploration at large, e.g. for spreadsheet data.
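To give a flavor of what such a design can look like, here is a minimal, hypothetical grid-scan sonification for a Connect Four-like board; the event timing, pitches and board below are invented, and the paper's playable games are considerably richer:

```python
def sonify_grid(grid, col_dur=0.4, cell_dur=0.05):
    """Scan the columns of a Connect Four-like board left to right,
    cells bottom to top; empty cells stay silent, and each player's
    pieces get a distinct pitch, so the board state forms a short
    rhythmic-melodic pattern. All values are illustrative."""
    pitches = {1: 440.0, 2: 660.0}            # player id -> frequency (Hz)
    events = []                               # (onset_s, freq_hz) list
    for c, column in enumerate(grid):
        for r, cell in enumerate(column):
            if cell in pitches:
                events.append((c * col_dur + r * cell_dur, pitches[cell]))
    return events

board = [[1, 2, 0, 0], [2, 0, 0, 0], [1, 1, 2, 0]]   # 3 columns, 4 rows
print(sonify_grid(board))
```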
- Thomas Hermann, Oliver Höner, Helge Ritter (2006).
AcouMotion - An Interactive Sonification System for Acoustic Motion Control.
In Gibet, Sylvie and Courty, Nicolas and Kamp, Jean-Francois (Ed.), Gesture in Human-Computer Interaction and Simulation: 6th International Gesture Workshop, GW 2005, Berder Island, France, May 18-20, 2005, Revised Selected Papers, p. 312--323, Springer, Berlin, Heidelberg
Abstract: This paper introduces AcouMotion, a new hardware/software system that combines human body motion, tangible interfaces and sonification into a closed-loop human-computer interface allowing non-visual motor control, with sonification (non-speech auditory display) as the major feedback channel. AcouMotion's main components are (i) a sensor device for measuring motion parameters, (ii) a computer simulation representing the dynamical evolution of a model world, and (iii) a sonification engine which generates an auditory representation of objects and any interactions in the model world. The intended applications of AcouMotion range from new kinds of sport games that can be played without visual displays, and therefore may be particularly interesting for people with visual impairment, to further applications in data mining, physiotherapy and cognitive research. The first application of AcouMotion presented in this paper is Blindminton, a sport game similar to Badminton which is particularly adapted to the abilities of people with visual impairment. We describe our current system and its state of development, and we present first sound examples for interactive sonification using an early prototype. Finally, we discuss some interesting research directions based on the fact that AcouMotion binds auditory stimuli to body motion and can thus represent a counterpart to the eye tracker, which exploits the binding of visual stimuli and eye movement in cognitive research.
- Thomas Hermann, Stella Paschalidou, Dirk Beckmann, Helge Ritter (2006).
Gestural Interactions for Multi-parameter Audio Control and Audification.
In Gibet, Sylvie and Courty, Nicolas and Kamp, Jean-Francois (Ed.), Gesture in Human-Computer Interaction and Simulation: 6th International Gesture Workshop, GW 2005, Berder Island, France, May 18-20, 2005, Revised Selected Papers, p. 335--338, Springer, Berlin, Heidelberg
Abstract: This paper presents an interactive multi-modal system for real-time multi-parametric gestural control of audio processing applications. We claim that this can ease the control of different tasks, and we present the following demonstrations: (1) a musical application, i.e. the multi-parametric control of digital audio effects, and (2) a scientific application, i.e. the interactive navigation of audifications. In the first application we discuss the use of PCA-based control axes and clustering to obtain dimensionality-reduced control variables. In the second application we show how the tightly closed human-computer loop actively supports the detection and discovery of features in the data under analysis.
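The dimensionality reduction in the first application can be sketched generically; this is a plain PCA illustration under assumed sensor dimensions, not the paper's actual processing chain:

```python
import numpy as np

def pca_control_axes(gestures, n_axes=2):
    """Derive low-dimensional control variables from recorded
    high-dimensional gesture data: project onto the leading principal
    components, so a few hand movements span the effect-parameter space."""
    mean = gestures.mean(axis=0)
    centered = gestures - mean
    # SVD of the centered data; rows of vt are the principal axes
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axes = vt[:n_axes]
    return lambda g: (g - mean) @ axes.T   # gesture frame -> control values

rng = np.random.default_rng(2)
recorded = rng.normal(size=(100, 12))      # e.g. 12 hypothetical sensor channels
to_controls = pca_control_axes(recorded)
print(to_controls(recorded[0]))            # 2 control values per frame
```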
- Oliver Höner, Thomas Hermann (2006).
BISp-Jahrbuch 2006
ch. Entwicklung und Evaluation eines sonifikationsbasierten Gerätes zur Leistungsdiagnostik und Trainingssteuerung für den Sehgeschädigten-Leistungssport, Bundesinstitut für Sportwissenschaft
Abstract: This interdisciplinary project uses the method of interactive sonification (Hermann & Hunt, 2005) to break new ground for competitive sports for the visually impaired, exploiting the technical possibilities of acoustic data presentation that are now available to develop a performance test, TAMP (Test for audio-motor performance), for goalball. The aim is not so much to support the athlete's motor imagery (in the sense of 'movement sonification', cf. Effenberg & Mechling, 1998), but rather to support their mental model of the game situation: the auditory presentation of environmental information opens up manifold prospective possibilities for interacting with the environment. The technical basis for the development of TAMP is an interactive system for sonification-based motion control (AcouMotion; Hermann, Höner & Ritter, 2006), which finds its first application in sports as a development component of a sonification-based performance test. After the development of TAMP, it was empirically validated in a second step with the players of the German national goalball teams.
- Matthias Kaper, Peter Meinicke, Horst M. Müller, Sabine Weiss, Holger Bekel, Thomas Hermann, Axel Saalbach, Helge Ritter (2006).
Neuroinformatic techniques in cognitive neuroscience of language
In Rickheit, Gert and Wachsmuth, Ipke (Ed.), Situated Communication, Mouton de Gruyter, p. 265--286, vol. 166, Trends in Linguistics. Studies and Monographs [TiLSM]
Abstract: Processes of language comprehension can successfully be investigated by non-invasive electrophysiological techniques such as electroencephalography (EEG). This article presents innovative applications of neuroinformatic techniques to EEG data analysis in the context of the cognitive neuroscience of language, to gain deeper insights into the processes of the human brain. A variety of techniques, including principal component analysis (PCA), independent component analysis (ICA), coherence analysis, self-organizing maps (SOM), and sonification, were employed to overcome the restrictions of traditional EEG data analysis, which yields only comparatively rough ideas about brain processes. Our findings, for example, allow us to gain insights into the variability within EEG data sets, to perform single-trial classification with high accuracy, and to investigate communication processes between cell assemblies during language processing.
- Thomas Hermann, Gerold Baier, Ulrich Stephani, Helge Ritter (2006).
Vocal Sonification of Pathologic EEG Features.
In Stockman, Tony (Ed.), Proceedings of the International Conference on Auditory Display (ICAD 2006), p. 158--163, International Community for Auditory Display (ICAD), Department of Computer Science, Queen Mary, University of London, London, UK
Abstract: We introduce a novel approach to EEG data sonification for process monitoring and for exploratory as well as comparative data analysis. The approach uses an excitatory/articulatory speech model and a specifically selected parameter mapping to obtain auditory gestalts (or auditory objects) that correspond to features in the multivariate signals. The sonification is adaptable to patient-specific data patterns, so that only characteristic deviations from background behavior (pathologic features) are involved in the sonification rendering. Thus the approach combines data mining techniques and case-dependent sonification design to give an application-specific solution with high potential for clinical use. We explain the sonification technique in detail and present sound examples from clinical data sets.
- Gerold Baier, Thomas Hermann, Sven Sahle, Ulrich Stephani (2006).
Sonified Epileptic Rhythms.
In Stockman, Tony (Ed.), Proceedings of the International Conference on Auditory Display (ICAD 2006), p. 148--151, International Community for Auditory Display (ICAD), Department of Computer Science, Queen Mary, University of London, London, UK
Abstract: We describe techniques to sonify the rhythmic activity of epileptic seizures as measured by human EEG. Event-based mapping of parameters is found to be informative in terms of auto- and cross-correlations of the multivariate data. For the study, a group of patients with childhood absence seizures was selected. We find consistent intra-patient conservation of the rhythmic pattern as well as inter-patient variations, especially in terms of cross-correlations. The sound synthesis is suitable for online sonification, so the application of the proposed sonification in clinical monitoring is possible.
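A crude sketch of event-based sonification of multichannel data follows; the detection rule, pitch mapping and test signal are invented, and the paper's event extraction for seizure rhythms is more refined:

```python
import numpy as np

def event_sonification(eeg, fs, threshold=2.0):
    """Scan each channel for upward threshold crossings (a crude
    stand-in for spike-wave detection); each detected event becomes a
    percussive sound whose pitch identifies the channel. Timing is
    preserved, so cross-channel rhythms stay audible. The threshold
    and pitch mapping are illustrative assumptions."""
    events = []
    for ch, x in enumerate(eeg):
        z = (x - x.mean()) / x.std()            # per-channel z-score
        crossings = np.flatnonzero((z[:-1] < threshold) & (z[1:] >= threshold))
        for i in crossings:
            events.append((i / fs, 300.0 * (ch + 1)))   # (onset_s, freq_hz)
    return sorted(events)

rng = np.random.default_rng(4)
eeg = rng.normal(size=(3, 1000))                # 3 noise channels as stand-in
print(event_sonification(eeg, fs=250.0)[:3])
```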
- Till Bovermann, Thomas Hermann, Helge Ritter (2006).
Tangible Data Scanning Sonification Model.
In Stockman, Tony (Ed.), Proceedings of the International Conference on Auditory Display (ICAD 2006), p. 77--82, International Community for Auditory Display (ICAD), Department of Computer Science, Queen Mary, University of London, London, UK
Abstract: In this paper we develop a sonification model, following the Model-Based Sonification approach, that allows the user to scan high-dimensional data distributions by means of a physical object held in the hand. In the sonification model, the user is immersed in a 3D space of invisible but acoustically active objects which he or she can excite. Tangible computing makes it possible to identify the excitation object (e.g. a geometric surface) with a physical object used as controller, and thus creates a strong metaphor for understanding and relating feedback sounds in response to the user's own activity, position and orientation. We explain the technique and our current implementation in detail and give examples based on synthetic and real-world data sets.
- Matthias Milczynski, Thomas Hermann, Till Bovermann, Helge Ritter (2006).
A Malleable Device with Applications to Sonification-based Data Exploration.
In Stockman, Tony (Ed.), Proceedings of the International Conference on Auditory Display (ICAD 2006), p. 69--76, International Community for Auditory Display (ICAD), Department of Computer Science, Queen Mary, University of London, London, UK
Abstract: This article introduces a novel human-computer interaction device, developed in the scope of a Master's thesis. The device allows continuous localized interaction by providing a malleable interaction surface. Diverse multi-finger as well as multi-handed manipulations can be applied. Furthermore, the device acts as a tangible user interface object, integrated into a tangible computing framework called tDesk. Software to convert the malleable element's shape into an internal surface representation has been developed. Malleable interactions are applied to a new Model-Based Sonification approach for exploratory data analysis: high-dimensional data are acoustically explored via their informative interaction sound in response to the user's excitation.
- Till Bovermann, Thomas Hermann, Helge Ritter (2006).
A Tangible Environment for Ambient Data Representation.
In McGookin, David and Brewster, Stephen (Ed.), First International Workshop on Haptic and Audio Interaction Design, p. 26--30, www.multivis.org, Glasgow, UK
Abstract: In this paper we develop an ambient information environment called AmbiD that allows the user to specify intuitively - by moving tangible objects on our tangible desk environment - which data sources shall be connected to which ambient information displays, and how important the information is to the user. We explain the technique used and our current implementation in detail and give examples of possible data sources, displays and their interconnection.
- Gerold Baier, Thomas Hermann, Ulrich Stephani (2006).
Multivariate Sonification of Epileptic Rhythms for Real-Time Applications.
AES
Abstract: Auditory displays present a new platform to represent complex data sets. They provide efficient information about data features, for example, when monitoring or interacting with multivariate time series. The human auditory sense seems to be particularly optimized for the detection and interpretation of multiple rhythmic events in real time, which may be of practical importance in the context of the epileptic EEG.
- Oliver Höner, Thomas Hermann, Christian Grunow (2005).
Sonifikation -- Ein Hilfsmittel zur Taktikanalyse im Sportspiel?
In Gabler, H. and Göhner, U. and Schiebl, F. (Ed.), Zur Vernetzung von Forschung und Lehre in Biomechanik, Sportmotorik und Trainingswissenschaft, Czwalina, p. 226--230, vol. 144
- Thomas Hermann, Andy Hunt (2005).
An Introduction to Interactive Sonification (Guest Editors' Introduction)
IEEE MultiMedia, vol. 12, no. 2, p. 20--24, IEEE Computer Society Press, Los Alamitos, CA, USA
Abstract: The research field of sonification, a subset of the topic of auditory display, has developed rapidly in recent decades. It brings together interests from the areas of data mining, exploratory data analysis, human-computer interfaces, and computer music. Sonification presents information by using sound (particularly nonspeech), so that the user of an auditory display obtains a deeper understanding of the data or processes under investigation by listening.
- Oliver Höner, Thomas Hermann (2005).
Selbststeuerung im Sportspiel mittels interaktiver Sonifikation
In Seelig, H. and Göhner, W. and Fuchs, R. (Ed.), Selbststeuerung im Sport, Czwalina, p. 55, vol. 144
- Till Bovermann, Thomas Hermann, Helge Ritter (2005).
The Local Heat Exploration Model for Interactive Sonification.
In Brazil, Eoin (Ed.), Proceedings of the International Conference on Auditory Display (ICAD 2005), p. 85--91, ICAD, International Community for Auditory Display, Limerick, Ireland
Abstract: This paper presents a new sonification model for the exploration of topographically ordered high-dimensional data (multi-parameter maps, volume data) where each data item consists of a position and a feature vector. The sonification model implements the common metaphor from thermodynamics that heat can be interpreted as stochastic motion of 'molecules'. The latter are determined by the data under examination and 'live' only in the feature space. Heat-induced interactions cause acoustic events that fuse into a granular sound texture which conveys meaningful information about the underlying distribution in feature space. As a second ingredient of the model, data selection is achieved by a separate navigation process in position space using a dynamic aura model, such that heat can be induced locally. Both a visual and an auditory display are driven by the underlying model. We exemplify the sonification by means of interaction examples for different high-dimensional distributions.
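As a loose reading of the heat metaphor, the following sketch jitters data 'molecules' in feature space and emits one grain per kick; it omits the aura-based selection and the molecule interactions of the full model, and all mappings are invented:

```python
import numpy as np

def local_heat_events(features, temperature, steps=200, seed=0):
    """Data items become 'molecules' whose feature vectors jitter
    stochastically with an amplitude set by the induced temperature.
    Each step emits one grain per molecule; here the grain frequency
    encodes the distance moved, so hotter states yield a brighter,
    more agitated texture. All values are illustrative."""
    rng = np.random.default_rng(seed)
    pos = features.copy()
    events = []                                 # (onset_s, freq_hz) grain list
    for t in range(steps):
        kick = rng.normal(scale=temperature, size=pos.shape)
        pos += kick
        for amp in np.linalg.norm(kick, axis=1):
            events.append((t * 0.01, 200.0 + 4000.0 * amp))
    return events

feat = np.random.default_rng(3).random((5, 4))  # 5 selected items, 4-D features
print(local_heat_events(feat, temperature=0.05)[:3])
```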
- Gerold Baier, Thomas Hermann, Oscar Manuel Lara, Markus Müller (2005).
Using sonification to detect weak cross-correlations in coupled excitable systems.
In Brazil, Eoin (Ed.), Proceedings of the International Conference on Auditory Display (ICAD 2005), p. 312--315, ICAD, International Community for Auditory Display, Limerick, Ireland
Abstract: We study cross-correlations in irregularly spiking systems. A single system displays spiking sequences that resemble a stochastic (Poisson) process. Linear coupling between two systems leaves the inter-spike interval distribution qualitatively unchanged but induces cross-correlations between the units. For strong coupling this leads to synchronization as expected but for weak coupling, both a good statistic and sonification reveal the presence of ``motifs'', preferred short firing sequences which are due to the deterministic spiking mechanism. We argue that the use of sonification for time series analysis is superior in the case where intrinsic non-stationarity of an experiment cannot be ruled out.
- Gerold Baier, Thomas Hermann, Marcus Müller (2005).
Polyrhythmic Organization of Coupled Nonlinear Oscillators.
IV '05: Proceedings of the Ninth International Conference on Information Visualisation (IV'05), p. 5--10, IEEE Computer Society, Los Alamitos, CA, USA
Abstract: We study the rhythmic organization of coupled nonlinear oscillators. If oscillators with non-identical internal frequencies are coupled, they generate a great variety of periodic and chaotic rhythmic patterns. Sonification of these patterns suggests their characterization in terms of polyrhythms: each oscillatory unit subdivides "measures" of equal or varying length differently. For the case of two coupled oscillators, the organization of these polyrhythms is exemplified as a function of the internal frequency ratio and the coupling strength. Some sonification strategies are presented which aid the detection of complex rhythmic relationships between oscillators. The results may be of importance for the analysis of complex multivariate time series such as the human EEG.
- Oliver Höner, Thomas Hermann (2005).
`Listen to the ball!' - sonification-based sport games for people with visual impairment.
A.P.A.: a discipline, a profession, an attitude (Proceedings of the 15th International Symposium Adapted Physical Activity), IFAPA, Verona, Italy
- Oliver Höner, Thomas Hermann, Thomas Prokein (2005).
Entwicklung eines goalballspezifischen Leistungstests.
In Würth, S. and Panzer, S. and Krug, J. and Alfermann, D. (Ed.), Sport in Europa, p. 331, Feldhaus, Hamburg, Germany, (Development of a goalball-specific performance test)
- Thomas Hermann, Helge Ritter (2005).
Model-based sonification revisited---authors' comments on Hermann and Ritter, ICAD 2002
ACM Trans. Applied Perception, vol. 2, no. 4, p. 559--563, ACM Press, New York, NY, USA
Abstract: We discuss the framework of Model-Based Sonification (MBS) and its contribution to a principled design of mediators between high-dimensional data spaces and perceptual spaces, particularly sound spaces. Data Crystallization Sonification, discussed in the reprinted paper, exemplifies the design of sonification models according to this framework. Finally, promising lines of development in this area are pointed out, concerning generalizations, applications, and open research directions.
- Thomas Hermann, Helge Ritter (2005).
Crystallization sonification of high-dimensional datasets
ACM Trans. Applied Perception, vol. 2, no. 4, p. 550--558, ACM Press, New York, NY, USA
Abstract: This paper introduces Crystallization Sonification, a sonification model for exploratory analysis of high-dimensional datasets. The model is designed to provide information about the intrinsic data dimensionality (which is a local feature) and the global data dimensionality, as well as the transitions between a local and a global view on a dataset. Furthermore, the sound conveys the clustering of high-dimensional datasets. The model defines a crystal growth process in the high-dimensional data space which starts at a user-selected ``condensation nucleus'' and incrementally includes neighboring data according to some growth criterion. The sound summarizes the temporal evolution of this crystal growth process. For introducing the model, a simple growth law is used. Other growth laws used in the context of hierarchical clustering are also suitable, and their application in Crystallization Sonification offers new ways to inspect the results of data clustering as an alternative to dendrogram plots. In this paper, the sonification model is described and example sonifications are presented for some synthetic high-dimensional datasets.
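The growth process itself is easy to sketch. The following illustrates a simple nearest-neighbour growth law on invented data; the paper's model additionally renders the temporal evolution as sound, which is omitted here:

```python
import numpy as np

def crystal_growth(data, nucleus_idx):
    """Grow a 'crystal' from a condensation nucleus by repeatedly
    absorbing the data point closest to the current crystal (a simple
    growth criterion). Returns the inclusion distances, whose temporal
    evolution is what a sonification would render: a jump in the curve
    marks the transition from one cluster to the next."""
    crystal = [nucleus_idx]
    free = set(range(len(data))) - {nucleus_idx}
    distances = []
    while free:
        best, best_d = None, np.inf
        for i in free:                         # nearest free point to crystal
            d = min(np.linalg.norm(data[i] - data[j]) for j in crystal)
            if d < best_d:
                best, best_d = i, d
        crystal.append(best)
        free.remove(best)
        distances.append(best_d)
    return np.array(distances)

# Hypothetical two-cluster dataset: the inter-cluster jump becomes audible.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.1, (20, 5)), rng.normal(3, 0.1, (20, 5))])
print(crystal_growth(data, 0)[:5])
```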
- Thomas Hermann, Thomas Henning, Helge Ritter (2004).
Gesture Desk - An Integrated Multi-modal Gestural Workplace for Sonification.
In Camurri, Antonio and Volpe, Gualtiero (Ed.), Gesture-Based Communication in Human-Computer Interaction, 5th International Gesture Workshop, GW 2003, Genova, Italy, April 15-17, 2003, Selected Revised Papers, p. 369--379, Gesture Workshop, Springer, Berlin, Heidelberg
Abstract: This paper presents the gesture desk, a new platform for a human-computer interface at a regular computer workplace. It extends classical input devices like keyboard and mouse by arm and hand gestures, without the need for inconvenient accessories like data gloves or markers. A central element is a gesture box containing two infrared cameras and a color camera, positioned under a glass desk. Arm and hand motions are tracked in three dimensions. A synchronizer board has been developed to provide active glare-free IR illumination for robust body and hand tracking. As a first application, we demonstrate interactive real-time browsing and querying of auditory self-organizing maps (AuSOMs). An AuSOM is a combined visual and auditory presentation of high-dimensional data sets. Moving the hand above the desk surface allows the user to select neurons on the map and to manipulate how they contribute to the data sonification. Each neuron is associated with a prototype vector in high-dimensional space, so that a set of 2D-topologically ordered feature maps is queried simultaneously. The level of detail is selected by hand altitude above the table surface, allowing the user to emphasize or de-emphasize neurons on the map.
- Christian Lange, Thomas Hermann, Helge Ritter (2004).
Holistic Body Tracking for Gestural Interfaces.
In Camurri, Antonio and Volpe, Gualtiero (Ed.), Gesture-Based Communication in Human-Computer Interaction, 5th International Gesture Workshop, GW 2003, Genova, Italy, April 15-17, 2003, Selected Revised Papers, p. 132--139, International Gesture Workshop, Springer, Berlin, Heidelberg
Abstract: In this paper we present an approach to track a moving body in a sequence of camera images by model adaptation. The parameters of a stick figure model are varied using a stochastic search algorithm. The similarity of rendered model images and camera images of the user is used as the quality measure. A refinement of the algorithm is introduced by using combined stereo views and relevance maps to infer responsible joint angles from the difference of successive input images. Finally, the successful application of various versions of the algorithm to sequences of synthetic images is demonstrated.
- Peter Meinicke, Thomas Hermann, Holger Bekel, Horst M. Müller, S. Weiss, H. Ritter (2004).
Identification of Discriminative Features in EEG
Intelligent Data Analysis, vol. 8, no. 1, p. 97--107, IOS Press
- Jörg Martini, Thomas Hermann, Dario Anselmetti, Helge Ritter (2004).
Interactive Sonification for exploring Single Molecule Properties with AFM-based Force Spectroscopy.
In Hermann, Thomas and Hunt, Andy (Ed.), Proceedings of the International Workshop on Interactive Sonification (ISon 2004), Bielefeld University, Interactive Sonification Community, Bielefeld, Germany, (peer-reviewed article)
- Oliver Höner, Thomas Hermann, Christian Grunow (2004).
Sonification of Group Behavior for Analysis and Training of Sports Tactics.
In Hermann, Thomas and Hunt, Andy (Ed.), Proceedings of the International Workshop on Interactive Sonification (ISon 2004), Bielefeld University, Interactive Sonification Community, Bielefeld, Germany, (peer-reviewed article)
- Thomas Hermann, Andy Hunt (2004).
The Discipline of Interactive Sonification.
In Hermann, Thomas and Hunt, Andy (Ed.), Proceedings of the International Workshop on Interactive Sonification (ISon 2004), Bielefeld University, Interactive Sonification Community, Bielefeld, Germany, (peer-reviewed article)
- Tim W. Nattkemper, Walter Schubert, Thomas Hermann, Helge Ritter (2004).
A Hybrid System for Cell Detection in Digital Micrographs.
In Tilg, B. (Ed.), Biomedical Engineering, Proc. BIOMED 2004, ACTA Press, Innsbruck, Austria
Abstract: To analyze large sets of digital micrographs from high-throughput screening studies with constant accuracy, advanced image processing algorithms are necessary. In the literature, systems have been proposed applying model-based fitting algorithms, morphological operators and artificial neural networks (ANN). Because single approaches show limited performance, we propose a hybrid system that combines the Hough transform with a multi-layer perceptron (MLP) network. Our results show that the combination of both approaches improves the performance: the positions of cell bodies are obtained with increased sensitivity and positive predictive value.
- Thomas Hermann, Helge Ritter (2004).
Sound and Meaning in Auditory Data Display
Proceedings of the IEEE (Special Issue on Engineering and Music -- Supervisory Control and Auditory Communication), vol. 92, no. 4, p. 730--741
Abstract: Auditory data display is an interdisciplinary field linking auditory perception research, sound engineering, data mining, and human-computer interaction in order to make semantic contents of data perceptually accessible in the form of (nonverbal) audible sound. For this goal it is important to understand the different ways in which sound can encode meaning. We discuss this issue from the perspectives of language, music, functionality, listening modes, and physics, and point out some limitations of current techniques for auditory data display, in particular when targeting high-dimensional data sets. As a promising, potentially very widely applicable approach, we discuss the method of model-based sonification (MBS) introduced recently by the authors and point out how its natural semantic grounding in the physics of a sound generation process supports the design of sonifications that are accessible even to untrained, everyday listening. We then proceed to show that MBS also facilitates the design of an intuitive, active navigation through "acoustic aspects", somewhat analogous to the use of successive two-dimensional views in three-dimensional visualization. Finally, we illustrate the concept with a first prototype of a "tangible" sonification interface which allows us to "perceptually map" sonification responses into active exploratory hand motions of a user, and give an outlook on some planned extensions.
- Thomas Hermann, Oliver Höner, Helge Ritter (2004).
Verfahren zur Steuerung eines auditiven Spiels und Vorrichtung hierzu
German Patent and Trade Mark Office, (Patent No. DE102004048583A1)
- Thomas Hermann, Helge Ritter (2004).
Neural Gas Sonification - Growing Adaptive Interfaces for Interacting with Data.
In Banissi, Ebad and Börner, Katy (Ed.), IV '04: Proceedings of the Information Visualisation, Eighth International Conference on (IV'04), p. 871--878, IEEE Computer Society, Washington, DC, USA
Abstract: In this paper we present an approach using incrementally constructed neural gas networks to 'grow' an intuitive interface for interactive exploratory sonification of high-dimensional data. The sonifications portray information about the intrinsic data dimensionality and its variation within the data space. The interface follows the paradigm of Model-Based Sonification and consists of a graph of nodes that can be acoustically 'excited' with simple mouse actions. The sound generation process is defined in terms of the node parameters and the graph topology, following a physically motivated model of energy flow through the graph structure. The resulting sonification model is tied to the given data set by constructing both graph topology and node parameters through an adaptive, fully data-driven learning process, using a growing neural gas network. We report several examples of applying this method to static data sets and point out a generalization to the task of process analysis.
- Andy Hunt, Thomas Hermann (2004).
The Importance of Interaction in Sonification.
In Barrass, Stephen and Vickers, Paul (Ed.), Proceedings of the Int. Conference on Auditory Display (ICAD 2004), International Community for Auditory Display (ICAD), ICAD, Sydney, Australia
Abstract: This paper argues for a special focus on the use of dynamic human interaction to explore datasets while they are being transformed into sound. We describe why this is a special case of both human computer interaction (HCI) techniques and sonification methods. Humans are adapted for interacting with their physical environment and making continuous use of all their senses. When this exploratory interaction is applied to a dataset (by continuously controlling its transformation into sound) new insights are gained into the data's macro and micro-structure, which are not obvious in a visual rendering. This paper reviews the importance of interaction in sonification, describes how a certain quality of interaction is required, provides examples of the techniques being applied interactively, and outlines a plan of future work to develop interaction techniques to aid sonification.
- Gerold Baier, Thomas Hermann (2004).
The Sonification of Rhythms in Human Electroencephalogram.
In Barrass, Stephen and Vickers, Paul (Ed.), Proceedings of the International Conference on Auditory Display (ICAD 2004), International Community for Auditory Display (ICAD), ICAD, Sydney, Australia
Abstract: We use sonification of temporal information extracted from scalp EEG to characterize the dynamic properties of rhythms in certain frequency bands. Sonification proves particularly useful in the simultaneous monitoring of several EEG channels. Our results suggest sonification as an important tool in the analysis of multivariate data with subtle correlation differences.
- Andy Hunt, Thomas Hermann, Sandra Pauletto (2004).
Interacting with Sonification Systems: Closing the Loop.
In Banissi, Ebad and Börner, Katy (Ed.), IV '04: Proceedings of the Information Visualisation, Eighth International Conference on (IV'04), p. 879--884, IEEE Computer Society, Washington, DC, USA
Abstract: This paper stresses the importance of the human user being tightly embedded within an interactive control loop for exploring data sets using sound. We consider the quality of interaction, and how this can be improved in computer systems by learning from real-world acoustic interactions. We describe how different sonification methods can utilise the human feedback loop to enhance the perception and analysis of the data under investigation. Some considerations are given regarding systems and applications.
- Thomas Hermann, Gerold Baier, Markus Müller (2004).
Polyrhythm in the Human Brain.
In Barrass, Stephen (Ed.), Listening to the Mind Listening - Concert of Sonifications at the Sydney Opera House, International Community for Auditory Display (ICAD), ICAD, Sydney, Australia
Abstract: Three complementary methods are used to analyze the dynamics of multivariate EEG data obtained from a human listening to a piece of music. The analysis yields parameters for a data sonification that conserves temporal and frequency relationships as well as wave intensities of the data. Multiple events taking place on different time scales are combined into a polyrhythmic display in real time.
- Thomas Hermann, Jan M. Drees, Helge Ritter (2003).
Broadcasting Auditory Weather Reports -- A Pilot Project.
In Brazil, Eoin and Shinn-Cunningham, Barbara (Ed.), Proceedings of the International Conference on Auditory Display (ICAD 2003), p. 208--211, International Community for Auditory Display (ICAD), Boston University Publications Production Department, Boston, MA, USA
Abstract: This paper reports on a pilot project between our research department and a local radio station, investigating the use of sonification to render and present auditory weather forecasts. The sonifications include auditory markers for certain relevant time points, expected weather events like thunder, snow or fog, and several auditory streams to summarize the temporal weather changes during the day. To our knowledge, this is the first use of sonification in a regular radio program. We introduce the sonification concept and present our design, which aims at combining perceptual salience and emotional truthfulness. Sound examples are given for typical weather situations in Germany and for several prototypical weather conditions which tend to be connected with emotional value. We will report at ICAD on first experiences with this pilot project and on audience feedback gathered since the broadcasts started in February 2003.
- Thomas Hermann, Christian Niehus, Helge Ritter (2003).
Interactive Visualization and Sonification for Monitoring Complex Processes.
In Brazil, Eoin and Shinn-Cunningham, Barbara (Ed.), Proceedings of the International Conference on Auditory Display (ICAD 2003), p. 247--250, International Community for Auditory Display (ICAD), Boston University Publications Production Department, Boston, MA, USA
Abstract: This paper introduces AVDisplay, a versatile auditory and visual display for monitoring, querying and accessing information about modules or processes in complex systems. In the context of a collaborative research effort (SFB360, artificial communicators) at Bielefeld University, a cognitive robotics system for human-machine interaction is being developed. The AVDisplay provides the central interface for monitoring and debugging this system, currently involving about 20 computers hosting more than 30 complex processes. The display is designed to provide a summary of the system's activities, combining visualization and sonification techniques. The dynamic visualization allows inference of correlated activity of processes. A habituation simulation process automatically sets a perceptual focus on interesting and relevant process activities. The sonification part is designed to integrate emotional aspects -- if the system suffers from poor sensory quality, the sound conveys this by sounding uncomfortable.
- Tim W. Nattkemper, Thomas Hermann, Walter Schubert, Helge Ritter (2003).
Look & Listen: Sonification and Visualization of Multiparameter Micrographs.
Engineering in Medicine and Biology Society, 2003. Proceedings of the 25th Annual International Conference of the IEEE, p. 1311--1314, IEEE EMBS, IEEE, Cancun, Mexico
Abstract: Multiparameter imaging techniques provide large amounts of high-dimensional image data in modern biomedical research. Besides algorithms for image registration, normalization and segmentation, new methods for interactive data exploration must be proposed and evaluated. We propose a new approach for auditory data representation based on sonification. The approach is applied to a multiparameter image data set generated with immunofluorescence techniques, and compared to a conventional visualization approach and to a combination of both. For the comparison, a psychophysical experiment was conducted in which one standard evaluation procedure is modeled. Our results show that all three approaches lead to comparable evaluation accuracies for all subjects. We conclude that acoustic and visual approaches can be combined to display data sets of high dimensionality.
- Thomas Hermann, Claudia Nölker, Helge Ritter (2002).
Hand Postures for Sonification Control.
In Wachsmuth, Ipke and Sowa, Timo (Ed.), Gesture and Sign Language in Human-Computer Interaction: International Gesture Workshop, GW 2001, London, UK, April 18-20, 2001, Revised Papers, p. 307--316, Springer, Berlin, Heidelberg
Abstract: Sonification is a rather new technique in human-computer interaction which addresses auditory perception. In contrast to speech interfaces, sonification uses non-verbal sounds to present information. The most common sonification technique is parameter mapping, where for each data point a sonic event is generated whose acoustic attributes are determined from the data values by a mapping function. For acoustic data exploration, this mapping must be adjusted or manipulated by the user. We propose the use of hand postures as a particularly natural and intuitive means of parameter manipulation for this data exploration task. As a demonstration prototype we developed a hand posture recognition system for gestural control of sound. The presented implementation applies artificial neural networks for the identification of continuous hand postures from camera images and uses a real-time sound synthesis engine. In this paper, we present our system and first applications of the gestural control of sounds. Techniques to apply gestures to the control of sonification are proposed, and sound examples are given.
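For readers unfamiliar with parameter mapping, a minimal sketch may help; the column choices, ranges and log-frequency scaling below are illustrative assumptions, not the paper's mapping, which is adjusted via hand postures at run time:

```python
import numpy as np

def parameter_mapping(data, t_range=(0.0, 5.0), f_range=(200.0, 2000.0)):
    """Minimal parameter mapping sonification: one sonic event per data
    point; the first data column controls onset time and the second
    controls pitch. Returns an event list (onset_s, freq_hz) that a
    synthesizer would render."""
    lo, hi = data.min(axis=0), data.max(axis=0)
    norm = (data - lo) / (hi - lo)                    # scale columns to [0, 1]
    onsets = t_range[0] + norm[:, 0] * (t_range[1] - t_range[0])
    freqs = f_range[0] * (f_range[1] / f_range[0]) ** norm[:, 1]  # log-pitch
    return sorted(zip(onsets, freqs))

rng = np.random.default_rng(0)
events = parameter_mapping(rng.random((10, 2)))
for onset, freq in events[:3]:
    print(f"t = {onset:.2f} s, f = {freq:.0f} Hz")
```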
- Thomas Hermann (2002).
Sonification for Exploratory Data Analysis
PhD thesis, Bielefeld University
- Thomas Hermann, Helge Ritter (2002).
Crystallization Sonification of High-dimensional Datasets.
In Nakatsu, R. and Kawahara, H. (Ed.), Proc. of the Int. Conf. on Auditory Display, p. 76--81, International Community for Auditory Display (ICAD), ICAD, Kyoto, Japan
Abstract: This paper introduces Crystallization Sonification, a sonification model for exploratory analysis of high-dimensional datasets. The model is designed to provide information about the intrinsic data dimensionality (which is a local feature) and the global data dimensionality, as well as the transitions between a local and a global view on a dataset. Furthermore, the sound conveys the clustering of high-dimensional datasets. The model defines a crystal growth process in the high-dimensional data space which starts at a user-selected ``condensation nucleus'' and incrementally includes neighboring data according to some growth criterion. The sound summarizes the temporal evolution of this crystal growth process. For introducing the model, a simple growth law is used. Other growth laws used in the context of hierarchical clustering are also suitable, and their application in Crystallization Sonification offers new ways to inspect the results of data clustering as an alternative to dendrogram plots. In this paper, the sonification model is described and example sonifications are presented for some synthetic high-dimensional datasets.
- Thomas Hermann, Jan Krause, Helge Ritter (2002).
Real-Time Control of Sonification Models with an Audio-Haptic Interface.
In Nakatsu, R. and Kawahara, H. (Ed.), Proceedings of the International Conference on Auditory Display, p. 82--86, International Community for Auditory Display (ICAD), ICAD, Kyoto, Japan
Abstract: This paper presents a new interface for controlling sonification models. A haptic controller interface is developed which allows both manipulating a sonification model, e.g. by interacting with it, and providing a haptic data representation. A variety of input types are supported by a hand-sized interface, including shaking, squeezing, hammering, moving, rotating and accelerating. The paper presents details of the interface under development and demonstrates the application of the device for controlling a sonification model. For this purpose, the Data-Solid Sonification Model is introduced, which provides an acoustic representation of the local neighborhood relations in high-dimensional datasets for binary classification problems. The model is parameterized by a reduced data representation obtained from a growing neural gas network. Sound examples are given to demonstrate the device and the sonification model.
- Thomas Hermann, Peter Meinicke, Holger Bekel, Helge Ritter, Horst Müller, Sabine Weiss (2002).
Sonification for EEG Data Analysis.
In Nakatsu, R. and Kawahara, H. (Ed.), Proc. of the Int. Conf. on Auditory Display, p. 37--41, International Community for Auditory Display (ICAD), ICAD, Kyoto, Japan
Abstract: This paper presents techniques to render acoustic representations of EEG data. In our case, data are obtained from psycholinguistic experiments where subjects are exposed to three different conditions based on different auditory stimuli. The goal of this research is to uncover elements of neural processing correlated with high-level cognitive activity. Three sonifications are presented within this paper: spectral mapping sonification, which offers a quite direct inspection of the recorded data; distance matrix sonification, which allows the detection of nonlinear long-range correlations at high time resolution; and differential sonification, which summarizes the comparison of EEG measurements under different conditions for each subject. This paper describes the techniques and presents sonification examples for experimental data.
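A schematic reading of the second technique, distance matrix sonification, reduced here to a 1-D recurrence test with invented threshold and timing; the paper's method operates on the actual EEG feature vectors:

```python
import numpy as np

def distance_matrix_events(signal, threshold=0.1, step=0.005):
    """Compare all pairs of samples of a 1-D, roughly normalized signal
    and emit a click whenever two moments of the recording are closer
    than a threshold (a recurrence). Long-range correlations then
    become audible as regular click patterns. Parameter values and the
    1-D reduction are illustrative assumptions."""
    n = len(signal)
    events = []
    for i in range(n):
        for j in range(i + 1, n):
            if abs(signal[i] - signal[j]) < threshold:
                events.append(i * step)       # click at the time of sample i
    return events

t = np.linspace(0, 2, 400)
eeg_like = np.sin(2 * np.pi * 10 * t)         # stand-in for an EEG trace
print(len(distance_matrix_events(eeg_like)))
```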