Ipke Wachsmuth, Martin Fröhlich (Eds.)
Gesture and Sign Language in Human-Computer Interaction
International Gesture Workshop Bielefeld, Germany, September 1997, Proceedings
Lecture Notes in Artificial Intelligence, Vol. 1371
Pages XI + 309
Springer-Verlag 1998
ISBN 3-540-64424-5
List Price: DM 74.00 - US$ 55.00
Our publisher, Springer-Verlag, also offers an electronic version of the proceedings, including abstracts and, for registered users, PDF full-text documents of the articles, via its LINK web site.
The URL for the Proceedings of the Bielefeld Gesture Workshop 1997 at Springer LINK is: http://link.springer.de/link/service/series/0558/tocs/t1371.htm. With some browsers you will be automatically redirected to this site within one minute.
Bibliographic data of the proceedings
In the work on humanizing computer interaction, gesture and sign language have become a recent focus in advanced interface design. Artificial intelligence, neural networks, pattern recognition, and agent techniques are all having a significant impact on this area of research. Resulting from a three-day international workshop, held at Bielefeld University's Center for Interdisciplinary Research in September 1997, this book presents state-of-the-art contributions comprising gesture and sign language recognition and synthesis, gesture semiotics, gesture and speech integration, and applications. The book can serve as a timely and comprehensive reference for researchers in any of the related disciplines, as well as for practitioners concerned with putting gesture and sign language to work in computer interfaces. It could further be of use as a supplementary text for courses on multimodal human-computer interaction.
Monica Bordegoni | Università di Parma
Annelies Braffort | LIMSI-CNRS, France
Christophe Collet | LIMSI-CNRS, France
Holk Cruse | University of Bielefeld
Franz Dotter | University of Klagenfurt
Alistair Edwards | University of York
Martin Fröhlich | University of Bielefeld
Rachid Gherbi | LIMSI-CNRS, France
Hermann Hienz | Aachen University of Technology
Caroline Hummels | Delft University of Technology
Sotaro Kita | Max Planck Institute for Psycholinguistics
Thierry Lebourque | LIMSI-CNRS, France
Shan Lu | Communications Research Lab, MPT, Japan
Shuichi Nobe | Aoyama Gakuin University, Japan
Marilyn Panayi | City University London
James Richardson | Université Paris-Sud
Helge Ritter | University of Bielefeld
David Roy | European Commission, Brussels
Ipke Wachsmuth | University of Bielefeld
Research Challenges in Gesture: Open Issues and Unsolved Problems, page 1
Alan Wexelblat
Progress in Sign Language Recognition, page 13
Alistair D.N. Edwards
Movement Phases in Signs and Co-speech Gestures, and Their
Transcription by Human Coders, page 23
Sotaro Kita, Ingeborg van Gijn, and Harry van der Hulst
Classifying Two Dimensional Gestures in Interactive Systems, page 37
Axel Kramer
Are Listeners Paying Attention to the Hand Gestures of an
Anthropomorphic Agent? An Evaluation Using a Gaze Tracking Method, page 49
Shuichi Nobe, Satoru Hayamizu, Osamu Hasegawa, and Hideaki Takahashi
Gesture-Based and Haptic Interaction for Human Skill Acquisition, page 61
Monica Bordegoni and Franco De Angelis
High Performance Real-Time Gesture Recognition Using Hidden
Markov Models, page 69
Gerhard Rigoll, Andreas Kosmala, and Stefan Eickeler
Velocity Profile Based Recognition of Dynamic Gestures with
Discrete Hidden Markov Models, page 81
Frank G. Hofmann, Peter Heyer, and Günter Hommel
Video-Based Sign Language Recognition Using Hidden Markov Models, page 97
Marcell Assan and Kirsti Grobel
Corpus of 3D Natural Movements and Sign Language Primitives of
Movement, page 111
Sylvie Gibet, James Richardson, Thierry Lebourque, and
Annelies Braffort
On the Use of Context and A Priori Knowledge in Motion Analysis for
Visual Gesture Recognition, page 123
Karin Husballe Munk and Erik Granum
Automatic Estimation of Body Regions from Video Images, page 135
Hermann Hienz and Kirsti Grobel
Rendering Gestures as Line Drawings, page 147
Frank Godenschweger, Thomas Strothotte, and Hubert Wagener
Investigating the Role of Redundancy in Multimodal Input Systems, page 159
Karen McKenzie Mills and James L. Alty
Gesture Recognition of the Upper Limbs - From Signal to Symbol, page 173
Martin Fröhlich and Ipke Wachsmuth
Exploiting Distant Pointing Gestures for Object Selection in a
Virtual Environment, page 185
Marc Erich Latoschik and Ipke Wachsmuth
An Intuitive Two-Handed Gestural Interface for Computer Supported
Product Design, page 197
Caroline Hummels, Gerda Smets, and Kees Overbeeke
Detection of Fingertips in Human Hand Movement Sequences, page 209
Claudia Nölker and Helge Ritter
Neural Architecture for Gesture-Based Human-Machine-Interaction, page 219
Hans-Joachim Boehme, Anja Brakensiek, Ulf-Dietrich Braumann,
Markus Krabbes, and Horst-Michael Gross
Robotic Gesture Recognition, page 233
Jochen Triesch and Christoph von der Malsburg
Image Based Recognition of Gaze Directions Using Adaptive Methods, page 245
Axel Christian Varchmin, Robert Rae, and Helge Ritter
Towards a Dialogue System Based on Recognition and Synthesis of
Japanese Sign Language, page 259
Shan Lu, Seiji Igi, Hideaki Matsuo, and Yuji Nagashima
The Recognition Algorithm with Non-contact for Japanese Sign
Language Using Morphological Analysis, page 273
Hideaki Matsuo, Seiji Igi, Shan Lu, Yuji Nagashima, Yuji Takata, and
Terutaka Teshima
Special Topics of Gesture Recognition Applied in Intelligent Home
Environments, page 285
Markus Kohler
BUILD-IT: An Intuitive Design Tool Based on Direct Object
Manipulation, page 297
Morten Fjeld, Martin Bichsel, and Matthias Rauterberg
These pages are intended to be browser independent. Please report any problems that occur while visiting these pages!