Saturday, December 6, 2008

Emotion-Aware Natural Interaction

Rosalind Picard defined affective computing as “computing that relates to, arises from, or deliberately influences emotions.” Together with the more recent introduction of neighboring terms such as “human-centered” or “anthropocentric,” “pervasive,” and “ubiquitous” computing, computers are no longer regarded as mere number-crunching machines, but are approached as intelligent and adaptive tools or interfaces within our habitat, helping us perform everyday tasks in a more intuitive and yet robust manner. In most cases, paralinguistic concepts such as mood, attitude, traits, and expressivity can be used to adapt the user experience and produce more flexible and, therefore, more suitable results.

From an engineering point of view, researchers have been investigating different modalities and combinations of modalities to provide emotion or affect recognition components. The mapping of raw signals and related features to high-level concepts has been mainly driven by claims such as Ekman's, which holds that specific facial expressions can be universally recognized across cultures and ages. This claim lends itself well to existing algorithms that classify content into discrete categories, and as a result it has pushed the area toward creating training and testing data sets containing only the aforementioned expressions. In addition, these data sets usually contain cases of extreme expressivity, since these can be distinguished more easily, which makes such data sets quite difficult to extend. However, everyday human-human and human-computer interactions hardly ever contain cases of extreme expressivity, or clear occurrences of expressivity ranging from a neutral state to the visual, aural, or physiological apex of an expression.
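To make the feature-to-category mapping described above concrete, the following is a minimal, purely illustrative sketch: a nearest-centroid classifier that assigns a low-level feature vector to one of Ekman's six basic emotion categories. The three-dimensional feature space, the centroid values, and the feature names in the comments are all hypothetical, invented for this example; real systems use far richer acoustic, visual, or physiological features and learned models.

```python
# Illustrative sketch: mapping low-level feature vectors to Ekman's six
# discrete emotion categories with a nearest-centroid rule.
# All numbers below are synthetic, not drawn from any real data set.
import math

# Hypothetical per-category centroids in a toy 3-D feature space
# (e.g. brow movement, mouth curvature, eye openness).
CENTROIDS = {
    "anger":     (0.9, -0.8, 0.2),
    "disgust":   (0.6, -0.5, -0.3),
    "fear":      (0.7, -0.2, 0.9),
    "happiness": (-0.2, 0.9, 0.4),
    "sadness":   (-0.6, -0.7, -0.4),
    "surprise":  (0.1, 0.3, 1.0),
}

def classify(features):
    """Return the category whose centroid is closest to `features`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label], features))

print(classify((-0.1, 0.8, 0.5)))  # -> happiness
```

Note how a hard, discrete output like this cannot represent the blended or low-intensity expressivity of everyday interaction, which is exactly the limitation the paragraph above points out.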


The proposed Special Issue aims to present state-of-the-art approaches in the fields of unimodal and multimodal affect analysis, and combine these with techniques that utilize a priori or just-in-time knowledge about the user, environment, or task contexts. Papers on emotion- and affect-aware applications are strongly encouraged, especially those discussing issues related to natural interaction and/or the tradeoff between unconstrained expressivity and robustness.

Topics of interest include, but are not limited to:

* Acoustic and linguistic analysis/feature extraction and recognition
* Visual (face, body, hand) analysis/feature extraction and recognition
* Uni/multimodal recognition/sensing of (blended) emotion, affect, and behavior
* Dynamic, temporal concepts, turn-taking
* Bridging feature extraction and recognition with knowledge sources and context
* Novel modalities for HCI (biosignals, haptics, etc.)
* Affect analysis “in the wild” (e.g., public spaces, groups, etc.)
* Protocols for evaluation of affect-aware systems
* Affect- and emotion-aware applications: design and implementation


Before submission, authors should carefully read the journal's Author Guidelines, which are located at http://www.hindawi.com/journals/ahci/guidelines.html. Authors should follow the Advances in Human-Computer Interaction manuscript format described at the journal site http://www.hindawi.com/journals/ahci/. Prospective authors should submit an electronic copy of their complete manuscript through the Journal Manuscript Tracking System at http://www.hindawi.com/mts/, according to the following timetable:


Manuscript Due April 1, 2009

First Round of Reviews July 1, 2009

Publication Date October 1, 2009


Lead Guest Editor

* Kostas Karpouzis, Institute of Communication and Computer Systems, National Technical University of Athens, 15780 Zographou, Athens, Greece; kkarpou@cs.ntua.gr

Guest Editors

* Elisabeth Andre, Multimedia Concepts and Applications, Institute of Computer Science, University of Augsburg, 86135 Augsburg, Germany; andre@informatik.uni-augsburg.de
* Anton Batliner, Computer Science Department 5, Friedrich Alexander University, Erlangen-Nuremberg, 91058 Erlangen, Germany; batliner@informatik.uni-erlangen.de
