Affective Computing
Affective computing is devoted to analyzing signals from people who are working or interacting with devices, in order to understand their emotional or affective state. We work both on the analysis of biosignals (in projects shared with the BioSignal Analysis research area) and on the analysis of interaction signals and, more generally, of user behavior. In particular, we have addressed the following problems:
- Development of a model to detect, from biosignals, keyboard usage, or movements, a person's preference for a specific setting of a videogame
- Development of a model to detect, from biosignals and movements, the emotional attitude of a person involved in robotic rehabilitation
- Development of a model to detect, from biosignals, the emotional attitude of people looking at different images, previously characterized in terms of valence and arousal
- Development of models to detect emotional states from the use of keyboard and mouse during interaction with a computer (see the illustrative sketch after this list)
- Development of models to detect the emotional states of a car driver from biosignals, analysis of movements, and analysis of the use of car commands
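To give an idea of what such models look like, the following is a minimal sketch (not the group's actual pipeline) of a classifier that predicts an emotional state, such as high versus low arousal, from keyboard and mouse interaction features. The feature names, the synthetic data, and the labels are all hypothetical and serve only to illustrate the general approach.

# Illustrative sketch, not the actual AIRLab pipeline: classify a user's
# emotional state (high vs. low arousal) from keyboard/mouse features.
# All features, data, and labels below are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical per-session features:
#   mean inter-keystroke interval (s), keystroke rate (keys/s),
#   mean mouse speed (px/s), click rate (clicks/s)
n = 200
X = np.column_stack([
    rng.normal(0.25, 0.08, n),   # inter-keystroke interval
    rng.normal(4.0, 1.5, n),     # keystroke rate
    rng.normal(300.0, 90.0, n),  # mouse speed
    rng.normal(0.5, 0.2, n),     # click rate
])
# Synthetic labels: 1 = high arousal, 0 = low arousal (placeholder ground truth)
y = (X[:, 1] + 0.01 * X[:, 2] + rng.normal(0.0, 1.0, n) > 7.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardize the features, then train an RBF-kernel SVM classifier
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

In practice the interaction features would be extracted from logged keystroke and mouse events, and the labels would come from self-reports or from biosignal-based annotation; the classifier shown here is only one reasonable choice among many.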
People
Currently Active Projects
Past Projects
- Affective VideoGames
- Driving companions
- Emotion From Mouse and Keyboard
- FaDe
- FaceAnalysisInVideogames
- Gestures in Videogames
- Interpretation of facial expressions and movements of the head
- Online Emotion Classification
- Relationship between Cognition and Emotion in Rehabilitation Robotics
- Videogame adaptation
Media coverage
Links to some newspaper articles about our Affective Computing results.