Gestures in Videogames

Short Description: Analysis of gestures and facial expressions of people involved in playing a videogame (TORCS)
Coordinator: AndreaBonarini (andrea.bonarini@polimi.it)
Tutor: SimoneTognetti (tognetti@elet.polimi.it), MaurizioGarbarino (garbarino@elet.polimi.it)
Collaborator:
Students: LucaPerego (lucagiovanni.perego@gmail.com)
Research Area: Affective Computing
Research Topic: Affective Computing And BioSignals
Start: 2010/04/01
End: 2010/09/30
Status: Active
Level: Ms
Type: Course

This project, belonging to the Affective VideoGames research line, aims to build a model relating the facial expressions, gestures, and movements of people playing the videogame TORCS to their preferences among different game settings. The final aim is to detect, from images taken by a camera, whether players are enjoying the game experience.
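As an illustration of the camera-based side of such a pipeline, the following is a minimal sketch (not part of the project description, which does not specify tools) that captures webcam frames and locates the player's face using OpenCV's bundled Haar cascade. Any downstream expression or gesture analysis would operate on the detected face regions; the library choice and parameters here are assumptions for illustration only.

# Minimal sketch: webcam capture and face detection with OpenCV (assumed toolkit).
# The detected face region would be the input to any expression/gesture analysis.
import cv2

# Haar cascade shipped with OpenCV (assumption: opencv-python is installed).
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

capture = cv2.VideoCapture(0)  # default camera pointed at the player
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect frontal faces; parameters are typical defaults, not project values.
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Player face", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop
        break

capture.release()
cv2.destroyAllWindows()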