Latest revision as of 16:57, 3 October 2011
Gestures in Videogames
Short Description: Analysis of gestures and facial expressions of people involved in playing a videogame (TORCS)
Coordinator: AndreaBonarini (andrea.bonarini@polimi.it)
Tutor: MatteoMatteucci (matteo.matteucci@polimi.it), MaurizioGarbarino (garbarino@elet.polimi.it)
Collaborator:
Students: GiorgioPrini (giorgio.prini@mail.polimi.it)
Research Area: Affective Computing
Research Topic: Affective Computing And BioSignals
Start: 2010/09/10
End: 2011/03/30
Status: Closed
Level: Ms
Type: Thesis
This project, belonging to the Affective VideoGames research line, aims to build a model relating the facial expressions, gestures, and movements of people playing the videogame TORCS to their preferences among different game settings. The final aim is to detect, from images taken by a camera, whether players are enjoying the game experience.