Revision as of 10:19, 23 April 2009

Part 1: project profile

Project name

Emotion from Interaction

Project short description

The project is aimed at studying the relationship between emotion and cognition in a human-computer system. Since emotion influences behavior, we believe that it is possible to discriminate the user's emotional state from an analysis of his or her interaction with the computer.

Dates

Start date: 2008/04/08

End date: 2009/12/20

Internet site(s)

People involved

Project leaders

Other Politecnico di Milano people

Students

Students that have worked on the project

Laboratory work and risk analysis

Laboratory work for this project will be mainly performed at AIRLab/Lambrate. It will include electrical and electronic activity. Potentially risky activities are the following:

  • Use of soldering iron. Standard safety measures described in Safety norms will be followed.

Part 2: project description

The project is aimed at studying the relationship between emotion and cognition in a human-computer system. Since emotion influences behavior, we believe that it is possible to discriminate the user's emotional state from an analysis of his or her interaction with the computer.

Giorgio Luparia is working on the FaDe project.

Goran Burek will define a protocol to identify specific emotional states (e.g., attention) and collect and analyze data from different users, comparing them with biological data acquired during the interaction.

As of April 2009, the following parts of the project have been established.

To obtain biological data, Goran has developed the Affective Logger in C++, using the developer API for the ProComp Infinity, the device in charge of collecting bio-signals (such as heart rate, skin conductance, body temperature, etc.).
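A minimal sketch of how such a logger might be structured is shown below, assuming time-stamped samples appended to a CSV file. The BioSample struct, its field names, and readSample() are hypothetical stand-ins introduced here for illustration; they are not the actual ProComp developer API, whose calls are not documented on this page.

 // Minimal sketch of an affective logger: time-stamped bio-signal samples
 // are appended to a CSV file. BioSample and readSample() are hypothetical
 // placeholders for the data returned by the device API.
 #include <chrono>
 #include <fstream>
 #include <thread>
 
 struct BioSample {
     double heartRate;        // beats per minute
     double skinConductance;  // microsiemens
     double bodyTemperature;  // degrees Celsius
 };
 
 // Stub: in the real logger this would query the device through its
 // developer API; here it just returns fixed dummy values.
 BioSample readSample() {
     return BioSample{70.0, 2.5, 36.6};
 }
 
 int main() {
     std::ofstream log("affective_log.csv");
     log << "time_ms,heart_rate,skin_conductance,body_temperature\n";
 
     const auto start = std::chrono::steady_clock::now();
     for (int i = 0; i < 600; ++i) {  // e.g. one minute of samples at 10 Hz
         const BioSample s = readSample();
         const auto elapsed = std::chrono::steady_clock::now() - start;
         const auto ms =
             std::chrono::duration_cast<std::chrono::milliseconds>(elapsed).count();
         log << ms << ',' << s.heartRate << ','
             << s.skinConductance << ',' << s.bodyTemperature << '\n';
         std::this_thread::sleep_for(std::chrono::milliseconds(100));
     }
     return 0;
 }

The CSV output can then be aligned with the interaction log of the cognitive tasks, so that bio-signal data and user behavior can be compared on a common time axis.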

In January 2009 the first testing model was abandoned, because there was no guarantee that the user doing the test would actually be in the appropriate emotional state.

The model was intended to test the user for levels of concentration, stress, and tiredness. For each of the three emotional states there was a phase of inducing the given state (e.g., playing annoying music to induce stress), followed by a testing phase in which the user performed some simple cognitive tasks. The tasks were the same for every emotional state; the goal was to compare data and performance across users and states.

The model was abandoned because emotional states changed rapidly and the duration of each state was uncertain.
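The structure of that abandoned protocol (an induction phase followed by the same battery of cognitive tasks for each target state) can be sketched roughly as follows. The state names come from the description above, while the function names and their bodies are hypothetical placeholders; only the two-phase structure is taken from the protocol.

 // Sketch of the abandoned two-phase protocol: for each target emotional
 // state, first induce the state, then run the same set of cognitive tasks.
 // induce() and runCognitiveTasks() are placeholders for illustration.
 #include <iostream>
 #include <string>
 #include <vector>
 
 void induce(const std::string& state) {
     // Placeholder: e.g. play annoying music to induce "stress".
     std::cout << "Induction phase: " << state << '\n';
 }
 
 void runCognitiveTasks(const std::string& state) {
     // Placeholder: the same simple tasks are run regardless of the state,
     // so performance can be compared across states and users.
     std::cout << "Testing phase (identical tasks): " << state << '\n';
 }
 
 int main() {
     const std::vector<std::string> states = {"concentration", "stress", "tiredness"};
     for (const auto& state : states) {
         induce(state);             // phase 1: state induction
         runCognitiveTasks(state);  // phase 2: cognitive tasks + bio-signal logging
     }
     return 0;
 }

The weakness noted above lies in phase 1: even with a fixed induction procedure, there is no guarantee that the induced state persists through phase 2.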

Mauro Pellicioli will work on specific emotional states (stress, boredom) induced by means of sounds and images. He will collect and analyze data from different users, eventually comparing them with biological data acquired during the interaction.