Graphical user interface for an autonomous wheelchair

From AIRWiki
Revision as of 17:24, 13 October 2008 by BernardoDalSeno (Talk | contribs) ('''Part 1: project profile''')


Part 1: project profile

Project name

Graphical user interface for an autonomous wheelchair

Project short description

This project aims at designing and developing a graphical interface that allows a wheelchair user (through the BCI system) to spell a message on a table with the letters of the alphabet, or to select a place to reach on a map.


Start date:

End date:

People involved

Project head(s)

M. Matteucci User:MatteoMatteucci

Other Politecnico di Milano people

B. Dal Seno User:BernardoDalSeno

Students currently working on the project

R. Massimini User:RobertoMassimini

Part 2: project description

The interface is used mainly to drive the Lurch wheelchair. It has different screens, corresponding to the different rooms or environments where the wheelchair can move. The screens are organized hierarchically: a main screen lets the user choose whether to drive the wheelchair or perform other tasks. The screens used for driving show a map of the environment at different levels of detail; e.g., one map might show just the rooms of a house, while another might show the living room with its furniture and ‘interesting’ positions (near the table, in front of the TV, beside the window...).
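The hierarchical organization of screens described above can be sketched as a simple tree, with each node holding a map's reachable positions and its more detailed sub-screens. This is only an illustrative sketch; the class and attribute names (Screen, targets, children) are assumptions, not taken from the actual project code.

```python
class Screen:
    """One node in the hierarchy of GUI screens (illustrative sketch)."""

    def __init__(self, name, targets=None, children=None):
        self.name = name                  # e.g. "house", "living room"
        self.targets = targets or []      # 'interesting' positions on this map
        self.children = children or []    # more detailed sub-screens

    def find(self, name):
        """Depth-first search for a screen by name; None if absent."""
        if self.name == name:
            return self
        for child in self.children:
            found = child.find(name)
            if found is not None:
                return found
        return None

# Example hierarchy matching the description: a main screen, a driving
# branch with a house map, and a detailed living-room map below it.
living_room = Screen("living room",
                     targets=["near the table",
                              "in front of the TV",
                              "beside the window"])
kitchen = Screen("kitchen", targets=["by the stove", "at the sink"])
house = Screen("house", children=[living_room, kitchen])
main = Screen("main", children=[Screen("drive", children=[house]),
                                Screen("other tasks")])
```

Choosing a target then amounts to walking down this tree one screen at a time, e.g. main → drive → house → living room → "in front of the TV".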

The interface should be as accessible as possible: it can be driven by a BCI, by the touch screen, or by the keyboard. The BCI is based on the P300 and ErrP potentials, so the interface should highlight the possible choices one at a time (in orthogonal groups if the choices are numerous) and show the choice detected by the BCI for ErrP-based confirmation.
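The "orthogonal groups" idea can be sketched as the classic P300 row/column scheme: arrange the choices in a near-square grid and flash each row and each column as a group, so the selected item is the unique intersection of the row group and the column group that both elicit a P300. The function below is a hypothetical illustration, not the project's actual implementation.

```python
import math

def orthogonal_groups(choices):
    """Arrange choices in a near-square grid and return the row groups
    followed by the column groups, to be highlighted one at a time."""
    cols = math.ceil(math.sqrt(len(choices)))
    # Split the flat list of choices into rows of `cols` items each.
    rows = [choices[i:i + cols] for i in range(0, len(choices), cols)]
    # Column c collects the c-th item of every row that has one.
    columns = [[row[c] for row in rows if c < len(row)]
               for c in range(cols)]
    return rows + columns

groups = orthogonal_groups(list("ABCDEFGHI"))
# 9 choices yield 3 row groups and 3 column groups (6 flashes per cycle
# instead of 9); "E" is the intersection of the second row and the
# second column.
```

With N choices this reduces the number of highlighted groups from N to roughly 2·√N per cycle, which is why grouping pays off when the choices are numerous.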

The screen may also be updated while the wheelchair navigates; e.g., when the wheelchair enters the kitchen, the GUI shows the map of the kitchen.