https://airwiki.elet.polimi.it/api.php?action=feedcontributions&user=RobertoMassimini&feedformat=atom
AIRWiki - User contributions [en] - 2024-03-28T13:16:08Z - User contributions - MediaWiki 1.25.6
https://airwiki.elet.polimi.it/index.php?title=Graphical_user_interface_for_an_autonomous_wheelchair&diff=3391
Graphical user interface for an autonomous wheelchair - 2008-06-10T21:41:29Z<p>RobertoMassimini: </p>
<hr />
<div>== '''Part 1: project profile''' ==<br />
=== Project name ===<br />
Graphical user interface for an autonomous wheelchair<br />
<br />
=== Project short description ===<br />
<br />
This project aims at designing and developing a graphical interface that allows the user of a wheelchair (through the BCI system) to spell out a thought using a table with the letters of the alphabet, or to indicate a place to reach on a map.<br />
<br />
=== Dates ===<br />
Start date: <br />
<br />
End date: <br />
<br />
=== People involved ===<br />
<br />
<br />
===== Project head(s) =====<br />
M. Matteucci [[MatteoMatteucci]]<br />
<br />
<br />
===== Other Politecnico di Milano people =====<br />
B. Dal Seno [[BernardoDalSeno]]<br />
<br />
===== Students currently working on the project =====<br />
R. Massimini [[RobertoMassimini]]<br />
<br />
<br />
== '''Part 2: project description''' ==<br />
The interface is used mainly to drive the Lurch wheelchair. It has different screens, corresponding to the different rooms or environments where the wheelchair can move. The screens are organized hierarchically: a main screen lets the user choose whether to drive the wheelchair or perform other tasks. The screens used for driving the wheelchair show a map of the environment at different levels of detail; e.g., one map might show just the rooms of a house, and another might show the living room with the furniture and ‘interesting’ positions (near the table, in front of the TV, beside the window...).<br />
<br />
The interface should be as accessible as possible. It can be driven by a BCI, by the touch screen, or by the keyboard. The BCI is based on the P300 and ErrP potentials; the interface should therefore highlight the possible choices one at a time (or in orthogonal groups if the choices are numerous), and show the choice detected by the BCI for ErrP-based confirmation.<br />
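The orthogonal grouping of choices can be illustrated with a small sketch. This is illustrative only; the function name and layout are assumptions, not project code:

```python
# Illustrative sketch (not project code): grouping the selectable items of a
# P300 interface into "orthogonal groups" -- rows and columns -- so that each
# item is covered by exactly two highlight groups instead of being flashed
# one at a time.
def orthogonal_groups(items, n_cols):
    """Split a flat list of choices into row groups and column groups."""
    rows = [items[i:i + n_cols] for i in range(0, len(items), n_cols)]
    cols = [items[i::n_cols] for i in range(n_cols)]
    return rows + cols

# Six choices arranged in a 2x3 grid yield 2 row groups and 3 column groups.
groups = orthogonal_groups(list("ABCDEF"), n_cols=3)
# rows: ['A','B','C'], ['D','E','F']; columns: ['A','D'], ['B','E'], ['C','F']
```

With n items in a roughly square grid, this reduces the number of highlight groups from n to about 2*sqrt(n), which is the usual motivation for the row-column paradigm.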
<br />
Perhaps the screen should be updated while the wheelchair navigates; e.g., when the wheelchair enters the kitchen, the GUI shows the map of the kitchen.</div>RobertoMassiminihttps://airwiki.elet.polimi.it/index.php?title=File:IdExample.jpg&diff=3285File:IdExample.jpg2008-05-30T08:46:16Z<p>RobertoMassimini: </p>
<hr />
<div></div>RobertoMassiminihttps://airwiki.elet.polimi.it/index.php?title=Talk:Graphical_user_interface_for_an_autonomous_wheelchair&diff=3284Talk:Graphical user interface for an autonomous wheelchair2008-05-30T08:44:37Z<p>RobertoMassimini: /* Comments about the diagram */</p>
<hr />
<div>==General==<br />
<br />
The interface is used mainly to drive the [[LURCH - The autonomous wheelchair|Lurch wheelchair]]. It has different screens, corresponding to the different rooms or environments where the wheelchair can move. The screens are organized hierarchically: a main screen lets the user choose whether to drive the wheelchair or perform other tasks. The screens used for driving the wheelchair show a map of the environment at different levels of detail; e.g., one map might show just the rooms of a house, and another might show the living room with the furniture and ‘interesting’ positions (near the table, in front of the TV, beside the window...).<br />
<br />
The interface should be as [http://en.wikipedia.org/wiki/Accessibility accessible] as possible. It can be driven by a BCI, by the touch screen, or by the keyboard. The BCI is based on the P300 and ErrP potentials; the interface should therefore highlight the possible choices one at a time (or in orthogonal groups if the choices are numerous), and show the choice detected by the BCI for ErrP-based confirmation.<br />
<br />
Perhaps the screen should be updated while the wheelchair navigates; e.g., when the wheelchair enters the kitchen, the GUI shows the map of the kitchen.<br />
<br />
===To Do===<br />
<br />
* Feasibility check: [http://www.wxwidgets.org/ wxWidgets] and [http://maiaproject.sourceforge.net/ OpenMaia]. Check if using OpenMaia with wxWidgets is okay for the stimulation synchronization. The timing difference between the highlighting of a choice and the switching of the synchronization square should be no more than the duration of a screen frame (about 16 ms for a 60 Hz screen).<br />
<br />
* Definition of a communication protocol between the GUI and the BCI2000 system developed in the project [[Online P300 and ErrP recognition with BCI2000]]. To be done together with [[User:AndreaSgarlata|Andrea Sgarlata]].<br />
<br />
* Development of a storage system for information about the layout and the actions of the interface; an extension of the XML format used by OpenMaia to describe keyboards should be possible.<br />
<br />
==Communication Protocol Requirements==<br />
<br />
* Cross-platform (at least Linux and Windows)<br />
* Based on the IP protocol<br />
* Asynchronous (as much as possible), so as to not block remote processes<br />
* Preferably, the protocol should be in ASCII, with fixed-width fields (the number of fields is variable, by necessity).<br />
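The fixed-width ASCII framing above can be sketched as follows. The field width and the example message values are hypothetical, since the actual fields were still to be listed:

```python
# Minimal sketch of a fixed-width ASCII framing, as preferred by the
# requirements above. The width of 8 and the field values are assumptions
# for illustration, not the project's actual protocol.
FIELD_WIDTH = 8  # every field padded (or truncated) to 8 ASCII characters

def encode(fields):
    """Serialize a variable number of fields, each padded to FIELD_WIDTH."""
    return "".join(str(f).ljust(FIELD_WIDTH)[:FIELD_WIDTH] for f in fields)

def decode(message):
    """Split a message back into its fixed-width fields."""
    return [message[i:i + FIELD_WIDTH].strip()
            for i in range(0, len(message), FIELD_WIDTH)]

msg = encode(["MSG_A", 5, 12])   # hypothetical message tag and two counts
assert decode(msg) == ["MSG_A", "5", "12"]
```

Fixed-width fields keep parsing trivial on both sides (no escaping or delimiters), while a leading count field can still accommodate the variable number of fields.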
<br />
===To Do===<br />
<br />
* List of the pieces of information to be transferred between the application and the GUI<br />
<br />
==Communication Protocol==<br />
===UML Sequence Diagram===<br />
====Diagram====<br />
<br />
[[Image:ComunicationProtocol.jpg]]<br />
<br />
====Comments about the diagram====<br />
Structure of MessageA:<br />
* Number of repetitions: number of flashings<br />
* Number of stimulations: number of flashings in one repetition<br />
* List of the stimulations: list of the possible stimulations, each with its type<br />
* Number of types<br />
* Stimulations sequence: the full flashing order, including all repetitions<br />
<br />
Each stimulation must have an associated type (e.g., Icon, Row-Column).<br />
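The MessageA fields above can be sketched as a plain data structure. All names, comments, and example values here are illustrative interpretations of the field list, not the project's actual protocol code:

```python
# Sketch of the MessageA payload following the field list above.
# Names and the interpretation of each field are assumptions.
from dataclasses import dataclass

@dataclass
class MessageA:
    n_repetitions: int   # number of flashing repetitions
    n_stimulations: int  # number of flashings in one repetition
    stimulations: list   # (stimulation, type) pairs, e.g. ("A", "Icon")
    n_types: int         # how many stimulation types are used
    sequence: list       # full flashing order, covering all repetitions

msg = MessageA(
    n_repetitions=2,
    n_stimulations=3,
    stimulations=[("A", "Icon"), ("B", "Icon"), ("C", "Icon")],
    n_types=1,
    sequence=["A", "C", "B", "B", "A", "C"],
)
# The sequence length must cover every stimulation in every repetition.
assert len(msg.sequence) == msg.n_repetitions * msg.n_stimulations
```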
<br />
Structure of SelectionA:<br />
* For each stimulation type, the selection carries a matching number of ids (e.g., an Icon selection uses 1 id, a Row-Column selection uses 2 ids)<br />
<br />
Graphical example of an id:<br />
<br />
[[Image:IdExample.jpg]]<br />
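The per-type id counts in SelectionA can be sketched as a small lookup. The mapping and the function are assumptions based on the description above:

```python
# Sketch (not project code) of how a SelectionA answer carries a different
# number of ids per stimulation type: 1 id for Icon, 2 ids (row and column)
# for Row-Column, as described above.
IDS_PER_TYPE = {"Icon": 1, "Row-Column": 2}

def parse_selection(stim_type, ids):
    """Validate that a selection carries the id count its type requires."""
    expected = IDS_PER_TYPE[stim_type]
    if len(ids) != expected:
        raise ValueError(f"{stim_type} selection needs {expected} id(s)")
    return dict(zip(("id1", "id2"), ids))

icon_sel = parse_selection("Icon", [4])            # one id identifies the icon
rc_sel = parse_selection("Row-Column", [1, 3])     # row id plus column id
```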
<br />
There will also be an asynchronous end message, which pauses the BCI and closes the graphical interface.<br />
<br />
If synchrony is lost, a Reset message restarts the procedure from calibration. It is triggered when the number of classifications differs from the number of stimulations on the screen.</div>RobertoMassimini
<hr />
<div>==General==<br />
<br />
The interface is used mainly to drive the [[LURCH - The autonomous wheelchair|Lurch wheelchair]]. It has different screens, corresponding to the different rooms or different environments where the wheelchair can move. The screens are organized hierarchically: a main screen permits to choose whether to drive the wheelchair or perform other tasks. The screens used for driving the wheelchair show a map of the environment, with different levels of details; e.g., a map might show just the rooms of a house, and then another map might show the living room with the furniture and ‘interesting’ positions (like near the table, in front of the TV, beside the window...).<br />
<br />
The interface should be as [http://en.wikipedia.org/wiki/Accessibility accessible] as possible. It can be driven by a BCI, used with the touch screen or with the keyboard. The BCI is based on the P300 and ErrP potentials; so the interface should highlight the possible choices on at a time (in orthogonal groups if the choices are numerous), and show the choice detected by the BCI for ErrP-based confirmation.<br />
<br />
Maybe the screen should be updated while the wheelchair navigates; e.g., when the wheelchair enter into the kitchen, the GUI shows the map of the kitchen.<br />
<br />
===To Do===<br />
<br />
* Feasibility check: [http://www.wxwidgets.org/ wxWidgets] and [http://maiaproject.sourceforge.net/ OpenMaia]. Check if using OpenMaia with wxWidgets is okay for the stimulation synchronization. The timing difference between the highlighting of a choice and the switching of the synchronization square should be no more than the duration of a screen frame (about 16 ms for a 60 Hz screen).<br />
<br />
* Writing of a communication protocol between the BCI2000 developed in the project [[Online P300 and ErrP recognition with BCI2000]]. To be done together with [[User:AndreaSgarlata|Andrea Sgarlata]].<br />
<br />
* Development of a storage system to store information about the layout and the action of the interfaces; an extension of the XML format used by OpenMaia to describe keyboards should be possible.<br />
<br />
==Communication Protocol Requirements==<br />
<br />
* Cross-platform (at least Linux and Windows)<br />
* Based on the IP protocol<br />
* Asynchronous (as much as possible), so as to not block remote processes<br />
* Preferably, the protocol should in ASCII, with fixed-width fields (the number of fields is variable, by necessity).<br />
<br />
===To Do===<br />
<br />
* List of the pieces of information to be transferred between the application and the GUI<br />
<br />
==Communication Protocol==<br />
===UML Sequence Diagram===<br />
====Diagram====<br />
<br />
[[Image:ComunicationProtocol.jpg]]<br />
<br />
====Comments about the diagram====<br />
Structure of MessageA:<br />
* Number of repetitions: number of flashings<br />
* Number of stimulations: number of flashings in one repetition<br />
* List of the stimulations: list of the possible stimulations with its type<br />
* Number of types<br />
* Stimulations sequence: it includes all the repetitions<br />
<br />
Each stimulation must hav associated a type (e.g Icon, Row-Column)<br />
<br />
Structure of SelectionA:<br />
* For each number of stimulation types you have a equal number of ids (e.g for Icon it will be used 1 id, for Row-Column 2 ids)<br />
<br />
It will be also an end asynchronous message, that brings the BCI in pause, and it closes the graphic interface.<br />
<br />
If the syncrony will be lost there's a Reset Message that restarts from the calibration. It will be activated when the number of the classifications is different from the number of the stimulations on the screen</div>RobertoMassiminihttps://airwiki.elet.polimi.it/index.php?title=Talk:Graphical_user_interface_for_an_autonomous_wheelchair&diff=3282Talk:Graphical user interface for an autonomous wheelchair2008-05-30T08:19:15Z<p>RobertoMassimini: </p>
<hr />
<div>==General==<br />
<br />
The interface is used mainly to drive the [[LURCH - The autonomous wheelchair|Lurch wheelchair]]. It has different screens, corresponding to the different rooms or different environments where the wheelchair can move. The screens are organized hierarchically: a main screen permits to choose whether to drive the wheelchair or perform other tasks. The screens used for driving the wheelchair show a map of the environment, with different levels of details; e.g., a map might show just the rooms of a house, and then another map might show the living room with the furniture and ‘interesting’ positions (like near the table, in front of the TV, beside the window...).<br />
<br />
The interface should be as [http://en.wikipedia.org/wiki/Accessibility accessible] as possible. It can be driven by a BCI, used with the touch screen or with the keyboard. The BCI is based on the P300 and ErrP potentials; so the interface should highlight the possible choices on at a time (in orthogonal groups if the choices are numerous), and show the choice detected by the BCI for ErrP-based confirmation.<br />
<br />
Maybe the screen should be updated while the wheelchair navigates; e.g., when the wheelchair enter into the kitchen, the GUI shows the map of the kitchen.<br />
<br />
===To Do===<br />
<br />
* Feasibility check: [http://www.wxwidgets.org/ wxWidgets] and [http://maiaproject.sourceforge.net/ OpenMaia]. Check if using OpenMaia with wxWidgets is okay for the stimulation synchronization. The timing difference between the highlighting of a choice and the switching of the synchronization square should be no more than the duration of a screen frame (about 16 ms for a 60 Hz screen).<br />
<br />
* Writing of a communication protocol between the BCI2000 developed in the project [[Online P300 and ErrP recognition with BCI2000]]. To be done together with [[User:AndreaSgarlata|Andrea Sgarlata]].<br />
<br />
* Development of a storage system to store information about the layout and the action of the interfaces; an extension of the XML format used by OpenMaia to describe keyboards should be possible.<br />
<br />
==Communication Protocol Requirements==<br />
<br />
* Cross-platform (at least Linux and Windows)<br />
* Based on the IP protocol<br />
* Asynchronous (as much as possible), so as to not block remote processes<br />
* Preferably, the protocol should in ASCII, with fixed-width fields (the number of fields is variable, by necessity).<br />
<br />
===To Do===<br />
<br />
* List of the pieces of information to be transferred between the application and the GUI<br />
<br />
==Communication Protocol==<br />
===UML Sequence Diagram===<br />
====Diagram====<br />
<br />
[[Image:ComunicationProtocol.jpg]]<br />
<br />
====Comments about the diagram====<br />
Structure of MessageA:<br />
* Number of repetitions: number of flashings<br />
* Number of stimulations: number of flashings in one repetition<br />
* List of the stimulations: list of the possible stimulations with its type<br />
* Number of types<br />
* Stimulations sequence: it includes all the repetitions<br />
<br />
Each stimulation must hav associated a type (e.g Icon, Row-Column)<br />
<br />
Structure of SelectionA:<br />
* For each number of stimulation types you have a equal number of ids (e.g for Icon it will be used 1 id, for Row-Column 2 ids)</div>RobertoMassiminihttps://airwiki.elet.polimi.it/index.php?title=Talk:Graphical_user_interface_for_an_autonomous_wheelchair&diff=3281Talk:Graphical user interface for an autonomous wheelchair2008-05-30T08:09:11Z<p>RobertoMassimini: </p>
<hr />
<div>==General==<br />
<br />
The interface is used mainly to drive the [[LURCH - The autonomous wheelchair|Lurch wheelchair]]. It has different screens, corresponding to the different rooms or different environments where the wheelchair can move. The screens are organized hierarchically: a main screen permits to choose whether to drive the wheelchair or perform other tasks. The screens used for driving the wheelchair show a map of the environment, with different levels of details; e.g., a map might show just the rooms of a house, and then another map might show the living room with the furniture and ‘interesting’ positions (like near the table, in front of the TV, beside the window...).<br />
<br />
The interface should be as [http://en.wikipedia.org/wiki/Accessibility accessible] as possible. It can be driven by a BCI, used with the touch screen or with the keyboard. The BCI is based on the P300 and ErrP potentials; so the interface should highlight the possible choices on at a time (in orthogonal groups if the choices are numerous), and show the choice detected by the BCI for ErrP-based confirmation.<br />
<br />
Maybe the screen should be updated while the wheelchair navigates; e.g., when the wheelchair enter into the kitchen, the GUI shows the map of the kitchen.<br />
<br />
===To Do===<br />
<br />
* Feasibility check: [http://www.wxwidgets.org/ wxWidgets] and [http://maiaproject.sourceforge.net/ OpenMaia]. Check if using OpenMaia with wxWidgets is okay for the stimulation synchronization. The timing difference between the highlighting of a choice and the switching of the synchronization square should be no more than the duration of a screen frame (about 16 ms for a 60 Hz screen).<br />
<br />
* Writing of a communication protocol between the BCI2000 developed in the project [[Online P300 and ErrP recognition with BCI2000]]. To be done together with [[User:AndreaSgarlata|Andrea Sgarlata]].<br />
<br />
* Development of a storage system to store information about the layout and the action of the interfaces; an extension of the XML format used by OpenMaia to describe keyboards should be possible.<br />
<br />
==Communication Protocol Requirements==<br />
<br />
* Cross-platform (at least Linux and Windows)<br />
* Based on the IP protocol<br />
* Asynchronous (as much as possible), so as to not block remote processes<br />
* Preferably, the protocol should in ASCII, with fixed-width fields (the number of fields is variable, by necessity).<br />
<br />
===To Do===<br />
<br />
* List of the pieces of information to be transferred between the application and the GUI<br />
<br />
==Communication Protocol==<br />
===UML Sequence Diagram===<br />
====Diagram====<br />
<br />
[[Image:ComunicationProtocol.jpg]]<br />
<br />
====Comments about the diagram====<br />
Structure of MessageA:<br />
* Number of repetitions: number of flashings<br />
* Number of stimulations: number of flashings in one repetition<br />
* List of the stimulations: list of the possible stimulations with its type<br />
* Number of types<br />
* Stimulations sequence: it includes all the repetitions<br />
<br />
Each stimulation must hav associated a type (e.g Icon, Row-Column)<br />
<br />
Structure of SelectionA:</div>RobertoMassiminihttps://airwiki.elet.polimi.it/index.php?title=Talk:Graphical_user_interface_for_an_autonomous_wheelchair&diff=3280Talk:Graphical user interface for an autonomous wheelchair2008-05-30T08:06:15Z<p>RobertoMassimini: </p>
<hr />
<div>==General==<br />
<br />
The interface is used mainly to drive the [[LURCH - The autonomous wheelchair|Lurch wheelchair]]. It has different screens, corresponding to the different rooms or different environments where the wheelchair can move. The screens are organized hierarchically: a main screen permits to choose whether to drive the wheelchair or perform other tasks. The screens used for driving the wheelchair show a map of the environment, with different levels of details; e.g., a map might show just the rooms of a house, and then another map might show the living room with the furniture and ‘interesting’ positions (like near the table, in front of the TV, beside the window...).<br />
<br />
The interface should be as [http://en.wikipedia.org/wiki/Accessibility accessible] as possible. It can be driven by a BCI, used with the touch screen or with the keyboard. The BCI is based on the P300 and ErrP potentials; so the interface should highlight the possible choices on at a time (in orthogonal groups if the choices are numerous), and show the choice detected by the BCI for ErrP-based confirmation.<br />
<br />
Maybe the screen should be updated while the wheelchair navigates; e.g., when the wheelchair enter into the kitchen, the GUI shows the map of the kitchen.<br />
<br />
===To Do===<br />
<br />
* Feasibility check: [http://www.wxwidgets.org/ wxWidgets] and [http://maiaproject.sourceforge.net/ OpenMaia]. Check if using OpenMaia with wxWidgets is okay for the stimulation synchronization. The timing difference between the highlighting of a choice and the switching of the synchronization square should be no more than the duration of a screen frame (about 16 ms for a 60 Hz screen).<br />
<br />
* Writing of a communication protocol between the BCI2000 developed in the project [[Online P300 and ErrP recognition with BCI2000]]. To be done together with [[User:AndreaSgarlata|Andrea Sgarlata]].<br />
<br />
* Development of a storage system to store information about the layout and the action of the interfaces; an extension of the XML format used by OpenMaia to describe keyboards should be possible.<br />
<br />
==Communication Protocol Requirements==<br />
<br />
* Cross-platform (at least Linux and Windows)<br />
* Based on the IP protocol<br />
* Asynchronous (as much as possible), so as to not block remote processes<br />
* Preferably, the protocol should in ASCII, with fixed-width fields (the number of fields is variable, by necessity).<br />
<br />
===To Do===<br />
<br />
* List of the pieces of information to be transferred between the application and the GUI<br />
<br />
==Communication Protocol==<br />
===UML Sequence Diagram===<br />
====Diagram====<br />
<br />
[[Image:ComunicationProtocol.jpg]]<br />
<br />
====Comments about the diagram====<br />
Structure of MessageA:<br />
. Number of repetitions: number of flashings<br />
. Number of stimulations: number of flashings in one repetition<br />
. List of the stimulations: list of the possible stimulations with its type<br />
. Number of types<br />
. Stimulations sequence: it includes all the repetitions</div>RobertoMassiminihttps://airwiki.elet.polimi.it/index.php?title=Talk:Graphical_user_interface_for_an_autonomous_wheelchair&diff=3279Talk:Graphical user interface for an autonomous wheelchair2008-05-30T08:05:28Z<p>RobertoMassimini: </p>
<hr />
<div>==General==<br />
<br />
The interface is used mainly to drive the [[LURCH - The autonomous wheelchair|Lurch wheelchair]]. It has different screens, corresponding to the different rooms or different environments where the wheelchair can move. The screens are organized hierarchically: a main screen permits to choose whether to drive the wheelchair or perform other tasks. The screens used for driving the wheelchair show a map of the environment, with different levels of details; e.g., a map might show just the rooms of a house, and then another map might show the living room with the furniture and ‘interesting’ positions (like near the table, in front of the TV, beside the window...).<br />
<br />
The interface should be as [http://en.wikipedia.org/wiki/Accessibility accessible] as possible. It can be driven by a BCI, used with the touch screen or with the keyboard. The BCI is based on the P300 and ErrP potentials; so the interface should highlight the possible choices on at a time (in orthogonal groups if the choices are numerous), and show the choice detected by the BCI for ErrP-based confirmation.<br />
<br />
Maybe the screen should be updated while the wheelchair navigates; e.g., when the wheelchair enter into the kitchen, the GUI shows the map of the kitchen.<br />
<br />
===To Do===<br />
<br />
* Feasibility check: [http://www.wxwidgets.org/ wxWidgets] and [http://maiaproject.sourceforge.net/ OpenMaia]. Check if using OpenMaia with wxWidgets is okay for the stimulation synchronization. The timing difference between the highlighting of a choice and the switching of the synchronization square should be no more than the duration of a screen frame (about 16 ms for a 60 Hz screen).<br />
<br />
* Writing of a communication protocol between the BCI2000 developed in the project [[Online P300 and ErrP recognition with BCI2000]]. To be done together with [[User:AndreaSgarlata|Andrea Sgarlata]].<br />
<br />
* Development of a storage system to store information about the layout and the action of the interfaces; an extension of the XML format used by OpenMaia to describe keyboards should be possible.<br />
<br />
==Communication Protocol Requirements==<br />
<br />
* Cross-platform (at least Linux and Windows)<br />
* Based on the IP protocol<br />
* Asynchronous (as much as possible), so as to not block remote processes<br />
* Preferably, the protocol should in ASCII, with fixed-width fields (the number of fields is variable, by necessity).<br />
<br />
===To Do===<br />
<br />
* List of the pieces of information to be transferred between the application and the GUI<br />
<br />
==Communication Protocol==<br />
===UML Sequence Diagram===<br />
<br />
[[Image:ComunicationProtocol.jpg]]<br />
<br />
=Comments about the diagram:=<br />
Structure of MessageA:<br />
. Number of repetitions: number of flashings<br />
. Number of stimulations: number of flashings in one repetition<br />
. List of the stimulations: list of the possible stimulations with its type<br />
. Number of types<br />
. Stimulations sequence: it includes all the repetitions</div>RobertoMassiminihttps://airwiki.elet.polimi.it/index.php?title=Talk:Graphical_user_interface_for_an_autonomous_wheelchair&diff=3277Talk:Graphical user interface for an autonomous wheelchair2008-05-29T19:25:09Z<p>RobertoMassimini: </p>
<hr />
<div>==General==<br />
<br />
The interface is used mainly to drive the [[LURCH - The autonomous wheelchair|Lurch wheelchair]]. It has different screens, corresponding to the different rooms or different environments where the wheelchair can move. The screens are organized hierarchically: a main screen permits to choose whether to drive the wheelchair or perform other tasks. The screens used for driving the wheelchair show a map of the environment, with different levels of details; e.g., a map might show just the rooms of a house, and then another map might show the living room with the furniture and ‘interesting’ positions (like near the table, in front of the TV, beside the window...).<br />
<br />
The interface should be as [http://en.wikipedia.org/wiki/Accessibility accessible] as possible. It can be driven by a BCI, used with the touch screen or with the keyboard. The BCI is based on the P300 and ErrP potentials; so the interface should highlight the possible choices on at a time (in orthogonal groups if the choices are numerous), and show the choice detected by the BCI for ErrP-based confirmation.<br />
<br />
Maybe the screen should be updated while the wheelchair navigates; e.g., when the wheelchair enter into the kitchen, the GUI shows the map of the kitchen.<br />
<br />
===To Do===<br />
<br />
* Feasibility check: [http://www.wxwidgets.org/ wxWidgets] and [http://maiaproject.sourceforge.net/ OpenMaia]. Check if using OpenMaia with wxWidgets is okay for the stimulation synchronization. The timing difference between the highlighting of a choice and the switching of the synchronization square should be no more than the duration of a screen frame (about 16 ms for a 60 Hz screen).<br />
<br />
* Writing of a communication protocol between the BCI2000 developed in the project [[Online P300 and ErrP recognition with BCI2000]]. To be done together with [[User:AndreaSgarlata|Andrea Sgarlata]].<br />
<br />
* Development of a storage system to store information about the layout and the action of the interfaces; an extension of the XML format used by OpenMaia to describe keyboards should be possible.<br />
<br />
==Communication Protocol Requirements==<br />
<br />
* Cross-platform (at least Linux and Windows)<br />
* Based on the IP protocol<br />
* Asynchronous (as much as possible), so as to not block remote processes<br />
* Preferably, the protocol should in ASCII, with fixed-width fields (the number of fields is variable, by necessity).<br />
<br />
===To Do===<br />
<br />
* List of the pieces of information to be transferred between the application and the GUI<br />
<br />
==Communication Protocol==<br />
===UML Sequence Diagram===<br />
<br />
[[Image:ComunicationProtocol.jpg]]</div>RobertoMassiminihttps://airwiki.elet.polimi.it/index.php?title=File:ComunicationProtocol.jpg&diff=3276File:ComunicationProtocol.jpg2008-05-29T19:14:54Z<p>RobertoMassimini: </p>
<hr />
<div></div>RobertoMassiminihttps://airwiki.elet.polimi.it/index.php?title=Talk:Graphical_user_interface_for_an_autonomous_wheelchair&diff=3275Talk:Graphical user interface for an autonomous wheelchair2008-05-29T19:13:53Z<p>RobertoMassimini: </p>
<hr />
<div>==General==<br />
<br />
The interface is used mainly to drive the [[LURCH - The autonomous wheelchair|Lurch wheelchair]]. It has different screens, corresponding to the different rooms or different environments where the wheelchair can move. The screens are organized hierarchically: a main screen permits to choose whether to drive the wheelchair or perform other tasks. The screens used for driving the wheelchair show a map of the environment, with different levels of details; e.g., a map might show just the rooms of a house, and then another map might show the living room with the furniture and ‘interesting’ positions (like near the table, in front of the TV, beside the window...).<br />
<br />
The interface should be as [http://en.wikipedia.org/wiki/Accessibility accessible] as possible. It can be driven by a BCI, used with the touch screen or with the keyboard. The BCI is based on the P300 and ErrP potentials; so the interface should highlight the possible choices on at a time (in orthogonal groups if the choices are numerous), and show the choice detected by the BCI for ErrP-based confirmation.<br />
<br />
Maybe the screen should be updated while the wheelchair navigates; e.g., when the wheelchair enter into the kitchen, the GUI shows the map of the kitchen.<br />
<br />
===To Do===<br />
<br />
* Feasibility check: [http://www.wxwidgets.org/ wxWidgets] and [http://maiaproject.sourceforge.net/ OpenMaia]. Check if using OpenMaia with wxWidgets is okay for the stimulation synchronization. The timing difference between the highlighting of a choice and the switching of the synchronization square should be no more than the duration of a screen frame (about 16 ms for a 60 Hz screen).<br />
<br />
* Writing of a communication protocol between the BCI2000 developed in the project [[Online P300 and ErrP recognition with BCI2000]]. To be done together with [[User:AndreaSgarlata|Andrea Sgarlata]].<br />
<br />
* Development of a storage system to store information about the layout and the action of the interfaces; an extension of the XML format used by OpenMaia to describe keyboards should be possible.<br />
<br />
==Communication Protocol Requirements==<br />
<br />
* Cross-platform (at least Linux and Windows)<br />
* Based on the IP protocol<br />
* Asynchronous (as much as possible), so as to not block remote processes<br />
* Preferably, the protocol should in ASCII, with fixed-width fields (the number of fields is variable, by necessity).<br />
<br />
===To Do===<br />
<br />
* List of the pieces of information to be transferred between the application and the GUI<br />
<br />
==Communication Protocol==<br />
<br />
To do...<br />
[[Image:ComunicationProtocol.jpg]]</div>RobertoMassiminihttps://airwiki.elet.polimi.it/index.php?title=Projects&diff=2935Projects2008-05-14T21:17:08Z<p>RobertoMassimini: </p>
<hr />
<div>''This page is a repository of links to the pages describing the '''projects''' we are currently working on at AIRLab. <br />
See the list of our finished projects on the [[Finished Projects]] page.''<br />
<br />
== Ongoing projects ==<br />
''by research area (areas are defined in the [[Main Page]]); for each project a name and a link to its AIRWiki page is given''<br />
<br />
==== [[Agents, Multiagent Systems, Agencies]] ====<br />
----<br />
<br />
* [[Multiagent cooperation|Multiagent cooperating system]]<br />
<br />
* [[Planning in Ambient Intelligence scenarios| Planning in Ambient Intelligence scenarios]]<br />
<br />
==== [[BioSignal Analysis]] ====<br />
----<br />
====== [[Affective Computing]] ======<br />
<br />
* [[Relatioship between Cognition and Emotion in Rehabilitation Robotics|Relationship between Cognition and Emotion in Rehabilitation Robotics]]<br />
* [[Driving companions]]<br />
* [[Emotion from Interaction]]<br />
* [[Affective Devices]]<br />
<br />
====== [[Brain Computer Interface]] ======<br />
<br />
* [[Command wheelchair using BCI2000]]<br />
* [[BCI based on Motor Imagery]]<br />
* [[Graphical user interface for an autonomous wheelchair]]<br />
<br />
====== [[Automatic Detection Of Sleep Stages]] ======<br />
<br />
* [[Sleep Staging with HMM]]<br />
<br />
====== [[Analysis of the Olfactory Signal]] ======<br />
<br />
* [[Lung Cancer Detection by an Electronic Nose]]<br />
* [[HE-KNOWS - An electronic nose]]<br />
<br />
==== [[Computer Vision and Image Analysis]] ====<br />
----<br />
<br />
* [[Automated extraction of laser streaks and range profiles]]<br />
<br />
* [[Data collection for mutual calibration|Data collection for laser-rangefinder and camera calibration]]<br />
<br />
* [[Particle filter for object tracking]]<br />
<br />
* [[Wii Remote headtracking and active projector]]<br />
<br />
==== [[Machine Learning]] ====<br />
----<br />
* [[Adaptive Reinforcement Learning Multiagent Coordination in Real-Time Computer Games|Adaptive Reinforcement Learning Multiagent Coordination in Real-Time Computer Games]]<br />
<br />
==== [[Ontologies and Semantic Web]] ====<br />
----<br />
* [[JOFS|JOFS, Java Owl File Storage]]<br />
* [[FolksOnt|FolksOnt]]<br />
* [[Extending a wiki with semantic templates]]<br />
* [[GeoOntology|Geographic ontology for a semantic wiki]]<br />
<br />
==== [[Philosophy of Artificial Intelligence]] ====<br />
----<br />
==== [[Robotics]] ====<br />
----<br />
* [[LURCH - The autonomous wheelchair]]<br />
<br />
* [[Rawseeds|RAWSEEDS]]<br />
<br />
* [[Balancing robots: Tilty, TiltOne]]<br />
<br />
* [[ROBOWII ]]<br />
<br />
* [[PoliManus]]<br />
<br />
==== [[Soft Computing]] ====<br />
----<br />
<br />
<br />
<br />
== Note for students ==<br />
<br />
If you are a student and there isn't a '''page describing your project''', it is YOUR task to create it and populate it with (meaningful) content. If there IS a page describing your project, your task is to complete that page with (useful and comprehensive) information about your own contribution to the project. Be aware that the quality of your work (or lack of it) on the AIRWiki will be evaluated by the Teachers and will influence your grades.<br />
<br />
Instructions to add a new project or to add content to an existing project page are available at [[Projects - HOWTO]].</div>RobertoMassimini