Robocentric MoonSLAM

Title: Robocentric implementation in the MoonSLAM framework

Image:RobocentricSLAM.jpg

Description: Simultaneous Localization and Mapping (SLAM) is one of the basic functionalities required from an autonomous robot. In the past we have developed a framework for building SLAM algorithms based on the Extended Kalman Filter (EKF) and vision sensors. The current implementation of EKF SLAM in this framework uses a world-centric approach, but the literature shows that a robocentric approach can provide better performance on small maps. We would like to have both implementations in order to compare their results in two scenarios: pure visual odometry and conditionally independent submapping.
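
To illustrate the distinction, the following minimal C++ sketch (illustrative only, not taken from the MoonSLAM code base; names such as Pose2D and toNewRobotFrame are hypothetical) shows the key robocentric operation: since features are stored in the current robot frame, after every motion step the whole map is re-expressed in the new robot frame by composing it with the inverse of the motion increment. In the actual EKF this composition is linearized and applied to both the state mean and its covariance.

  // Illustrative 2D sketch of the robocentric map update (not MoonSLAM code).
  #include <cmath>
  #include <cstdio>
  #include <vector>
  
  struct Pose2D  { double x, y, theta; };  // robot motion increment, in the previous robot frame
  struct Point2D { double x, y; };         // feature position, expressed in the robot frame
  
  // Express a feature, given in the previous robot frame, in the new robot frame:
  // p_new = (inverse of motion) composed with p_old.
  Point2D toNewRobotFrame(const Pose2D& motion, const Point2D& p) {
      const double c  = std::cos(motion.theta);
      const double s  = std::sin(motion.theta);
      const double dx = p.x - motion.x;
      const double dy = p.y - motion.y;
      return { c * dx + s * dy, -s * dx + c * dy };
  }
  
  int main() {
      std::vector<Point2D> features = { {2.0, 0.0}, {1.0, 1.0} };  // map, in the robot frame
      Pose2D motion{0.5, 0.0, M_PI / 2.0};  // robot moves 0.5 m forward and turns 90 deg left
  
      // Robocentric update: the entire map is re-expressed in the new robot frame.
      for (auto& p : features) p = toNewRobotFrame(motion, p);
  
      for (const auto& p : features) std::printf("%.3f %.3f\n", p.x, p.y);
      return 0;
  }

In a world-centric filter this step is not needed (features stay in a fixed world frame), but linearization errors accumulate in the robot pose; keeping the map robot-relative is what gives the robocentric formulation its advantage on small maps.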

Material:

  • A framework for multisensor SLAM using the world-centric approach
  • Papers and reports about robocentric SLAM

Expected outcome:

  • A fully functional robocentric version of the MoonSLAM framework

Required skills or skills to be acquired:

  • Basic background in computer vision
  • Background in Kalman filtering
  • C++ programming under Linux
Tutor: MatteoMatteucci (matteo.matteucci@polimi.it), SimoneCeriani (ceriani@elet.polimi.it)
Start: 2012/04/01
Students: 1 - 2
CFU: 20 - 20
Research Area: Robotics
Research Topic: none
Level: Ms
Type: Thesis
Status: Active