Robocentric MoonSLAM

Title: Robocentric implementation in the MoonSLAM framework
Image: RobocentricSLAM.gif

Description: Simultaneous Localization and Mapping (SLAM) is one of the basic functionalities required of an autonomous robot. In the past we developed a framework for building SLAM algorithms based on the Extended Kalman Filter (EKF) and vision sensors. The current implementation of EKF SLAM in this framework uses a world-centric approach, but the literature shows that a robocentric approach can provide better performance on small maps. We would like to have both implementations in order to compare their results in two scenarios: pure visual odometry and conditionally independent submapping.
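
To make the comparison concrete, here is a minimal sketch of the two state parameterizations as they are commonly written in the robocentric SLAM literature; this is an illustration only, not the actual MoonSLAM state layout. A world-centric EKF stacks the robot pose and the landmarks expressed in a fixed world frame W, whereas a robocentric EKF expresses everything, including the world frame itself, in the current robot frame R:

  % world-centric state: robot pose and landmarks expressed in the world frame W
  \mathbf{x}_{wc} = \left[ \mathbf{x}^{W}_{R},\; \mathbf{m}^{W}_{1},\; \dots,\; \mathbf{m}^{W}_{n} \right]^{T}
  % robocentric state: world frame and landmarks expressed in the current robot frame R
  \mathbf{x}_{rc} = \left[ \mathbf{x}^{R}_{W},\; \mathbf{m}^{R}_{1},\; \dots,\; \mathbf{m}^{R}_{n} \right]^{T}

In the robocentric filter each prediction step is followed by a composition that re-expresses the whole state in the new robot frame, so the robot remains at the origin with small uncertainty; this is what reduces linearization errors and motivates the better performance expected on small maps.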

Material

  • A framework for multisensor SLAM using the world-centric approach
  • Papers and reports about robocentric SLAM

Expected outcome:

  • A fully functional robocentric version of the MoonSLAM framework

Required skills or skills to be acquired:

  • Basic background in computer vision
  • Background in Kalman filtering
  • C++ programming under Linux
Tutor: MatteoMatteucci (matteo.matteucci@polimi.it), SimoneCeriani (ceriani@elet.polimi.it)
Start: 2012/04/01
Students: 1 - 2
CFU: 20 - 20
Research Area: Robotics
Research Topic: none
Level: Ms
Type: Thesis
Status: Closed