Short Description: Simultaneous Localization and Mapping (SLAM) using omnidirectional cameras
Research Area: Computer Vision and Image Analysis
SLAM (Simultaneous Localization And Mapping) is a technique that allows a robot moving in an unknown environment to build a map of its surroundings and, at the same time, to localize itself within that map.
This technique is of great interest because it is a prerequisite for high-level functionalities such as autonomous navigation.
The literature offers a wide range of solutions to the SLAM problem, differing in the number and type of sensors employed.
In recent years, SLAM solutions that use a single camera as the only sensor have attracted notable interest, because a camera is a low-cost, low-power sensor that rapidly acquires a large amount of information.
This category includes SLAM solutions based on an omnidirectional camera.
An omnidirectional camera is a camera with a wide field of view (usually 360°). Two types of omnidirectional cameras are currently on the market: dioptric and catadioptric.
A dioptric omnidirectional camera, often referred to as a fish-eye camera, uses a special lens that captures rays over a field of view of nearly 180°.
A catadioptric omnidirectional camera combines a suitably shaped mirror (hyperbolic, elliptic, or parabolic) placed in front of a standard camera (perspective or orthographic).
This second type is of greater interest to the robotics community: despite the loss of the central area of the image (where the mirror reflects the camera itself), it provides a valuable increase in the field of view (greater than 180°).
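To illustrate how a central catadioptric camera forms an image, the unified sphere model is a common choice in the literature: a 3D point is first projected onto a unit sphere, then perspectively projected from a point shifted along the optical axis by a mirror parameter. The sketch below is a minimal illustration with hypothetical calibration values, not code from the MoonSLAM project.

```python
import numpy as np

def unified_projection(P, xi, K):
    """Project a 3D point P (in the camera frame) to pixel coordinates
    using the unified sphere model for central catadioptric cameras.
    xi encodes the mirror shape (xi = 0: plain perspective camera,
    xi = 1: parabolic mirror)."""
    # 1. Project the point onto the unit sphere centred at the origin.
    Ps = P / np.linalg.norm(P)
    # 2. Perspective projection from a centre shifted by xi along z.
    x = Ps[0] / (Ps[2] + xi)
    y = Ps[1] / (Ps[2] + xi)
    # 3. Apply the intrinsic parameters (focal lengths, principal point).
    u = K[0, 0] * x + K[0, 2]
    v = K[1, 1] * y + K[1, 2]
    return u, v

# Hypothetical intrinsics for illustration only.
K = np.array([[300.0,   0.0, 320.0],
              [  0.0, 300.0, 240.0],
              [  0.0,   0.0,   1.0]])
xi = 0.9  # mirror close to parabolic

u, v = unified_projection(np.array([1.0, 0.5, 2.0]), xi, K)
```

A useful sanity check of the model: a point on the optical axis, such as (0, 0, 1), projects exactly to the principal point (320, 240) for any value of xi.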
The objective of the work described here is to extend the existing MoonSLAM project by adding support for an omnidirectional camera.
The MoonSLAM project (developed by Simone Ceriani) consists of a set of libraries that support the development of SLAM systems.