Talk:Robotic Battlefield Control

From AIRWiki
Revision as of 08:59, 6 September 2012 by AndreaSalvi (Talk | contribs) (ArchLinux)



Playfield geometry

The playfield is divided into three parts:

  • two home fields (one for each player), colored blue for the human player and red for the autonomous robot. The robots begin a match in these fields, and they are invulnerable while they are inside them; moreover, while inside these areas, they will be automatically "repaired" of any damage they might have suffered during the match. If a robot is severely damaged during a fight, it will switch to an "emergency mode" that forces it to head back home for repairs.
  • the midfield, where the control point is also located. The midfield can be cluttered with obstacles of various kinds.
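The home-field rules above can be sketched as a small state update. This is only an illustrative model: the `Robot` class, the field names, and the emergency damage threshold are assumptions, not part of any actual game code.

```python
# Illustrative sketch of the home-field rules: repair and invulnerability
# at home, and an "emergency mode" above an assumed damage threshold.

EMERGENCY_DAMAGE = 70  # assumed threshold, in percent

class Robot:
    def __init__(self, home_field):
        self.home_field = home_field  # "blue" or "red"
        self.damage = 0
        self.mode = "normal"

    def update(self, current_field):
        """Apply the field rules; return True while the robot is invulnerable."""
        in_home = (current_field == self.home_field)
        if in_home:
            self.damage = 0            # automatic repair at home
        if self.damage >= EMERGENCY_DAMAGE:
            self.mode = "emergency"    # forced to head back home
        elif in_home:
            self.mode = "normal"
        return in_home
```

A heavily damaged robot in the midfield switches to `"emergency"`, and a single update inside its own home field repairs it and restores `"normal"` mode.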

Hardware Platforms

Remotely controlled robot

The user-controlled robot will be based on the Spykee platform, which features a reprogrammable Linux-based firmware, and has a frontal camera and a monodirectional microphone as main sensors. Spykee will be controlled through an Android-based program (for smartphones and tablets) which will also function as the setup application for the game itself.

The robot will also be "dressed" in blue on all sides, in order to be recognizable by the autonomous robot.


Autonomous robot

Like the remotely controlled robot, the autonomous robot will be colored (in red) in order to be easily recognizable.

Software Architecture

In order to keep the architecture as light as possible, a subsumption architecture, enhanced with obstacle maps (see below), will be used.
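A subsumption architecture can be sketched as a fixed-priority arbiter: behaviors are checked from highest to lowest priority, and the first one that wants control subsumes the layers below it. The behavior names, the sensor dictionary, and the action strings here are illustrative assumptions.

```python
# Minimal subsumption-style arbitration: the first behavior (in priority
# order) that returns an action takes control of the robot.

def go_home(sensors):
    if sensors["damage"] >= 70:           # emergency: retreat for repairs
        return "head_to_home_field"
    return None

def avoid_obstacle(sensors):
    if sensors["obstacle_ahead"]:
        return "turn_away"
    return None

def seek_control_point(sensors):
    return "approach_control_point"       # default lowest-priority goal

BEHAVIORS = [go_home, avoid_obstacle, seek_control_point]  # high -> low

def arbitrate(sensors):
    for behavior in BEHAVIORS:
        action = behavior(sensors)
        if action is not None:            # higher layer subsumes lower ones
            return action
```

The appeal for this project is that each layer stays tiny and reactive; the obstacle maps mentioned above would feed the mid-priority layers rather than a global planner.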

Sensors

  • An omni-directional microphone: to sense the vicinity and the probable position of the remotely controlled robot, and to hear the beeper indicating the position of the control point.
  • A frontal camera, which is able to recognize arbitrarily shaped objects (blobs) of a certain color, and returns the coordinates of the vertices of the rectangles that enclose them at no additional cost for the main microcontroller/CPU. This is used:
    • to recognize the "home fields" locations;
    • to have a visual confirmation of the presence of the control point;
    • to identify the remotely controlled robot.
  • Proximity sensors on all sides of the robot, to detect nearby obstacles. To avoid an overly coarse identification of obstacles (which would rule out viable passages for the robot), different kinds of sensors can be combined: for example, ultrasound sonar sensors for coarse detection and infrared sensors for finer-grained sensing.
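The two-stage coarse/fine idea from the last bullet can be sketched as follows: sonar flags a sector as possibly blocked, and an IR reading then decides whether a narrow passage is actually viable. The distance thresholds are assumed values for illustration.

```python
# Combine a coarse sonar reading with a fine IR reading for one sector.
# Thresholds are illustrative assumptions, in centimeters.

SONAR_CLEAR_CM = 60   # beyond this, the sector is clearly free
IR_BLOCKED_CM = 25    # IR reading below this confirms a real obstacle

def sector_blocked(sonar_cm, ir_cm):
    if sonar_cm >= SONAR_CLEAR_CM:
        return False                  # coarse check: nothing nearby
    return ir_cm < IR_BLOCKED_CM      # fine check: confirm with IR
```

This way a sonar echo off the edge of a passage does not mark the whole sector as blocked unless the IR sensor agrees.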

Pending further analysis

  • Odometry: since we have decided to keep an "explorable map" (that is, the map of the explored world, with reasonable metrical errors) in memory, it is imperative to find an economical and simple way to track odometry. It does not have to be extremely accurate, but it needs to be accurate enough that maps do not have to be recalculated from scratch in the middle of a match. Use encoders on the wheels?
  • Hardware architecture: Arduino-like or ARM-based? ARM-based boards (like the Raspberry Pi) are more expensive but also more powerful, and can run libraries like OpenCV. If ARM-based boards are unavailable, nano-ITX or pico-ITX x86 boards can be used.
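If wheel encoders are used on a differential-drive base, the odometry update is cheap enough for any of the boards discussed above. The sketch below integrates one encoder reading into a pose; wheel radius, track width, and ticks per revolution are assumed values for illustration.

```python
import math

# Differential-drive odometry from wheel-encoder ticks.
# All physical constants below are illustrative assumptions.

WHEEL_RADIUS = 0.03      # m
TRACK_WIDTH = 0.20       # m, distance between the wheels
TICKS_PER_REV = 360

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Integrate one pair of encoder readings into the pose (x, y, theta)."""
    per_tick = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = left_ticks * per_tick
    d_right = right_ticks * per_tick
    d_center = (d_left + d_right) / 2
    d_theta = (d_right - d_left) / TRACK_WIDTH
    # Midpoint heading gives a slightly better arc approximation.
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta
```

Drift accumulates, which is exactly why the map only needs "reasonable metrical errors" rather than exact metric accuracy.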

To be discussed

  • Compass: a compass might be useful for the robot, in order to know its orientation in the world. Is it also possible to add one to Spykee?
  • Mobile robot type: Wheeled with differential drive? Synchronous drive? With tracks?
  • Tactical AI: How to implement it? We want the robot's reactions to be as responsive and believable as possible, since the aim of the game is to entertain the user with a challenge.


Implementation

Operating System

Debian 6.0 "Squeeze"

If we are going to use Debian 6.0 as the "backbone" of our robot, and if we are going to use MRPT (see below) and all its features, then we must compile many libraries by hand, including MRPT itself, VTK, Eigen, Point Cloud Library and their dependencies. It can be quite a chore.

ArchLinux

The distro follows a rolling release model (like Gentoo, but with precompiled and optimized packages), so all system packages are always up to date. This shortens compilation time a lot. Like Debian 6.0, it also supports the RaspberryPi through the ARM port ArchLinux ARM.

Ubuntu 12.04 LTS

Ongoing testing.


"Strong" themes for the thesis

  • "Reachable map" concept
  • Planning movement through the identification of landmarks (the home field and the control point).
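The "reachable map" concept can be illustrated with a flood fill over an occupancy grid: starting from the robot's cell, mark every cell it can actually reach around the obstacles. The grid encoding (0 = free, 1 = obstacle, 4-connected movement) is an assumption for this sketch.

```python
from collections import deque

# Flood-fill the free cells of an occupancy grid reachable from `start`.
# Encoding assumption: grid[r][c] == 0 means free, 1 means obstacle.

def reachable_cells(grid, start):
    rows, cols = len(grid), len(grid[0])
    seen = {start}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen
```

Cells outside this set can be ignored by the planner even if they look free, which is what distinguishes a reachable map from a plain obstacle map.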

Useful and interesting links

Frameworks, Libraries, etc.

  • YARP, Yet Another Robot Platform: lightweight open-source robot platform, compatible with Linux, Windows, and Mac, written in C++. It is first and foremost a middleware aimed at creating an actual robot software platform, rather than an integrated, all-inclusive environment per se.
  • URBI: another open-source robot platform, but probably heavy on the CPU. It features a reusable library of modules written in C++, an orchestration/scripting language, and compatibility with ROS modules.

Papers

  • MonoSLAM: paper about the single-camera Simultaneous Localization And Mapping algorithm.
  • [2]: notes about implementing a subsumption architecture in plain C