From AIRWiki
Short Description: The goal of the project is to develop a drone able to interact with people, in some respects like a pet.
Coordinator: AndreaBonarini
Tutor: AndreaBonarini
Students: ZhizhongLi
Research Area: Robotics
Research Topic: Robot development
Start: 4/04/2013
End: 14/06/2013
Status: Closed
Level: Bs


The goal of this project is to develop a robotic pet: a drone that behaves like a pet in some respects.


Create an interactive robot based on the AR.Drone.

Useful readings


  • AR.Drone driver [1]
  • GUI controller, drone controller, and drone state estimation [2]
  • Interfacing ROS with OpenCV [3]
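As a hedged sketch of how the readings above fit together: a minimal ROS node could subscribe to the drone's camera topic and hand each frame to OpenCV via cv_bridge. The topic name follows ardrone_autonomy's usual convention and the node layout is an illustrative assumption, not taken from the project itself.

```python
# Hypothetical sketch: feed AR.Drone camera frames from ROS into OpenCV.
# Assumes the driver publishes the front camera on /ardrone/front/image_raw
# (ardrone_autonomy's usual topic) and that cv_bridge is installed; both are
# assumptions, not details confirmed by this page.

FRONT_CAMERA_TOPIC = "/ardrone/front/image_raw"

def image_callback(msg, bridge, frames):
    """Convert one ROS Image message to an OpenCV BGR array and store it."""
    frames.append(bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8"))

def main():
    # ROS imports are deferred so the callback logic above can be exercised
    # without a ROS installation.
    import rospy
    from cv_bridge import CvBridge
    from sensor_msgs.msg import Image

    rospy.init_node("pet_drone_vision")
    bridge, frames = CvBridge(), []
    rospy.Subscriber(FRONT_CAMERA_TOPIC, Image,
                     lambda msg: image_callback(msg, bridge, frames))
    rospy.spin()
```

Keeping the frame conversion in a plain function, separate from the ROS wiring, makes the vision code easy to test off the robot.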

Object Detection

  • OpenCV, feature detection (SIFT/SURF) [4]
  • OpenCV, object detection [5]
  • Predator algorithm, TLD (Tracking-Learning-Detection) [6]
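To make the detection step concrete, here is a small, hedged sketch of the decision logic that typically sits on top of SIFT/SURF-style matching (or TLD confidences): each learned object gets a count of good keypoint matches in the current frame, and an object counts as "seen" only if its score clears a threshold. The function name and threshold value are illustrative assumptions, not taken from the project.

```python
# Hypothetical decision logic on top of feature matching: pick the learned
# object with the most good matches, but only if it clears a minimum-evidence
# threshold (otherwise report nothing seen). Names and the default threshold
# are illustrative assumptions.

def best_match(match_counts, min_matches=10):
    """Return the label with the highest match count, or None if no object
    has enough good matches to be trusted."""
    if not match_counts:
        return None
    label, count = max(match_counts.items(), key=lambda kv: kv[1])
    return label if count >= min_matches else None
```

For example, `best_match({"shoe": 25, "pizza_box": 3})` selects `"shoe"`, while a frame with only weak matches yields `None` so the drone takes no action.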

Final result

Based on OpenCV, ROS, and the TLD algorithm, the AR.Drone can detect multiple objects and respond with the corresponding behaviors, like a pet. A detailed description follows:

Having learned three objects (a shoe, the AIRLab trademark, and a pizza box), the drone reacts as follows:

  • When it sees the shoe, it takes off.
  • When it sees the AIRLab trademark, it dances.
  • When it sees the pizza box, it lands.
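The three reactions above amount to a small mapping from detected object to drone command. As a hedged illustration: the label strings are made up for this sketch, the takeoff and land topics follow ardrone_autonomy's usual names, and "dance" is represented as velocity commands on `cmd_vel`; none of these details are confirmed by the page.

```python
# Hypothetical object-to-behavior mapping. /ardrone/takeoff and /ardrone/land
# are ardrone_autonomy's usual command topics; the "dance" behavior is
# sketched as velocity commands on cmd_vel. Labels and topic choices are
# assumptions for illustration.

BEHAVIOR_TOPICS = {
    "shoe": "/ardrone/takeoff",    # seeing the shoe -> take off
    "airlab_logo": "cmd_vel",      # seeing the AIRLab trademark -> dance
    "pizza_box": "/ardrone/land",  # seeing the pizza box -> land
}

def react(detected_label):
    """Return the topic to publish on for a detected object, or None for
    unknown objects (the drone simply keeps hovering)."""
    return BEHAVIOR_TOPICS.get(detected_label)
```

Keeping the mapping in one table makes it easy to teach the drone new object/behavior pairs without touching the detection code.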

A video is available on YouTube.