The emotional gesture is implemented both for the movement on the floor and for the cover of the bin. It is based on: speed, acceleration, rhythm, and direction (modality). The gesture is intrinsic to the character (constant mood), and special gestures are triggered by events that activate emotions (e.g., happiness results from somebody putting something in the bin).
>Distance sensors: IR or sonar covering part of the front and sides of the robot
>Pixy camera: detects the size and position of color blobs
>Sensor for trash (switch or IR on the mouth)
Triskarino with R2P for motor control and an Arduino as the "brain".
To Do list
>Design behaviors => identify the behaviors and how to implement them. State machine.
Use millis() instead of delay(). To manage acceleration: with servos use slowSpeed(); with DC motors, use speed control.