A new approach to UAV control, combining computer vision and control techniques, is introduced. Objects are detected using a UAV camera. The distance between the object's center of mass, in the image plane, and the image center is measured. This distance is used as an error signal to control the UAV velocity. The feedback loop is implemented with a proportional-integral-derivative (PID) controller. The computer vision techniques used for object detection are inspired by the attention mechanisms of human vision, exploiting a combination of bottom-up and top-down saliency cues to speed up recognition. The saliency mechanisms are implemented with a combination of the Harris interest point detector and an object detector cascade, and they enable recognition at video frame rates. Although computer vision has previously been used in the UAV literature, the emphasis has been on motion computations (optical flow) for navigation and obstacle avoidance. The techniques introduced here enable the UAV to recognize specific objects and react to them. In particular, the UAV can track a person wearing a distinctive object, such as a piece of clothing or a patch. Unlike current GPS-based methods, the behavior of the UAV can vary according to the object being tracked; UAV programming thus becomes as simple as changing your clothes (or pattern). The UAV is also much more precise in its interaction with the user and works both indoors and outdoors. Many applications could follow from this technology, including a UAV that behaves like an older brother watching a child from above, a personal cameraman that follows an athlete on the field, an additional pair of eyes that allows a bike rider to “look around the corner,” or a sitter that watches a pet as it roams around the park.
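The control loop described above can be sketched in a few lines. This is a minimal illustration, not the project's actual implementation: the gains, the 30 fps frame rate, and the helper names (`PID`, `pixel_error`) are assumptions introduced here. The idea is that the displacement of the detected object's center of mass from the image center, per axis, is the error fed to a PID controller whose output is a velocity command.

```python
class PID:
    """Discrete PID controller on a scalar error signal (illustrative sketch)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, error):
        # Accumulate the integral term and estimate the derivative
        # from the previous frame's error.
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def pixel_error(object_center, image_size):
    """Displacement of the object's center of mass from the image center, in pixels."""
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    return object_center[0] - cx, object_center[1] - cy


# Two independent loops map horizontal and vertical pixel error to
# lateral and vertical velocity commands; gains are placeholder values.
pid_x = PID(kp=0.005, ki=0.0001, kd=0.001, dt=1 / 30.0)  # assumed 30 fps video
pid_y = PID(kp=0.005, ki=0.0001, kd=0.001, dt=1 / 30.0)

ex, ey = pixel_error(object_center=(400, 300), image_size=(640, 480))
vx_cmd = pid_x.update(ex)  # positive command -> move to re-center the object
vy_cmd = pid_y.update(ey)
```

When the object drifts right of center, the horizontal error is positive and the commanded lateral velocity pushes the UAV to re-center it; the derivative term damps overshoot as the error shrinks between frames.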
European Organization for Nuclear Research (CERN): Third Award of $500
Fourth Award of $500