Autonomous Drone Solution

The Red Balloon Finder project enables a Raspberry Pi or Intel Aero board to control an ArduCopter-based quadcopter to find and pop a 1 m red balloon. The Python code that runs on the board can be found in the Autonomous section of the Git repository.

Object detection theory

In vision-based systems there are many hardware/software configurations tailored to specific applications: Visual Servoing, Visual Odometry, and Visual Simultaneous Localization and Mapping (SLAM). This project uses the first type, Visual Servoing, which is designed for:

  • Takeoff and landing

  • Obstacle avoidance/tracking

  • Position and attitude control

  • Stabilization over a target

The main idea of Visual Servoing is to regulate the pose {Cξ,T} (position and orientation) of a robotic platform relative to a target, using a set of visual features {f} extracted from the sensors.
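A minimal sketch of that regulation idea: a proportional law that maps the pixel error between the tracked feature and the image centre to a velocity request. The function name and gain are illustrative, not from the project.

```python
# Hypothetical proportional visual-servoing law: steer so the detected
# feature moves toward the image centre. Names and gain are illustrative.

def servo_velocity(feature_px, image_size, gain=0.005):
    """Map a pixel error to a lateral/vertical velocity request (m/s).

    feature_px : (x, y) pixel position of the tracked feature
    image_size : (width, height) of the frame
    gain       : proportional gain converting pixels to m/s
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    ex = feature_px[0] - cx          # positive: feature right of centre
    ey = feature_px[1] - cy          # positive: feature below centre
    # Velocity request in the body frame: right/left and down/up.
    return gain * ex, gain * ey

# A balloon seen at (400, 120) in a 640x480 frame yields a rightward
# (positive vx) and upward (negative vz) velocity request.
vx, vz = servo_velocity((400, 120), (640, 480))
```

A real servoing loop would add gain tuning, saturation of the velocity command, and a forward-velocity term, but the error-to-velocity mapping above is the core of the technique.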

Communication

The Pixhawk retains all the regular ArduCopter functionality, but in addition it accepts commands from an external navigation computer (the companion board) when it is in the Guided flight mode, or when it is in AUTO mode running the newly created MAVLink NAV_GUIDED mission command. The NAV_GUIDED command takes a few special arguments: a timeout (in seconds), a minimum and maximum altitude, and a maximum distance. If the companion board takes too long to find the balloon, or issues erratic commands that would cause the copter to fly away, the Pixhawk retakes control.

While the board is in control, it first requests that the Pixhawk rotate the copter slowly while it pulls images from the integrated camera and uses OpenCV to search for blobs of red in the images. During the search it keeps track of the largest red blob it has seen, so after the copter has rotated 360 degrees it sends commands to point the vehicle in the direction of that blob, and then sends 3D velocity requests five times per second to guide the copter towards the top of the balloon.
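The project does this search with OpenCV; the dependency-free sketch below only illustrates the underlying idea (HSV thresholding plus largest-connected-blob selection) on a tiny synthetic frame, with illustrative names and thresholds.

```python
# Dependency-free sketch of the red-blob search: HSV-threshold each
# pixel, then pick the biggest 4-connected red region. The real code
# uses OpenCV; thresholds here are rough, illustrative values.
import colorsys

def is_red(rgb):
    """True if the pixel's hue/saturation/value fall in a rough red band."""
    h, s, v = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))
    return (h < 0.05 or h > 0.95) and s > 0.5 and v > 0.3

def largest_red_blob(frame):
    """Return (size, centre) of the biggest 4-connected red region."""
    h, w = len(frame), len(frame[0])
    seen, best = set(), (0, None)
    for y in range(h):
        for x in range(w):
            if (x, y) in seen or not is_red(frame[y][x]):
                continue
            stack, blob = [(x, y)], []      # flood-fill one region
            seen.add((x, y))
            while stack:
                px, py = stack.pop()
                blob.append((px, py))
                for nx, ny in ((px+1, py), (px-1, py), (px, py+1), (px, py-1)):
                    if 0 <= nx < w and 0 <= ny < h \
                            and (nx, ny) not in seen and is_red(frame[ny][nx]):
                        seen.add((nx, ny))
                        stack.append((nx, ny))
            if len(blob) > best[0]:
                cx = sum(p[0] for p in blob) / len(blob)
                cy = sum(p[1] for p in blob) / len(blob)
                best = (len(blob), (cx, cy))
    return best

# 4x4 frame: blue background with a 2x2 red patch in one corner.
RED, BLUE = (220, 30, 30), (30, 30, 220)
frame = [[RED, RED, BLUE, BLUE],
         [RED, RED, BLUE, BLUE],
         [BLUE, BLUE, BLUE, BLUE],
         [BLUE, BLUE, BLUE, BLUE]]
size, centre = largest_red_blob(frame)   # → 4, (0.5, 0.5)
```

The centre of the winning blob is what gets fed back to the controller as the direction to point and fly.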

MultiProcessing

The project uses Python's multiprocessing module to run the image capture and image processing in separate processes, so they execute in parallel.
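The capture/processing split can be sketched with the standard-library multiprocessing module; the function names below are illustrative stand-ins, not the project's own.

```python
# Minimal sketch of the capture/analysis split: one process produces
# "frames", another consumes and processes them, linked by a Queue.
from multiprocessing import Process, Queue

def capture(frames, queue):
    """Stand-in for the camera process: push frames into the queue."""
    for frame in frames:
        queue.put(frame)
    queue.put(None)                      # sentinel: no more frames

def analyse(queue, results):
    """Stand-in for the OpenCV process: consume and process frames."""
    while True:
        frame = queue.get()
        if frame is None:
            break
        results.put(frame * 2)           # placeholder for blob detection

def run_pipeline(frames):
    q, out = Queue(), Queue()
    producer = Process(target=capture, args=(frames, q))
    consumer = Process(target=analyse, args=(q, out))
    producer.start()
    consumer.start()
    producer.join()
    consumer.join()
    return [out.get() for _ in range(len(frames))]

if __name__ == "__main__":
    run_pipeline([1, 2, 3])              # frames handled in a second process
```

Because the two stages run in separate OS processes, a slow OpenCV pass never blocks the camera from grabbing the next frame.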

Start Project

Simulation

To start the project in the simulator:

$ cd Xiaomin/Scripts/Autonomous/simulator
$ ./start.sh

Drone

$ workon cv
$ cd Xiaomin/Scripts/Autonomous/scripts/
$ python balloon_video.py

Code Structure

  • attitude_history.py: provides delayed attitude and location information.

  • balloon_config.py: handles configuration for the balloon_finder project.

  • balloon_strategy.py: the main script, which initializes all the classes.

  • balloon_utils.py: utility functions for the balloon_finder project.

  • balloon_video.py: initializes the camera and the output video.

  • colour_finder.py: helps find the minimum and maximum hue, saturation, and brightness of a desired object.

  • find_balloon.py: finds the pixel location of red balloons.

  • linux_run_strategy.sh: initializes the drone on ArduCopter.

  • position_vector.py: holds a 3-axis position offset from home, in meters.
