AR.Drone camera-based tracking. It currently has five modes: green laser pointer tracking with the front camera, green laser pointer tracking with the bottom camera, an LED strip follower using the bottom camera (like a line-follower robot, but with a drone), a path follower with user-defined paths, and keyboard control of the drone.
This will give you an overview of the project.
The foundation of the project is ROS (Robot Operating System). It manages the communication between the algorithms running on the PC and the drone; you can learn more about it on the ROS wiki page. The ROS environment is organized into packages, which you can think of as libraries that make one's life easier. This project uses several of them; an interesting one is the ardrone_autonomy package, which wraps the AR.Drone SDK and makes it easy to control the drone from software.
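As a small illustration of how ardrone_autonomy is driven from Python, the drone is commanded by publishing on its ROS topics. This is only a sketch, not code from this project:

#!/usr/bin/env python
# Sketch: basic ardrone_autonomy commands via ROS topics.
import rospy
from std_msgs.msg import Empty

rospy.init_node('ardrone_demo')
takeoff_pub = rospy.Publisher('/ardrone/takeoff', Empty, queue_size=1)
land_pub = rospy.Publisher('/ardrone/land', Empty, queue_size=1)

rospy.sleep(1.0)               # give the publishers time to connect
takeoff_pub.publish(Empty())   # take off
rospy.sleep(5.0)               # hover for a while
land_pub.publish(Empty())      # land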
Since this is a camera-based tracking project it needs to do image processing, which is quite easy with OpenCV. The image-processing algorithm currently in use is based on color tracking: it creates a mask from an RGB threshold and applies it to the original image, so that only the green dot remains. As a final step it recognizes the green dot as a circle and sends its x, y position to the control algorithm. (All the image-processing algorithms are in learning.py.)
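A minimal sketch of this kind of color tracking (illustrative only; the threshold values below are placeholders, and the real implementation is in learning.py):

import cv2
import numpy as np

def find_green_dot(frame):
    # Mask everything outside a green range (placeholder BGR thresholds).
    mask = cv2.inRange(frame, np.array([0, 150, 0]), np.array([100, 255, 100]))
    # Applying the mask to the original image leaves only the green dot.
    green_only = cv2.bitwise_and(frame, frame, mask=mask)
    # Fit a circle around the largest green contour and return its center.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    (x, y), radius = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    return (int(x), int(y))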
For the tracking it uses a simple PID controller. The constants are tuned for the AR.Drone we used, so you may need to change them (in controller.py).
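For reference, a bare-bones PID controller of this kind might look like the sketch below (the real gains and implementation are in controller.py):

class PID(object):
    # output = Kp*error + Ki*integral(error) + Kd*d(error)/dt
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative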
The front camera mode uses PID controllers on roll, altitude, and pitch. The roll and altitude PIDs are driven by the green pointer's position; the pitch PID is driven by the drone's own position and tries to keep the drone in the same place. (You can change it in controller.py.)
The bottom camera mode uses PID controllers on roll and pitch, both driven by the green pointer's position. (You can change it in controller.py.)
The line follower mode uses PID controllers on roll and yaw. The roll PID tries to center the drone over the LED strip, and the yaw PID tries to keep the drone parallel to the strip. (You can change it in controller.py.)
The waypoint mode uses PID controllers on roll, pitch, and altitude, all driven by the user-defined positions the drone needs to reach. (You can change it in waypoint.py.)
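In all of these modes the PID outputs end up as velocity commands for the drone. With ardrone_autonomy that is done by publishing a geometry_msgs/Twist on cmd_vel; a sketch (the function name is illustrative, not the project's):

import rospy
from geometry_msgs.msg import Twist

cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)

def send_command(pitch, roll, yaw_rate, climb_rate):
    # ardrone_autonomy convention: linear.x = pitch (forward),
    # linear.y = roll (left), linear.z = climb, angular.z = yaw (CCW).
    cmd = Twist()
    cmd.linear.x = pitch
    cmd.linear.y = roll
    cmd.linear.z = climb_rate
    cmd.angular.z = yaw_rate
    cmd_pub.publish(cmd)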
This section lists what you need to run the project and how to install it. The project was developed and tested on Ubuntu 16.04 LTS; if your operating system is different, some workarounds may be needed.
To install ROS Kinetic, just follow the instructions on the ROS installation page. Then install the ROS packages the project depends on:
sudo apt-get install ros-kinetic-ardrone-autonomy
sudo apt-get install ros-kinetic-joy ros-kinetic-joystick-drivers
sudo apt-get install ros-kinetic-cv-bridge
If you are using Ubuntu 16.04, Python comes pre-installed.
You also need the Qt bindings for Python (PySide):
sudo apt-get install python-pyside
To install OpenCV, just follow the instructions on this page.
You also need the Dropbox Python SDK:
sudo pip install dropbox
To install this package run these commands in the terminal.
Navigate to your catkin workspace. If you do not have a catkin workspace, follow this tutorial.
cd ~/catkin_ws/src
Clone this repository
git clone https://github.com/nishiratavo/Drone_vision.git
Once the clone has completed, build the workspace:
cd ~/catkin_ws
catkin_make
To upload data to Dropbox you need to create an app in your Dropbox account. Then click the "generate access token" button and paste the token into gui.py at line 146, in place of <auth_token>.
First turn on the AR.Drone and connect to it via Wi-Fi. Then open the terminal and navigate to your Drone_vision source folder (probably the path below if you are using a catkin workspace):
cd ~/catkin_ws/src/Drone_vision/src/
Initialize the program with the bash script.
./drone.sh
Wait for everything to open and then select the mode.
Wait for the camera image to open (the path follower doesn't use the camera) and then send the takeoff command.
To land the drone send the land command.
In case of emergency send the emergency command.
Wait for the drone to stabilise, then use the green pointer and the drone will follow.
Wait for the drone to stabilise, then wait a few seconds for it to reach its final altitude. After that, use the green pointer and the drone will follow.
Wait for the drone to stabilise and it will begin to follow the LED strip.
Wait for the drone to stabilise and send the path it should follow.
The path coordinates are in meters; each vector says how far the drone should fly in each direction.
Example :
1,0,0 => one meter to the left
0,1,0 => one meter forward
0,0,1 => one meter up
To send more than one vector, separate the vectors with a single space.
To fly in a square beginning at the bottom-left corner, send:
0,1,0 -1,0,0 0,-1,0 1,0,0
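For illustration, parsing such a path string is straightforward (a sketch; the project's own handling lives in waypoint.py):

def parse_path(path_string):
    # Turn '0,1,0 -1,0,0' into [(0.0, 1.0, 0.0), (-1.0, 0.0, 0.0)].
    waypoints = []
    for vector in path_string.split():
        x, y, z = (float(v) for v in vector.split(','))
        waypoints.append((x, y, z))
    return waypoints

print(parse_path('0,1,0 -1,0,0 0,-1,0 1,0,0'))  # the square path above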
Use the keyboard to control the drone.
W -> forward
A -> left
S -> backward
D -> right
Q -> counter clockwise
E -> clockwise
R -> up
F -> down
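A sketch of how such a key map can translate into velocity commands (hypothetical; the project binds its keys inside the Qt GUI):

# Each key maps to (pitch, roll, climb, yaw) velocity components.
KEY_MAP = {
    'w': ( 1,  0,  0,  0),  # forward
    's': (-1,  0,  0,  0),  # backward
    'a': ( 0,  1,  0,  0),  # left
    'd': ( 0, -1,  0,  0),  # right
    'q': ( 0,  0,  0,  1),  # counter-clockwise
    'e': ( 0,  0,  0, -1),  # clockwise
    'r': ( 0,  0,  1,  0),  # up
    'f': ( 0,  0, -1,  0),  # down
}

def on_key(key):
    pitch, roll, climb, yaw = KEY_MAP.get(key, (0, 0, 0, 0))
    send_command(pitch, roll, yaw, climb)  # send_command from the sketch above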
Write the name of the file in the textbox. Click the save data button to start saving data; click it again to stop. It will generate a data.txt file in the package directory.
CAUTION
If you press the save data button again, it will overwrite the previous data.
To upload the data to your Dropbox account, just click the upload button.
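Behind that button the upload goes through the Dropbox Python SDK installed earlier; a minimal sketch of such an upload, assuming the v2 SDK (the actual call is in gui.py):

import dropbox

def upload_file(token, local_path, remote_path):
    dbx = dropbox.Dropbox(token)  # token from "generate access token"
    with open(local_path, 'rb') as f:
        dbx.files_upload(f.read(), remote_path,
                         mode=dropbox.files.WriteMode.overwrite)

upload_file('<auth_token>', 'data.txt', '/data.txt')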