The goal of this project is to connect to your drone’s Raspberry Pi and understand all the sensors and actuators. It’s a good idea to take the propellers off your drone for this project: with the propellers removed, the drone cannot move even if a program sends incorrect commands to the motors.
Calibrate the IMU by running `calibrateAcc.py`.
For the next part of this exercise, you will need an office-style chair that can spin freely. We will compare the accuracy of the IMU on the drone with your own internal IMU, provided by your vestibular system. You will need a partner for this part of the exercise.
Sit in the chair and close your eyes or blindfold yourself. Have a partner spin the chair 3 times and stop at a random orientation. While still blindfolded, guess the orientation. Then open your eyes and write down the true orientation (estimate as best you can or use a compass on your cell phone).
Fill out the following table for each person and each drone. For the human trials, randomize the order of the trials.
Trial | Estimated Orientation (deg) | Ground Truth (deg) | Error (deg) |
---|---|---|---|
Spin completely around | __ | 360 | __ |
Spin halfway | __ | 180 | __ |
Spin a quarter turn clockwise | __ | 90 | __ |
Spin a quarter turn counterclockwise | __ | 270 | __ |
Random | __ | __ | __ |
Random | __ | __ | __ |
Random | __ | __ | __ |
Random | __ | __ | __ |
Random | __ | __ | __ |
Random | __ | __ | __ |
Random | __ | __ | __ |
Average Error | __ | __ | __ |
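When computing the Error column, remember that orientation wraps around at 360°: an estimate of 350° against a ground truth of 0° is only 10° off. A small helper like the following (the function name and the sample trial values are illustrative) handles the wrap-around:

```python
def angular_error(estimate_deg, truth_deg):
    """Smallest absolute difference between two headings, in degrees."""
    diff = abs(estimate_deg - truth_deg) % 360
    return min(diff, 360 - diff)

# Illustrative trial data: (estimated orientation, ground truth)
trials = [(350, 360), (170, 180), (100, 90)]
errors = [angular_error(e, t) for e, t in trials]
average_error = sum(errors) / len(errors)
```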
At the end of your work you should be able to fill out the following table:
Sensor | Average Error (deg) |
---|---|
Student 1 | __ |
Student 2 | __ |
Drone 1 | __ |
Drone 2 | __ |
Which person is more accurate? Which drone? Hopefully the two drones have similar accuracy; otherwise we may need to recalibrate the IMU.
Your drone is equipped with a Sharp 2Y0A21YK0F Infrared Distance Sensor, which is used for estimating the distance from the drone to the ground plane. The sensor outputs an analog voltage (roughly 3.1 V at 0.08 meters down to 0.4 V at 0.8 meters), which is converted to digital by our Analog-to-Digital Converter (ADC) and read in by the Raspberry Pi as a 12-bit integer. The value corresponds to distance, but we will need to do some work to convert it to real-world units.
In `scripts/student_infrared_pub.py` you will implement the function `calc_distance`, which takes in this 12-bit voltage reading and returns a distance in meters. Note that distance is inversely proportional to voltage, \(x \propto 1/V\), so you will need to both rescale and offset your distance: \(x = m \cdot (1/V) + b\).
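As a rough sketch of the conversion (the ADC reference voltage, and the placeholder values of `m` and `b`, are assumptions; your calibrated values will differ):

```python
ADC_MAX = 4095   # full-scale reading of a 12-bit ADC
V_REF = 3.3      # assumed ADC reference voltage, in volts

# Placeholder calibration constants; yours come from the calibration step below.
m = 1.0
b = 0.0

def calc_distance(raw_adc):
    """Convert a 12-bit ADC reading to a distance in meters."""
    voltage = raw_adc * V_REF / ADC_MAX  # rescale the integer reading to volts
    return m * (1.0 / voltage) + b       # distance is proportional to 1/V
```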
Every sensor is a little different (we estimate by as much as 10%), so to maximize the performance of your drone you will need to calibrate your sensor. You can do this by simply measuring known distances (with a ruler or tape measure) to estimate your parameters. You will need to take at least two measurements between 0.1 m and 0.5 m from the sensor; more measurements will yield a better estimate of the parameters.
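One way to turn those measurements into `m` and `b` is an ordinary least-squares fit of distance against `1/V`; the voltages and distances below are made up for illustration:

```python
# Hypothetical calibration measurements: (voltage in volts, measured distance in meters)
samples = [(2.5, 0.2), (1.25, 0.4), (1.0, 0.5)]

# Fit x = m * (1/V) + b by least squares over the (1/V, x) pairs.
inv_v = [1.0 / v for v, _ in samples]
dist = [x for _, x in samples]
n = len(samples)
mean_u = sum(inv_v) / n
mean_x = sum(dist) / n
m = sum((u - mean_u) * (x - mean_x) for u, x in zip(inv_v, dist)) / \
    sum((u - mean_u) ** 2 for u in inv_v)
b = mean_x - m * mean_u
```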
In `scripts/student_infrared_pub.py` you will implement exponential moving average smoothing in the function `exp_smooth`. This reduces the high-frequency noise in your sensor readings, which keeps the error entering your altitude PID controller much smoother; however, smoothing this way comes at the cost of artificial latency.
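The smoothing itself is a one-line recurrence; in this sketch the smoothing factor `alpha` is a hypothetical choice (smaller values smooth more but add more latency):

```python
def exp_smooth(prev_smoothed, new_reading, alpha=0.2):
    """Exponential moving average: blend the newest reading into the running estimate."""
    return alpha * new_reading + (1 - alpha) * prev_smoothed

# A noise spike (0.90) in otherwise steady readings gets damped down.
readings = [0.30, 0.31, 0.90, 0.29, 0.30]
smoothed = readings[0]
for r in readings[1:]:
    smoothed = exp_smooth(smoothed, r)
```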
You are now collecting and processing sensor data! It’s time to hook everything up to ROS so we can use those values elsewhere in the system.
In the `main` of `scripts/student_infrared_pub.py` we provide the details of the steps to publish your smoothed sensor data. You will be publishing a ROS `Range` message, which is a standard message included with ROS, on the topic `/pidrone/infrared` with a `queue_size` of 1. You can check that your publishing is working by running `python student_infrared_pub.py` in tick 4, and then echoing the topic `/pidrone/infrared`. If everything is correct, your JavaScript interface should display the values as well.
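For reference, the publishing step might look roughly like the sketch below (it requires a running ROS master). Follow the actual steps given in the script's `main`; the node name, publish rate, and placeholder reading here are assumptions:

```python
import rospy
from sensor_msgs.msg import Range

rospy.init_node('infrared_pub')  # node name is illustrative
pub = rospy.Publisher('/pidrone/infrared', Range, queue_size=1)

rate = rospy.Rate(100)  # publish rate is an assumption
while not rospy.is_shutdown():
    smoothed_distance = 0.3  # placeholder: your calibrated, smoothed reading goes here

    msg = Range()
    msg.header.stamp = rospy.Time.now()
    msg.radiation_type = Range.INFRARED
    msg.min_range = 0.08  # sensor limits, in meters, from the datasheet
    msg.max_range = 0.80
    msg.range = smoothed_distance
    pub.publish(msg)
    rate.sleep()
```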