Autonomous Drone Navigation
During the pandemic, I was at home without access to any physical robots. It was the perfect time to put into practice the Robotics concepts I had learnt over the summer. I got quite intrigued by autonomous drones. My greatest inspiration came from this video on Deep Drone Racing by the Robotics and Perception Group at UZH and a bunch of really cool work by the Autonomous Robots Lab at NTNU. Taking inspiration from these, I wanted to try working on indoor navigation of autonomous drones.
Simulator
I was in search of a simulator to work on drones and found rotors_simulator from ETHZ-ASL. Unfortunately, it was hard to install on Ubuntu 18.04 with Gazebo 9, so I installed CrazyS, an extension of RotorS. This repository has proper instructions for installation on Ubuntu 18.04 with Gazebo 9.
Benefits of using a Simulator
The obvious benefit of using a simulator is that no actual drone is needed. However, some other benefits that I felt were important for my project were:
- Odometry : The ground truth plugin from Gazebo made it really easy to obtain accurate odometry. This allowed me to concentrate on the specific parts of the project that I was interested in.
- VI Sensor : A Visual Inertial Sensor was simulated. This allowed me to access the depth image from the sensor and use it for perception.
- Controller : The launch file launched lee_position_controller_node, which already had a controller implemented. This made it really easy for me to concentrate on planning and perception only. A minimal sketch of commanding this controller is given after this list.
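For reference, here is a minimal sketch (not from the original project code) of sending a single hover waypoint to the position controller. It assumes the RotorS/mav_msgs conventions, namely a trajectory_msgs/MultiDOFJointTrajectory message on the command/trajectory topic inside the drone's namespace; the node name and waypoint values are illustrative.

#include <Eigen/Core>
#include <mav_msgs/conversions.h>
#include <mav_msgs/default_topics.h>
#include <ros/ros.h>
#include <trajectory_msgs/MultiDOFJointTrajectory.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "waypoint_commander");
  ros::NodeHandle nh;

  // RotorS controllers listen for trajectories on command/trajectory
  ros::Publisher pub = nh.advertise<trajectory_msgs::MultiDOFJointTrajectory>(
      mav_msgs::default_topics::COMMAND_TRAJECTORY, 10);
  ros::Duration(1.0).sleep();  // give subscribers time to connect

  // Ask the drone to hover at (0, 0, 1.5) m with zero yaw
  Eigen::Vector3d position(0.0, 0.0, 1.5);
  double yaw = 0.0;
  trajectory_msgs::MultiDOFJointTrajectory msg;
  mav_msgs::msgMultiDofJointTrajectoryFromPositionYaw(position, yaw, &msg);
  msg.header.stamp = ros::Time::now();
  pub.publish(msg);

  ros::spinOnce();
  return 0;
}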
Launching the drone simulation
- Launch firefly hovering (without VI Sensor):
roslaunch rotors_gazebo mav_hovering_example.launch
- Launch firefly hovering (with VI Sensor):
roslaunch rotors_gazebo mav_hovering_example_with_vi_sensor.launch
This launches the drone in an empty world. To change the world, you can pass the world as an argument to the launch file or modify the launch file directly.
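For example, assuming the launch file exposes the usual world_name argument (check the launch file if the argument name differs in your version):

roslaunch rotors_gazebo mav_hovering_example.launch world_name:=basic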
Perception
The VI Sensor on board publishes Point Cloud data as sensor_msgs/PointCloud2 messages. To perform collision checking, I tried two methods:
Point Cloud Library (PCL)
I used the Point Cloud Library to filter, transform, and search the Point Cloud.
Filtering
The Point Cloud data that was published was very dense and noisy, so I applied a Pass Through Filter and a Voxel Grid Filter to limit the range and reduce the density of the incoming Point Cloud.
#include <pcl/filters/passthrough.h>
#include <pcl/filters/voxel_grid.h>

// Input cloud converted from the incoming sensor_msgs/PointCloud2
pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
pcl::PointCloud<pcl::PointXYZ>::Ptr cloud_pt(new pcl::PointCloud<pcl::PointXYZ>);
pcl::PointCloud<pcl::PointXYZ>::Ptr cloud_voxel(new pcl::PointCloud<pcl::PointXYZ>);

// Pass Through Filter: keep only points within 0-3 m along the sensor's z axis
pcl::PassThrough<pcl::PointXYZ> pass;
pass.setInputCloud(cloud);
pass.setFilterFieldName("z");
pass.setFilterLimits(0.0, 3.0);
pass.filter(*cloud_pt);

// Voxel Grid Filter: downsample the cloud to 5 cm voxels
pcl::VoxelGrid<pcl::PointXYZ> sor;
sor.setInputCloud(cloud_pt);
sor.setLeafSize(0.05f, 0.05f, 0.05f);
sor.filter(*cloud_voxel);
Transformation
The incoming Point Cloud was in the vi_sensor frame. To perform collision checking in the world frame, the Point Cloud had to be transformed.
#include <pcl_ros/transforms.h>
#include <tf/transform_datatypes.h>
#include <tf2_ros/transform_listener.h>

// tfBuffer is a tf2_ros::Buffer fed by a tf2_ros::TransformListener
pcl::PointCloud<pcl::PointXYZ> cloud_out;
try
{
  // Look up the latest transform from the sensor frame to the world frame
  geometry_msgs::TransformStamped transform;
  transform = tfBuffer.lookupTransform("world", msg->header.frame_id, ros::Time(0), ros::Duration(3.0));
  tf::Transform tftrans;
  tf::transformMsgToTF(transform.transform, tftrans);
  // Transform the downsampled cloud into the world frame
  pcl_ros::transformPointCloud(*cloud_voxel, cloud_out, tftrans);
}
catch (tf2::TransformException &ex)
{
  ROS_WARN("%s", ex.what());
}
Search
To check whether any point in space is in collision given the Point Cloud data, I used PCL's built-in Octree representation to check whether the queried point has any cloud points within a given radius.
#include <pcl/octree/octree_search.h>

// Build an octree over the world-frame cloud for fast radius queries
float resolution = 0.05f;  // octree resolution in metres (example value)
pcl::octree::OctreePointCloudSearch<pcl::PointXYZ> octree(resolution);
octree.setInputCloud(cloud);
octree.addPointsFromInputCloud();

// searchPoint and radius come from the collision-check query; any neighbour
// within radius of the queried point counts as a collision
std::vector<int> idx;
std::vector<float> distances;
if (octree.radiusSearch(searchPoint, radius, idx, distances) > 0)
{
  resp.collision = true;   // Collision
}
else
{
  resp.collision = false;  // No collision
}
Voxblox
Voxblox is a voxel-based volumetric mapping library. It builds maps using Truncated Signed Distance Fields (TSDFs) and Euclidean Signed Distance Fields (ESDFs). It has tight ROS integration and can run on the CPU only. I used Voxblox to perform collision checks. The section on using Voxblox for planning is very useful.
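A minimal sketch of how such a collision check can look against Voxblox's ESDF server; the robot_radius value and the policy of treating unobserved space as occupied are my assumptions, not something fixed by the library.

#include <Eigen/Core>
#include <voxblox_ros/esdf_server.h>

// Returns true if a point is closer to an obstacle than robot_radius,
// or if the point has not been observed yet (conservative assumption).
bool isInCollision(voxblox::EsdfServer& esdf_server,
                   const Eigen::Vector3d& position,
                   double robot_radius)
{
  double distance = 0.0;
  if (!esdf_server.getEsdfMapPtr()->getDistanceAtPosition(position, &distance))
  {
    return true;  // unknown space
  }
  return distance < robot_radius;
}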
Planning
The main aim of the project was to write a 3D planner for indoor environments. I decided to go with a sampling-based algorithm followed by post processing using mav_trajectory_generation from ASL-ETHZ.
Sampling based Planner
- I used RRT to generate samples in 3D within a range of +/- 2 m of the drone's current altitude, which increased the probability of finding a solution. The algorithm was biased to sample the goal every x out of 10 samples (a sketch of this biased sampling is given after this list).
- Since RRT is not optimal and is known to produce paths that a drone cannot track without further processing, a LOS optimiser was used before the path was optimised for the drone. The post-processing pipeline is given below.
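A rough sketch of the goal-biased sampling step; the function name, workspace bounds, and parameter names are illustrative, not the actual planner code.

#include <Eigen/Core>
#include <random>

// Draw one RRT sample: roughly goal_bias_x out of every 10 samples return the
// goal itself, the rest are uniform, with z limited to +/- 2 m of the current altitude.
Eigen::Vector3d sampleState(const Eigen::Vector3d& goal, double current_z,
                            int goal_bias_x, std::mt19937& rng)
{
  std::uniform_int_distribution<int> bias(0, 9);
  if (bias(rng) < goal_bias_x)
  {
    return goal;  // goal-biased sample
  }
  std::uniform_real_distribution<double> xy(-10.0, 10.0);  // workspace bounds (assumed)
  std::uniform_real_distribution<double> z(current_z - 2.0, current_z + 2.0);
  return Eigen::Vector3d(xy(rng), xy(rng), z(rng));
}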
Post Processing
As mentioned, the path generated by RRT is not continuous, so post processing of the path is necessary before a UAV can use it. The path was first processed by a Line of Sight optimiser and then sent to mav_trajectory_generation.
Line of Sight Optimiser
The Line of Sight optimiser aims to construct straight, collision-free segments between the intermediate points of the path.
def los_optimise(path):
    # Greedily connect each waypoint to the farthest waypoint that can be
    # reached with a collision-free straight line.
    optimised_path = [path[0]]
    current_index = 0
    while current_index < len(path) - 1:
        ind_updated = False
        for ind2 in range(len(path) - 1, current_index, -1):
            # collision() queries the collision check described in Perception
            if not collision(path[current_index].pose.position, path[ind2].pose.position):
                optimised_path.append(path[ind2])
                current_index = ind2
                ind_updated = True
                break
        if not ind_updated:
            # No collision-free shortcut found from this waypoint
            return optimised_path
    return optimised_path
MAV Trajectory Generation
- This repository contains polynomial trajectory generation and optimisation methods. It has good documentation to get started.
- I used non-linear optimisation to generate a smooth trajectory for the UAV:
#include <mav_trajectory_generation/polynomial_optimization_nonlinear.h>

const int N = 10;         // polynomial order
const int dimension = 3;  // optimise in x, y and z
const int derv = mav_trajectory_generation::derivative_order::SNAP;  // derivative to optimise
mav_trajectory_generation::NonlinearOptimizationParameters params;
mav_trajectory_generation::PolynomialOptimizationNonLinear<N> opt(dimension, params);
opt.setupFromVertices(vertices, segment_times, derv);
opt.addMaximumMagnitudeConstraint(mav_trajectory_generation::derivative_order::VELOCITY, v_max);
opt.addMaximumMagnitudeConstraint(mav_trajectory_generation::derivative_order::ACCELERATION, a_max);
opt.optimize();
mav_trajectory_generation::Trajectory mav_trajectory;
opt.getTrajectory(&mav_trajectory);
For more information, refer to the source code.
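To actually fly the result, the optimised trajectory has to be sampled over time. A minimal sketch of evaluating position and velocity setpoints from the trajectory obtained above; the sample time here is purely illustrative.

// Query the optimised trajectory at time t for position and velocity setpoints
double t = 2.0;  // seconds from the start of the trajectory (illustrative)
Eigen::VectorXd position = mav_trajectory.evaluate(
    t, mav_trajectory_generation::derivative_order::POSITION);
Eigen::VectorXd velocity = mav_trajectory.evaluate(
    t, mav_trajectory_generation::derivative_order::VELOCITY);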
Conclusion
This concludes my project on Autonomous Navigation using Gazebo. A lot of things (control, state estimation, computational requirements, noise from real hardware) were assumed away in this project. In the future, I aim to learn more about the concepts behind these algorithms and implement the project on real hardware.