Autonomous vision controlled micro aerial vehicle

English title Autonomous vision controlled micro aerial vehicle
Applicant Pollefeys Marc
Number 135050
Funding scheme Project funding (Div. I-III)
Research institution Institut für Visual Computing ETH Zürich
Institution of higher education ETH Zurich - ETHZ
Main discipline Information Technology
Start/End 01.04.2011 - 31.03.2013
Approved amount 201'366.00

Keywords (4)

Mobile robotics; Micro aerial vehicle; Computer Vision; SLAM

Lay Summary (English)


Lead:

Flying robots are the logical next step in the field of mobile robotics.

The ability to fly allows robots to be used in entirely new ways that have not been possible with ground-based robots. Micro aerial vehicles are small-scale robotic platforms for indoor or outdoor use. In this project we will equip micro aerial vehicles with digital cameras and develop computer algorithms that enable the robot to see and to interact with its environment.

Background:

Micro aerial vehicles are small-scale robotic flying platforms. As remote-controlled drones they are already used for aerial photography and cartography. To enable a larger range of tasks, these remote-controlled vehicles need to be upgraded to fully autonomous flying robots.

The main challenge for autonomous mobile robots is to perceive and understand the world in which they operate from sensor readings. For navigation, a robot needs to figure out where it is safe to go, where obstacles are, and how to get from A to B. For most of its tasks, the robot also needs to identify and classify objects in its surroundings.

Ground-based robots are usually equipped with a range of sensors to achieve this. However, the weight and power restrictions of micro aerial vehicles do not permit the use of the same sensors and methods. Instead, digital cameras are a suitable option for lightweight but capable sensors. This creates the need to develop image processing algorithms that extract all the information necessary for the robot's operation from camera images. These algorithms need to be very efficient, as they have to run on a small embedded controller on board the micro aerial vehicle. This is a necessity for true autonomy, where we do not want a permanent connection to a ground station for remote control.

Goal:

The project's goal is to develop computer algorithms that give the robot the ability to see and understand its environment using digital cameras. By the end of the project, the micro aerial vehicle should be able to use its sensor readings for flight control, navigate safely through a previously unknown environment, detect and avoid obstacles, and operate safely among people.

Significance:

This project will develop the foundations for using flying robots in real-world applications. Applications range from pure aerial mapping tasks for cartography to more complicated tasks such as aiding in catastrophe management. Used simply as flying cameras, they can take images of a disaster zone from viewpoints unreachable from the ground and provide invaluable information for carrying out a rescue operation. They could also explore dangerous areas autonomously, for instance systematically searching for victims of accidents or natural disasters such as earthquakes.

Last update: 21.02.2013


Publications

Oth L., Furgale Paul, Kneip Laurent, Siegwart Roland (2013), Rolling Shutter Camera Calibration, in CVPR.
Honegger Dominik, Meier Lorenz, Tanskanen Petri, Pollefeys Marc (2013), An Open Source and Open Hardware Embedded Metric Optical Flow CMOS Camera for Indoor and Outdoor Applications, in ICRA.
Honegger Dominik, Greisen Pierre, Meier Lorenz, Tanskanen Petri, Pollefeys Marc (2012), Real-time velocity estimation based on optical flow and disparity matching, in 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2012).
Fraundorfer Friedrich, Heng Lionel, Honegger Dominik, Lee Gim Hee, Meier Lorenz, Tanskanen Petri, Pollefeys Marc (2012), Vision-based autonomous mapping and exploration using a quadrotor MAV, in 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2012).
Achtelik Markus, Lynen Simon, Weiss Stephan, Kneip Laurent, Chli Margarita, Siegwart Roland (2012), Visual-inertial SLAM for a small helicopter in large outdoor environments, in 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2012).
Meier Lorenz, Tanskanen Petri, Heng Lionel, Lee Gim Hee, Fraundorfer Friedrich, Pollefeys Marc (2012), PIXHAWK: A Micro Aerial Vehicle Design for Autonomous Flight using Onboard Computer Vision, in Autonomous Robots, 1-19.
Kneip Laurent, Siegwart Roland, Pollefeys Marc (2012), Finding the Exact Rotation between Two Images Independently of the Translation, in Computer Vision - ECCV 2012, 12th European Conference on Computer Vision, Florence.
Kazik Tim, Kneip Laurent, Nikolic Janosch, Pollefeys Marc, Siegwart Roland (2012), Real-Time 6D Stereo Visual Odometry with Non-Overlapping Fields of View, in IEEE Conference on Computer Vision and Pattern Recognition.
Kneip Laurent, Weiss Stephan, Siegwart Roland (2011), Deterministic Initialization of Metric State Estimation Filters for Loosely-Coupled Monocular Vision-Inertial Systems, in IEEE International Conference on Intelligent Robots and Systems.
Meier Lorenz, Tanskanen Petri, Fraundorfer Friedrich, Pollefeys Marc (2011), The PIXHAWK Open-Source Computer Vision Framework for MAVs, in Proceedings of the International Conference on Unmanned Aerial Vehicle in Geomatics (UAV-g), Zurich.
Kneip Laurent, Chli Margarita, Siegwart Roland (2011), Robust Real-Time Visual Odometry with a Single Camera and an IMU, in British Machine Vision Conference.
Weiss Stephan, Achtelik Markus, Lynen Simon, Achtelik Michael C., Kneip Laurent, Chli Margarita, Siegwart Roland, Monocular Vision for Long-Term MAV State-Estimation: A Compendium, in Journal of Field Robotics.

Scientific events

Active participation

European Conference on Computer Vision, 07.10.2012, Florence, Italy
IEEE Conference on Computer Vision and Pattern Recognition, 16.06.2012, Providence, Rhode Island, US
British Machine Vision Conference, 29.08.2011, Dundee, Great Britain


Knowledge transfer events

Active participation

Live flight demonstration at TEDx talk, 28.11.2012, Zürich


Communication with the public

New media (web, blogs, podcasts, news feeds etc.): Vision-Based Autonomous Mapping and Exploration Using a Quadrotor MAV, German-speaking Switzerland, 01.10.2012
Talks/events/exhibitions: Flight demonstration at Treffpunkt Science City, German-speaking Switzerland, 06.11.2011
Talks/events/exhibitions: Zurich Minds Flagship Event, German-speaking Switzerland, 13.12.2011

Associated projects

Number Title Start Funding scheme
125017 Autonomous vision-based micro-helicopter 01.04.2009 Project funding (Div. I-III)

Abstract

Autonomously flying robots are the logical successors of ground-based mobile robots. The ability to fly eliminates many restrictions of ground-based robots in terms of navigation and obstacle avoidance.

In this proposal, we consider a micro aerial vehicle (MAV), in particular a quadrotor helicopter, as the base platform for a flying robot. In contrast to a ground robot, the control of such a MAV is more complicated. In particular, there is no safe state for it in the absence of control inputs: a ground robot can easily be brought to a safe stop by switching off the control inputs, whereas a MAV always needs control inputs to keep flying and would crash without them. One of the challenges is therefore to create a highly reliable basic control for a MAV, such that it can be handled in a similar manner as a state-of-the-art ground robot. This has been partially achieved in the previous project (SNF 200021-125017) for controlled environments. There the focus was on developing basic flight operations, e.g. vision-controlled take-off, hovering and landing. Local visual SLAM restricted to a small operation area was also achieved, as well as basic obstacle detection with visual sensors.

The focus of this project is to extend the MAV's capabilities for operation in larger, uncontrolled and dynamic environments. In order to achieve this, we propose to investigate the following areas:

1. Reliable vision-based flight control using combined stereo vision and inertial readings
2. Dense 3D environment mapping and scalable SLAM for MAVs
3. Path planning and avoidance of dynamic and moving obstacles based on trajectory prediction

For task 1, we will further the coupling of IMU and stereo vision. We will investigate how the IMU can be used to filter outliers in the vision estimates or in the feature matching. We will also analyze the integration of inertial measurements into the vision processing queue, on the one hand to speed up the computation, and on the other hand to make the algorithms more robust in certain critical situations.

For task 2, we will investigate 3D mapping by multi-view depth map fusion to enable path planning and obstacle avoidance for MAVs. The map representation needs to allow obstacle avoidance in 3D, e.g. flying above and below obstacles, which is a significant difference to ground robots. We will also investigate special properties of MAVs to create a well-targeted SLAM algorithm that is lightweight and scalable. The attitude control of a MAV, together with accurate attitude measurements, restricts the motion to quasi-planar motion; this can be exploited in a visual SLAM algorithm.

In task 3, we will investigate path planning and advanced obstacle avoidance methods especially targeted at MAVs, using their increased movement capabilities compared to ground robots. Here we would also like to investigate strategies to avoid dynamic and moving objects. From camera input, we want to detect, track, and predict moving objects, and use this information to compute an optimal avoidance strategy.

Robotics, and this project especially, requires interdisciplinary research. The project needs expertise in MAV robotics research as well as computer vision research. These necessary competences can be provided by the collaboration of our two research groups, the Computer Vision and Geometry Lab (CVG) and the Autonomous Systems Lab (ASL).
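The IMU-based outlier filtering mentioned under task 1 can be sketched as follows. This is only an illustrative sketch, not the project's actual implementation: the infinite-homography approximation (treating points as distant, so that image motion is dominated by camera rotation), the function names, and the pixel threshold are all our assumptions.

```python
import numpy as np

def rotation_predicted_points(K, R, pts):
    """Predict where pixels move under a pure camera rotation R.

    Uses the infinite-homography approximation x' ~ K R K^-1 x,
    valid for distant points where translation-induced parallax
    is negligible. K is the 3x3 intrinsic matrix, pts is Nx2.
    """
    H = K @ R @ np.linalg.inv(K)
    pts_h = np.column_stack([pts, np.ones(len(pts))])  # homogeneous pixels
    warped = pts_h @ H.T
    return warped[:, :2] / warped[:, 2:3]

def gate_matches_with_imu(K, R_imu, pts_prev, pts_curr, thresh_px=30.0):
    """Reject feature matches whose motion contradicts the IMU rotation prior.

    Returns a boolean inlier mask; surviving matches would then be
    passed on to the actual pose estimation.
    """
    predicted = rotation_predicted_points(K, R_imu, pts_prev)
    residual = np.linalg.norm(pts_curr - predicted, axis=1)
    return residual < thresh_px
```

With `R_imu` equal to the identity (no rotation between frames), a match that barely moved passes the gate, while a large jump is flagged as a likely mismatch before it can corrupt the vision estimate.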
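The trajectory prediction for task 3 can likewise be illustrated with a minimal constant-velocity model. Again this is a sketch under our own assumptions: the project does not specify its avoidance strategy, and the function name, safety radius, and prediction horizon are hypothetical.

```python
import numpy as np

def first_predicted_conflict(mav_pos, mav_vel, obs_pos, obs_vel,
                             safety_radius=1.0, horizon=5.0, dt=0.1):
    """Roll the MAV and a tracked obstacle forward under constant-velocity
    models and return the first time their predicted separation drops
    below the safety radius, or None if the motion stays clear."""
    mav_pos, mav_vel = np.asarray(mav_pos), np.asarray(mav_vel)
    obs_pos, obs_vel = np.asarray(obs_pos), np.asarray(obs_vel)
    for t in np.arange(0.0, horizon, dt):
        separation = np.linalg.norm((mav_pos + t * mav_vel) -
                                    (obs_pos + t * obs_vel))
        if separation < safety_radius:
            return t
    return None
```

A planner would run such a check for every tracked moving object and, on a predicted conflict, replan; e.g. by flying above the obstacle, a maneuver the 3D map representation from task 2 is meant to support.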