Project


UFO: Semi-Autonomous Aerial Vehicles for Augmented Reality, Human-Computer Interaction and Remote Collaboration

English title UFO: Semi-Autonomous Aerial Vehicles for Augmented Reality, Human-Computer Interaction and Remote Collaboration
Applicant Hilliges Otmar
Number 153644
Funding scheme Project funding
Research institution Professur für Informatik ETH Zürich
Institution of higher education ETH Zurich - ETHZ
Main discipline Information Technology
Start/End 01.05.2015 - 30.04.2019
Approved amount 479'957.00

Keywords (8)

Gestural Interfaces; Human Computer Interaction; Remote Collaboration; Augmented Reality; Human Robot Interaction; Natural User Interfaces; Robotics; Micro Aerial Vehicles (MAVs)

Lay Summary (German)

Lead
Research on compact flying robots, or micro aerial vehicles (MAVs), has made great progress in recent years. It is therefore foreseeable that they will play an important role in our everyday lives within a few years. The question of how humans and flying robots interact and collaborate in close quarters is thus becoming increasingly important. In this project, novel human-machine interfaces are developed and studied that treat MAVs as flying interaction platforms. In particular, the goal is to investigate how flying robots can be used to enable interaction with virtual 3D content in mobile scenarios and in unknown, unstructured environments.
Lay summary

Content and objectives of the research project:

The overarching goal of the project is to develop a new paradigm for human-machine interfaces that uses semi-autonomous flying robots to augment the user's perception with virtual information. The flying robot will act both as an input device and as an output device, so the user need not wear any instrumentation on the body or hold any in the hands. Specifically, we will develop methods 1) for flight stabilization relative to the user, 2) for projecting virtual information, 3) for recognizing natural gestures and interactions, and 4) for interacting with the projected content. In particular, we will develop novel algorithms for recognizing natural interactions in dynamic, uncontrollable environments. These natural user inputs are then interpreted by the flying robot and used for navigation, planning, and the computation of flight dynamics.

Scientific and social context of the research project:

Our work will yield new methods and paradigms for user interaction and human-machine communication. It will also provide initial insights into how a future in which humans and intelligent robots collaborate in close quarters should be shaped, and can thus fundamentally influence what our everyday lives will look like.


Last update: 24.06.2014

Lay Summary (English)

Lead
Research into compact flying robots, or micro aerial vehicles (MAVs), has made tremendous progress in recent years. It is therefore foreseeable that they will play an important role in our daily lives within a few years. The question of how people and flying robots, inhabiting the same space, interact and collaborate is therefore becoming increasingly important. In this project, new human-machine interfaces will be developed and explored that treat MAVs as flying interaction platforms. In particular, the aim is to explore how flying robots can be used to enable interaction with virtual 3D content in mobile scenarios and in unknown, unstructured environments.
Lay summary

Content and objectives of the research project

The overall objective of the project is to develop a new paradigm for human-machine interfaces that leverages semi-autonomous flying robots to expand the user's perception of the real world with virtual information. The flying robot will act both as an input device and as an output device; the user need not carry any instrumentation on the body or hold anything in their hands. In particular, we will develop methods 1) to stabilize flight relative to the user, 2) to project virtual information, 3) to detect natural gestures and interactions, and 4) to interact with the projected content. Specifically, we will develop new algorithms for the detection of natural interactions in dynamic, uncontrollable environments. These natural user inputs are then interpreted by the aerial robot and used for navigation, planning, and the computation of flight dynamics.
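The stabilization problem in 1) can be pictured as a simple feedback loop: the MAV continuously tracks the user and steers toward a fixed offset from them. The following is a minimal sketch of such a controller in Python; the PD gains, the offset, and the state inputs are illustrative assumptions, not the project's actual implementation.

import numpy as np

# Assumed PD gains and desired offset; illustrative values, not project parameters.
KP = np.array([1.2, 1.2, 1.8])       # proportional gains per world axis
KD = np.array([0.6, 0.6, 0.9])       # derivative gains per world axis
OFFSET = np.array([0.0, -1.5, 0.3])  # hover 1.5 m in front of, 0.3 m above the user

def velocity_command(user_pos, user_vel, mav_pos, mav_vel):
    """PD law steering the MAV toward a fixed offset from the tracked user.

    All inputs are 3-vectors in a shared world frame (e.g., from onboard
    tracking); the return value is a commanded velocity for the flight stack.
    """
    target = user_pos + OFFSET
    pos_err = target - mav_pos        # where we should be vs. where we are
    vel_err = user_vel - mav_vel      # also match the user's motion
    return KP * pos_err + KD * vel_err

Keeping the controller anchored to the user's frame of reference is what would allow the projection and gesture-recognition components to assume a roughly constant viewpoint.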


Scientific and social context of the research project

Our work will produce new methods and paradigms for user interaction and human-machine communication. It will also provide initial insights into a future in which people and intelligent robots collaborate in close proximity to each other. The project can therefore fundamentally influence our future everyday lives.



Last update: 24.06.2014

Responsible applicant and co-applicants

Employees

Publications

Publication
Demonstration-Guided Deep Reinforcement Learning of Control Policies for Dexterous Human-Robot Interaction
Christen Sammy, Stevšić Stefan, Hilliges Otmar (2019), Demonstration-Guided Deep Reinforcement Learning of Control Policies for Dexterous Human-Robot Interaction, in 2019 International Conference on Robotics and Automation (ICRA), Montreal, Canada, IEEE, New Jersey.
Sample Efficient Learning of Path Following and Obstacle Avoidance Behavior for Quadrotors
Stevšić Stefan, Nägeli Tobias, Alonso-Mora Javier, Hilliges Otmar (2018), Sample Efficient Learning of Path Following and Obstacle Avoidance Behavior for Quadrotors, in IEEE Robotics and Automation Letters, 3(4), 3852-3859.
Optimizing for aesthetically pleasing quadrotor camera motion
Gebhardt Christoph, Stevšić Stefan, Hilliges Otmar (2018), Optimizing for aesthetically pleasing quadrotor camera motion, in ACM Transactions on Graphics, 37(4), 1-11.
WYFIWYG: Investigating Effective User Support in Aerial Videography
Gebhardt Christoph, Hilliges Otmar (2018), WYFIWYG: Investigating Effective User Support in Aerial Videography, arXiv.org, New York.
AdaM: Adapting Multi-User Interfaces for Collaborative Environments in Real-Time
Park Seonwook, Oulasvirta Antti, Hilliges Otmar, Gebhardt Christoph, Rädle Roman, Feit Anna Maria, Vrzakova Hana, Dayama Niraj Ramesh, Yeo Hui-Shyong, Klokmose Clemens N., Quigley Aaron (2018), AdaM: Adapting Multi-User Interfaces for Collaborative Environments in Real-Time, in the 2018 CHI Conference, Montreal QC, Canada, ACM, New York.
Airways: Optimization-Based Planning of Quadrotor Trajectories according to High-Level User Goals
Gebhardt Christoph, Hepp Benjamin, Nägeli Tobias, Stevšić Stefan, Hilliges Otmar (2016), Airways: Optimization-Based Planning of Quadrotor Trajectories according to High-Level User Goals, in the 2016 CHI Conference, San Jose, California, USA, ACM, New York.

Collaboration

Group / person: Microsoft Research Cambridge
Country: Great Britain and Northern Ireland (Europe)
Types of collaboration:
- in-depth/constructive exchanges on approaches, methods or results
- Publication

Scientific events

Active participation

Title; Type of contribution; Title of article or contribution; Date; Place; Persons involved
2019 International Conference on Robotics and Automation (ICRA); Poster; Demonstration-Guided Deep Reinforcement Learning of Control Policies for Dexterous Human-Robot Interaction; 20.05.2019; Montreal, Canada; Stevšić Stefan
ACM SIGGRAPH 2018; Talk given at a conference; Optimizing for Aesthetically Pleasing Quadrotor Camera Motion; 12.08.2018; Vancouver, Canada; Gebhardt Christoph Martin
Second Max Planck ETH Workshop on Learning Control; Poster; Sample Efficient Learning of Path Following and Obstacle Avoidance Behavior for Quadrotors; 08.02.2018; Zurich, Switzerland; Stevšić Stefan
CS Forum, Aalto University; Individual talk; User-in-the-loop Trajectory Optimization for Aerial Videography; 10.10.2017; Espoo, Finland; Gebhardt Christoph Martin
ACM SIGGRAPH 2017; Talk given at a conference; Real-time Planning for Automated Multi-View Drone Cinematography; 30.07.2017; Los Angeles, United States of America; Hilliges Otmar
ACM CHI 2016; Talk given at a conference; Airways: Optimization-Based Planning of Quadrotor Trajectories according to High-Level User Goals; 07.05.2016; San Jose, United States of America; Gebhardt Christoph Martin


Use-inspired outputs


Start-ups

Name Year

Abstract

In recent years, mobile touch devices have increasingly been replacing or supplementing desktop computing. However, touch devices require direct physical and visual attention, and both are extremely scarce resources in mobile situations. Research on Augmented Reality (AR) and Wearable Computing has long attempted to address these issues with alternative user interfaces that augment the user's perception of the real world. Traditionally, this has been achieved by instrumenting the user with head-worn displays, tracking devices etc. As a result, trade-offs concerning ergonomics, interactive possibilities and data richness were unavoidable.

We propose a radically new type of natural user interface (NUI) that brings together elements from robotics, AR, and ubiquitous computing. We intend to build the "user's flying organizer" (UFO), a semi-autonomous micro aerial vehicle (MAV) equipped with pico-projectors and cameras that allow for projected interactive pixels anywhere in the user's environment. Whilst MAVs have been explored in the context of autonomous flight, this project will for the first time explore how they can be made fully interactive, working in synergy directly with users in real-world environments to enable new application scenarios, including:

1) The spontaneous visualization of data directly on physical surfaces around the user, turning the environment into an ad-hoc smart space augmented with graphics and interactive possibilities. The planned system allows UFOs to navigate freely in the environment, projecting content where needed, without user instrumentation.

2) New forms of mobile computing, allowing a user to "see" around corners or into otherwise physically unreachable locations via the drone, for example in search and rescue scenarios, whilst maintaining access to digital information.

3) New forms of teleconferencing, where a remote participant may leverage the drone to experience the user's surroundings and to project information into the environment to jointly perform a task that would otherwise require co-presence.

The proposed project will have impact on fundamental human-computer interaction (HCI) research. The primary outcomes will include the development of new insights, methods and tools for NUI interaction, their realization in novel algorithms (e.g., new gesture recognition techniques), their implementation as open source software and hardware (e.g., MAV platform and sensing technologies), and their application in challenging real-world scenarios.
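To make the interaction concept concrete, here is a toy sketch of the kind of gesture detection the abstract alludes to: classifying a tracked hand trajectory as a swipe or a push toward projected content. The trajectory format, thresholds, and gesture labels are all assumptions for illustration, not the project's actual algorithms.

import numpy as np

def classify_gesture(traj, min_travel=0.25):
    """Classify a hand trajectory as a swipe or push (toy heuristic).

    traj: (N, 3) array of hand positions in metres, camera frame
    (x right, y down, z away from the camera toward the surface).
    """
    disp = traj[-1] - traj[0]            # net displacement of the hand
    axis = int(np.argmax(np.abs(disp)))  # dominant axis of motion
    if abs(disp[axis]) < min_travel:     # too little travel: not a gesture
        return "none"
    if axis == 0:                        # lateral motion: swipe
        return "swipe_right" if disp[0] > 0 else "swipe_left"
    if axis == 2:                        # motion along the view axis: push/pull
        return "push" if disp[2] > 0 else "pull"
    return "none"                        # vertical motion: ignored in this toy

# Example: a hand moving 0.4 m to the right over 20 samples reads as a swipe.
traj = np.linspace([0.0, 0.0, 0.5], [0.4, 0.0, 0.5], 20)
assert classify_gesture(traj) == "swipe_right"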