
Neurobots

Summary

Brain-controlled assistive robots hold the promise of restoring autonomy to paralyzed patients. Existing approaches are based on low-level, continuous control of robotic devices, resulting in a high cognitive load for their users. In the NeuroBots project, in contrast, we enhance prosthetic devices with a certain degree of autonomy and adaptivity to enable control on a higher cognitive level. To achieve this, we develop new methods and technologies in core areas of brain-machine interfaces, as well as artificial intelligence. This includes innovative approaches to brain-signal decoding with deep neural networks, efficient motion planning and improved perception for mobile robots and manipulators, novel methods for deep reinforcement learning, hierarchical planning with user feedback, and evaluation of formal methods for safety guarantees. The different components are continually integrated in an architecture based on the Robot Operating System (ROS), realizing a demonstrator of the BrainLinks-BrainTools LiNC concept.
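The control flow described above (decoded brain signals selecting high-level goals that are then expanded into robot actions) can be illustrated with a minimal sketch. All class names, commands, and the scoring scheme below are hypothetical placeholders, not the NeuroBots implementation; the real system uses deep-network decoders and a hierarchical planner integrated via ROS.

```python
# Minimal closed-loop decode -> plan -> act sketch.
# Command names and plans are illustrative, not the project's actual code.

HIGH_LEVEL_COMMANDS = ["fetch_cup", "bring_to_user", "idle"]

def decode(signal_scores):
    """Pick the high-level command with the largest decoder score.

    In the real system, scores would come from a deep neural network
    applied to the recorded brain signals; here they are given directly.
    """
    idx = max(range(len(signal_scores)), key=signal_scores.__getitem__)
    return HIGH_LEVEL_COMMANDS[idx]

def plan(command):
    """Expand a high-level command into primitive robot actions.

    Stands in for the hierarchical planner; a real planner would use
    world knowledge rather than a fixed lookup table.
    """
    plans = {
        "fetch_cup": ["navigate(kitchen)", "grasp(cup)"],
        "bring_to_user": ["navigate(user)", "handover(cup)"],
        "idle": [],
    }
    return plans[command]

def closed_loop_step(signal_scores):
    """One iteration of the loop: decode the user's intent, then plan."""
    command = decode(signal_scores)
    return command, plan(command)

command, actions = closed_loop_step([0.1, 0.8, 0.1])
print(command, actions)
```

The point of the sketch is the division of labor: the user issues only a high-level intent, while autonomy (planning, perception, motion) fills in the low-level steps, which is what keeps the cognitive load low.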

Research Status

The most outstanding result of the project is a fully integrated system that realizes the BrainLinks-BrainTools LiNC concept, combining state-of-the-art online decoding of neuronal control signals with deep neural networks, high-level hierarchical planning with a graphical user interface based on the planner's world knowledge, as well as novel perception and low-level planning algorithms for mobile robots. Improved high-level brain-signal decoding and closed-loop human-robot interaction are the focus of current research.

Project Website

See Neurobots - Brain-controlled intelligent robotic devices

Contributing Research Groups

ECMR 2017: Acting Thoughts: Towards a Mobile Robotic Service Assistant for Users with Limited Communication Skills

Abstract

As autonomous service robots become more affordable and thus available also to the general public, there is a growing need for user-friendly interfaces to control the robotic system. Currently available control modalities typically expect users to be able to express their desire through either touch, speech, or gesture commands. While this requirement is fulfilled for the majority of users, paralyzed users may not be able to use such systems. In this paper, we present a novel framework that allows these users to interact with a robotic service assistant in a closed-loop fashion, using only thoughts. The system is composed of several interacting components, i.e., non-invasive neuronal signal recording and decoding, high-level task planning, motion and manipulation planning, as well as environment perception. In various experiments, we demonstrate its applicability and robustness in real-world scenarios, considering fetch-and-carry tasks and tasks involving human-robot interaction. As our results demonstrate, our system is capable of adapting to frequent changes in the environment and reliably completing given tasks within a reasonable amount of time. Combined with high-level planning and autonomous robotics, interesting new perspectives open up for non-invasive BCI human-robot interactions.

Paper

  • Felix Burget, Lukas Dominique Josef Fiederer, Daniel Kuhner, Martin Völker, Johannes Aldinger, Robin Tibor Schirrmeister, Chau Do, Joschka Boedecker, Bernhard Nebel, Tonio Ball, Wolfram Burgard
    Acting Thoughts: Towards a Mobile Robotic Service Assistant for Users with Limited Communication Skills
    Proceedings of the IEEE European Conference on Mobile Robots (ECMR), Paris, France, 2017
    Download BibTeX
