Title: STAMINA

Description: Part handling during the assembly stages in the automotive industry is the only task with automation levels below 30%, owing to the variability of production and the diversity of suppliers and parts. Fully automating this task will not only have a huge impact on the automotive industry but will also act as a cornerstone in the development of advanced mobile robotic manipulators capable of dealing with unstructured environments, thus opening new possibilities for manufacturing SMEs in general. The STAMINA project will take a holistic approach by partnering with experts in each of the necessary key fields, building on previous R&D to develop a fleet of autonomous and mobile industrial robots with different sensory, planning and physical capabilities for jointly solving three logistics and handling tasks: De-palletizing, Bin-Picking and Kitting. The robot and orchestration systems will be developed in a lean manner using an iterative series of development and validation tests that will not only assess the performance and usability of the system but also allow goal-driven research. STAMINA will give special attention to system integration, promoting and assessing the development of a sustainable and scalable robotic system to ensure a clear path for the future exploitation of the developed technologies. In addition to the technological outcome, STAMINA will give an impression of how the sharing of work and workspace between humans and robots could look in the future.






Title: RADHAR

Description: RADHAR will develop a driving assistance system involving environment perception, driver perception and modelling, and robot decision making. RADHAR proposes a framework that seamlessly fuses the inherently uncertain information from environment perception and the driver's steering signals by estimating the trajectory the robot should execute, and uses this fused information for safe navigation with a level of autonomy adjusted to the user's capabilities and desires. This requires lifelong, unsupervised but safe learning by the robot. As a consequence, a continuous interaction between two learning systems (the robot and the user) will emerge, hence Robotic ADaptation to Humans Adapting to Robots (RADHAR). The framework will be demonstrated on a robotic wheelchair platform that navigates in an everyday environment with everyday objects. The main scientific outcomes targeted by RADHAR are online 3D perception combining laser scanners and vision with traversability analysis of the terrain, and a novel paradigm for fusing environment and user perception for safe robot navigation.
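
As an illustration of the fusion idea, the sketch below blends the driver's steering command with the robot's own trajectory suggestion using a simple precision-weighted average, so the more certain source dominates. It is a minimal, hypothetical Python example, not RADHAR's actual estimator; all function and variable names are invented for illustration.

    import numpy as np

    def blend_commands(user_cmd, robot_cmd, user_var, robot_var):
        # Fuse two (v, omega) commands treated as independent Gaussian
        # estimates: the precision-weighted mean lets the lower-variance
        # (more confident) source dominate, i.e. more autonomy when the
        # user's signal is noisy.  (Hypothetical example, not RADHAR code.)
        w_user = 1.0 / np.asarray(user_var)
        w_robot = 1.0 / np.asarray(robot_var)
        return (w_user * user_cmd + w_robot * robot_cmd) / (w_user + w_robot)

    # A hesitant joystick push (high variance) vs. a confident local planner.
    user_cmd = np.array([0.2, 0.5])     # desired linear, angular velocity
    robot_cmd = np.array([0.4, 0.1])
    print(blend_commands(user_cmd, robot_cmd,
                         user_var=[0.4, 0.4], robot_var=[0.05, 0.05]))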






Title: TAPAS

Description: Robotics-enabled Logistics and Assistive Services for the Transformable Factory of the Future (TAPAS) is a project funded by the European Commission within FP7. The goal of TAPAS is to pave the way for a new generation of transformable automation and logistics solutions for small and large series production that are economically viable and flexible, regardless of changes in volumes and product type. TAPAS pioneers and validates key components to realize this vision: mobile robots with manipulation arms will automate logistic tasks more flexibly and more completely by not only transporting, but also collecting needed parts and delivering them right to the place where they are needed. TAPAS robots will even go beyond moving parts around the shop floor to create additional value: they will automate assistive tasks that naturally extend the logistic tasks, such as preparatory and post-processing work, e.g., pre-assembly or machine tending with inherent quality control. TAPAS robots might initially be more expensive than other solutions, but through this additional creation of value and through faster adaptation to changes with new levels of robustness, availability, and completeness of jobs, TAPAS robots promise to yield an earlier return on investment.






Title: First-MM

Description: Flexible Skill Acquisition and Intuitive Robot Tasking for Mobile Manipulation in the Real World (First-MM) is a project funded by the European Commission within FP7. The goal of First-MM is to build the basis for a new generation of autonomous mobile manipulation robots that can flexibly be instructed to perform complex manipulation and transportation tasks. The project will develop a novel robot programming environment that allows even non-expert users to specify complex manipulation tasks in real-world environments. In addition to a task specification language, the environment includes concepts for probabilistic inference and for learning manipulation skills from demonstration and from experience. The project builds upon and extends recent results in robot programming, navigation, manipulation, perception, learning by instruction, and statistical relational learning to develop advanced technology for mobile manipulation robots that can flexibly be instructed, even by non-expert users, to perform challenging manipulation tasks in real-world environments.
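
To give a flavour of what instructing a robot through a high-level task specification could look like, the hypothetical Python sketch below composes a fetch-and-deliver task from symbolic steps. It is not the First-MM task specification language; all primitives and names are invented for illustration.

    from dataclasses import dataclass

    @dataclass
    class Step:
        action: str   # high-level primitive, e.g. "navigate", "pick", "place"
        target: str   # symbolic object or location name

    def fetch_task(obj, source, destination):
        # Compose a simple fetch-and-deliver task from symbolic steps;
        # a real system would ground these symbols via perception and
        # execute them with learned manipulation skills.
        return [
            Step("navigate", source),
            Step("pick", obj),
            Step("navigate", destination),
            Step("place", destination),
        ]

    for step in fetch_task("cup", "kitchen_table", "office_desk"):
        print(step.action, "->", step.target)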






Title: Lifenav

Description: The LifeNav project will develop the fundamental approaches required to design mobile robot systems that can reliably operate over extended periods of time in complex and dynamically changing environments. To achieve this, robots need the ability to learn and update appropriate models of their environment, including its dynamic aspects, and to effectively incorporate all this information into their decision-making processes. Within LifeNav we will develop effective, object-oriented three-dimensional representations that cover all aspects of the dynamic environment required for reliable and long-term mobile robot navigation. The outcome of this research will be relevant for all applications that build on autonomous navigation in real-world scenarios, including autonomous robots, mobile manipulation, transportation systems, and autonomous cars. LifeNav will demonstrate its capabilities in three different scenarios: dynamically changing office and factory environments as well as urban settings and rough terrain. As a challenging test case, we will send the robot from downtown Freiburg to the nearby mountain Schauinsland, which has an elevation of 1,260 m and corresponds to a height difference of 1,000 m. The overall path has a length of more than 20 km and includes highly challenging footpaths through the forest with severe GPS outages.
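
As a purely illustrative example of representing the dynamic aspects of an environment, the Python sketch below shows a grid cell that counts how often its occupancy changes, giving the map a simple per-cell measure of dynamics. This is not LifeNav's representation; all names are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class DynamicCell:
        occupied: bool = False
        observations: int = 0
        changes: int = 0

        def update(self, observed_occupied: bool) -> None:
            # Record one observation and count how often the cell flips state.
            if self.observations > 0 and observed_occupied != self.occupied:
                self.changes += 1
            self.occupied = observed_occupied
            self.observations += 1

        def dynamics_rate(self) -> float:
            # Fraction of observations in which the cell changed state.
            return self.changes / self.observations if self.observations else 0.0

    cell = DynamicCell()
    for obs in [False, False, True, True, False]:  # e.g. a parked car comes and goes
        cell.update(obs)
    print(cell.dynamics_rate())  # 0.4 -> a fairly dynamic part of the map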






Title: BACS

Description: Contemporary robots and other cognitive artifacts are not yet ready to autonomously operate in complex real-world environments. One of the major reasons for this failure in creating cognitive situated systems is the difficulty of handling incomplete knowledge and uncertainty. Taking inspiration from the brains of mammals, including humans, the BACS project will investigate and apply Bayesian models and approaches in order to develop artificial cognitive systems that can carry out complex tasks in real-world environments. The Bayesian approach will be used to model different levels of brain function within a coherent framework, from neural functions up to complex behaviors. The Bayesian models will be validated and adapted as necessary according to neurophysiological data from rats and humans and through psychophysical experiments on humans. The Bayesian approach will also be used to develop four artificial cognitive systems concerned with (i) autonomous navigation, (ii) multi-modal perception and reconstruction of the environment, (iii) semantic facial motion tracking, and (iv) human body motion recognition and behavior analysis. The conducted research shall result in a consistent Bayesian framework offering enhanced tools for probabilistic reasoning in complex real-world situations. Its performance will be demonstrated through applications to driver assistance systems and 3D mapping, both very complex real-world tasks. BACS is an Integrated Project under the 6th Framework Programme of the European Commission running from January 2006 to February 2010.
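
To illustrate the kind of Bayesian reasoning the project builds on, the Python sketch below implements a generic discrete Bayes filter: a belief over positions is updated with a noisy observation and then blurred by motion uncertainty. It is a textbook-style example with assumed toy numbers, not BACS project code.

    import numpy as np

    def bayes_update(belief, likelihood):
        # Measurement step: posterior is proportional to likelihood * prior.
        posterior = belief * likelihood
        return posterior / posterior.sum()

    def predict(belief, kernel):
        # Motion step: motion uncertainty spreads the belief out.
        spread = np.convolve(belief, kernel, mode="same")
        return spread / spread.sum()

    belief = np.full(10, 0.1)                        # uniform prior over 10 cells
    likelihood = np.array([0.1, 0.1, 0.1, 0.8, 0.8,
                           0.1, 0.1, 0.1, 0.1, 0.1])  # toy sensor model
    belief = bayes_update(belief, likelihood)         # incorporate a measurement
    belief = predict(belief, np.array([0.1, 0.8, 0.1]))
    print(belief.round(3))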






Title: SmartTer

Description: A joint project between ETH Zurich, EPFL in Lausanne, and the University of Freiburg to build an autonomous car.