
CogMan: Cognitive manipulation

The CogMan project (1) develops computational and control models of pick-and-place tasks in the context of everyday manipulation activities in human environments, (2) implements these models in a control system for the kitchen scenario, and (3) empirically analyzes the impact of this control model on the flexibility, robustness, adaptability, and naturalness of the robot behavior.


Project details

A very impressive aspect of human manipulation of objects in everyday environments is that people select and parametrize their reaching and grasping movements very skillfully, based on the properties of the objects they are to pick up, on the situational circumstances, and on the intended end positions of the objects. When two collaborators act together, their actions are adjusted accordingly. The results are very smooth, predictable, and efficient compound movements. Not only do people perform these actions skillfully, but they are also capable of learning these skills from little experience and of adapting them automatically when needed. Robots that are to serve as robotic assistants and work together with humans need a similar level of manipulation skill and equally powerful means for skill acquisition and adaptation.

Human-acquired motion

Based on experiments analyzing human manipulation tasks from the cognitive psychology point of view, we develop a model of how humans select and parametrize grasps, reaching movements, standing positions, and destinations based on the task context, the object, and the goals of the manipulation activity. Using this information, we learn behavior models from the observed data and interpret the findings in the light of a cognitive control model.
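As a minimal illustration of what such a learned behavior model could look like, the sketch below maps object features to a grasp choice using a nearest-neighbor rule. The features, labels, and observation data are hypothetical placeholders, not the project's actual experimental data or model.

```python
import math

# Hypothetical observations: (object_width_cm, object_weight_g) -> grasp type.
# Feature set and labels are illustrative only.
observations = [
    ((3.0, 50.0), "precision"),   # small, light object: fingertip grasp
    ((2.5, 30.0), "precision"),
    ((9.0, 400.0), "power"),      # large, heavy object: whole-hand grasp
    ((11.0, 600.0), "power"),
]

def predict_grasp(width_cm, weight_g):
    """Pick the grasp type of the nearest observed example (1-NN)."""
    def dist(feat):
        w, g = feat
        # Scale weight so both features span a similar numeric range.
        return math.hypot(w - width_cm, (g - weight_g) / 50.0)
    return min(observations, key=lambda obs: dist(obs[0]))[1]
```

A real behavior model would of course be learned from far richer observation data, but the structure is the same: observed context features in, action parameters out.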

Object models and action language

Using mechanical models of objects, a system can predict what is going to happen to an object when a force is applied to it. A robot can use this knowledge to learn how to exert forces on an object to achieve a specific desired goal. Mapping between this object-level control and a symbolic representation of it can lead to a language that can be used for robot control.
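A simple instance of such a mechanical model is Coulomb friction for a pushed object: the object only starts to slide once the applied horizontal force exceeds the friction limit, and then accelerates under the net force. The sketch below illustrates this idea; it is a textbook friction model, not the project's actual object model, and all parameter values are made up.

```python
G = 9.81  # gravitational acceleration, m/s^2

def predicted_acceleration(force_n, mass_kg, mu):
    """Predict a pushed object's acceleration (m/s^2) under Coulomb friction.

    force_n: horizontal push force in newtons
    mass_kg: object mass in kilograms
    mu:      friction coefficient between object and surface
    """
    friction_limit = mu * mass_kg * G
    if force_n <= friction_limit:
        return 0.0  # friction holds: the object does not move
    return (force_n - friction_limit) / mass_kg
```

With such a forward model, the robot can invert the prediction, i.e. choose a force that yields a desired motion, which is exactly the kind of mapping a symbolic action language could name and compose.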

The novel aspect of the CogMan research approach is that we perform comprehensive experiments on human everyday manipulation activities, taking the context into account. To this end, we use more comprehensive sensor data, including estimates of the full-body pose, hand pose, visual attention, various biosignals, the model and pose of the manipulated object, the force used for lifting, the kind of grasp, the grasp points, the local scene around the object, the activity context, etc.

Constraint-based motion control

When humans perform even very simple reaching movements, they obey a number of constraints, such as avoiding collisions and maximizing robustness in the face of uncertainties in perception and actuation. At the same time, the freedoms available in the redundant human arm and in the task itself are exploited to reduce jerk, execution time, and control effort. This optimization takes "motor noise" into account and leads to very high variation in human movements, even under tightly constrained laboratory conditions.
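A classic formalization of the jerk-reduction idea is the minimum-jerk trajectory of Flash and Hogan (1985), which the sketch below implements for a single coordinate. It is included only to make the optimization principle concrete; CogMan's own motion models are learned from the human data described above.

```python
def minimum_jerk(x0, xf, duration, t):
    """Minimum-jerk position at time t for a point-to-point movement.

    Uses the closed-form Flash & Hogan profile
    s(tau) = 10*tau^3 - 15*tau^4 + 6*tau^5, which starts and ends with
    zero velocity and zero acceleration.
    """
    tau = min(max(t / duration, 0.0), 1.0)   # normalized time in [0, 1]
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return x0 + (xf - x0) * s
```

The profile is symmetric about the midpoint and bell-shaped in velocity, which matches the smooth, predictable character of human reaching movements noted above.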

One goal of CogMan is to extract these constraints from human motions and represent them geometrically. Additional constraint or freedom 'directions' can be taught directly by moving the robot arm to complete this set of constraints. The constraints are represented using the iTaSC framework, which combines them into an instantaneous motion controller. This allows us to start from engineered motion controllers, incrementally add constraints and freedoms, and continuously check the effectiveness of every added constraint.
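The core computation behind such an instantaneous controller can be sketched as follows: each constraint contributes one row of a stacked constraint Jacobian J and one desired constraint-space velocity, and the joint velocities solve J * qdot = v. The toy example below does this for two constraints on a 2-DOF system with an explicit 2x2 inverse; the constraint rows and targets are invented for illustration, and a real iTaSC controller would use a weighted pseudoinverse and handle redundancy and singularities.

```python
def solve_2x2(J, v):
    """Solve J * qdot = v for a 2x2 stacked constraint Jacobian J."""
    (a, b), (c, d) = J
    det = a * d - b * c
    if abs(det) < 1e-12:
        raise ValueError("constraints are conflicting or redundant")
    return ((d * v[0] - b * v[1]) / det,
            (-c * v[0] + a * v[1]) / det)

# Constraint 1: keep end-effector height fixed    -> row (0.0, 1.0), target 0.0
# Constraint 2: advance along the reach direction -> row (1.0, 0.5), target 0.2
J = [(0.0, 1.0), (1.0, 0.5)]
qdot = solve_2x2(J, (0.0, 0.2))
```

Because each constraint is just an extra row, adding or removing one changes the stacked system but not the solver, which is what makes the incremental add-and-check workflow described above possible.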

Acknowledgements

This project is partly funded by CoTeSys.

Publications

Imitating Human Reaching Motions Using Physically Inspired Optimization Principles (Sebastian Albrecht, Karinne Ramirez Amaro, Federico Ruiz-Ugalde, David Weikersdorfer, Marion Leibold, Michael Ulbrich, Michael Beetz), In 11th IEEE-RAS International Conference on Humanoid Robots, 2011.

Journal Articles and Book Chapters

Generality and Legibility in Mobile Manipulation (Michael Beetz, Freek Stulp, Piotr Esden-Tempski, Andreas Fedrizzi, Ulrich Klank, Ingo Kresse, Alexis Maldonado, Federico Ruiz), In Autonomous Robots Journal (Special Issue on Mobile Manipulation), volume 28, 2010. [bib] [pdf]

Conference Papers

The RoboEarth language: Representing and Exchanging Knowledge about Actions, Objects, and Environments (Moritz Tenorth, Alexander Clifford Perzylo, Reinhard Lafrenz, Michael Beetz), In IEEE International Conference on Robotics and Automation (ICRA), 2012.(Best Cognitive Robotics Paper Award.) [bib] [pdf]
Improving robot manipulation through fingertip perception (Alexis Maldonado, Humberto Alvarez-Heredia, Michael Beetz), In IEEE International Conference on Intelligent Robots and Systems (IROS), 2012. [bib] [pdf]
Movement-aware Action Control -- Integrating Symbolic and Control-theoretic Action Execution (Ingo Kresse, Michael Beetz), In IEEE International Conference on Robotics and Automation (ICRA), 2012. [bib] [pdf]
A Generalized Framework for Opening Doors and Drawers in Kitchen Environments (Thomas Rühr, Jürgen Sturm, Dejan Pangercic, Michael Beetz, Daniel Cremers), In IEEE International Conference on Robotics and Automation (ICRA), 2012. [bib] [pdf]
Multimodal Autonomous Tool Analyses and Appropriate Application (Ingo Kresse, Ulrich Klank, Michael Beetz), In 11th IEEE-RAS International Conference on Humanoid Robots, 2011. [bib] [pdf]
Fast adaptation for effect-aware pushing (Federico Ruiz-Ugalde, Gordon Cheng, Michael Beetz), In 11th IEEE-RAS International Conference on Humanoid Robots, 2011. [bib] [pdf]
Robotic Roommates Making Pancakes (Michael Beetz, Ulrich Klank, Ingo Kresse, Alexis Maldonado, Lorenz Mösenlechner, Dejan Pangercic, Thomas Rühr, Moritz Tenorth), In 11th IEEE-RAS International Conference on Humanoid Robots, 2011. [bib] [pdf]
Prediction of action outcomes using an object model (Federico Ruiz-Ugalde, Gordon Cheng, Michael Beetz), In 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2010. [bib] [pdf]
Robotic grasping of unmodeled objects using time-of-flight range data and finger torque information (Alexis Maldonado, Ulrich Klank, Michael Beetz), In 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2010. [bib] [pdf]
Learning and Performing Place-based Mobile Manipulation (Freek Stulp, Andreas Fedrizzi, Michael Beetz), In Proceedings of the 8th International Conference on Development and Learning (ICDL), 2009. [bib] [pdf]
Compact Models of Motor Primitive Variations for Predictable Reaching and Obstacle Avoidance (Freek Stulp, Erhan Oztop, Peter Pastor, Michael Beetz, Stefan Schaal), In 9th IEEE-RAS International Conference on Humanoid Robots, 2009. [bib] [pdf]
Compact Models of Human Reaching Motions for Robotic Control in Everyday Manipulation Tasks (Freek Stulp, Ingo Kresse, Alexis Maldonado, Federico Ruiz, Andreas Fedrizzi, Michael Beetz), In Proceedings of the 8th International Conference on Development and Learning (ICDL), 2009. [bib] [pdf]
Combining Analysis, Imitation, and Experience-based Learning to Acquire a Concept of Reachability (Freek Stulp, Andreas Fedrizzi, Franziska Zacharias, Moritz Tenorth, Jan Bandouch, Michael Beetz), In 9th IEEE-RAS International Conference on Humanoid Robots, 2009. [bib] [pdf]
Action-Related Place-Based Mobile Manipulation (Freek Stulp, Andreas Fedrizzi, Michael Beetz), In Proceedings of the International Conference on Intelligent Robots and Systems (IROS), 2009. [bib] [pdf]
Transformational Planning for Mobile Manipulation based on Action-related Places (Andreas Fedrizzi, Lorenz Moesenlechner, Freek Stulp, Michael Beetz), In Proceedings of the International Conference on Advanced Robotics (ICAR), 2009. [bib] [pdf]
How humans optimize their interaction with the environment: The impact of action context on human perception. (Agnieszka Wykowska, Alexis Maldonado, Michael Beetz, Anna Schuboe), In Progress in Robotics. Proceedings of the FIRA RoboWorld Congress, 2009. [bib] [pdf]
Obstacle avoidance in a pick-and-place task (Jun Li, Alexis Maldonado, Michael Beetz, Anna Schuboe), In Proceedings of the 2009 IEEE Conference on Robotics and Biomimetics, 2009. [bib]
Subsequent Actions Influence Motor Control Parameters of a Current Grasping Action (Anna Schubö, Alexis Maldonado, Sonja Stork, Michael Beetz), In IEEE 17th International Symposium on Robot and Human Interactive Communication (RO-MAN), Muenchen, Germany, 2008. [bib]
The Assistive Kitchen -- A Demonstration Scenario for Cognitive Technical Systems (Michael Beetz, Freek Stulp, Bernd Radig, Jan Bandouch, Nico Blodow, Mihai Dolha, Andreas Fedrizzi, Dominik Jain, Uli Klank, Ingo Kresse, Alexis Maldonado, Zoltan Marton, Lorenz Mösenlechner, Federico Ruiz, Radu Bogdan Rusu, Moritz Tenorth), In IEEE 17th International Symposium on Robot and Human Interactive Communication (RO-MAN), Muenchen, Germany, 2008.(Invited paper.) [bib] [pdf]

Workshop Papers

Robotic Roommates Making Pancakes - Look Into Perception-Manipulation Loop (Michael Beetz, Ulrich Klank, Alexis Maldonado, Dejan Pangercic, Thomas Rühr), In IEEE International Conference on Robotics and Automation (ICRA), Workshop on Mobile Manipulation: Integrating Perception and Manipulation, 2011. [bib] [pdf]
The Assistive Kitchen -- A Demonstration Scenario for Cognitive Technical Systems (Michael Beetz, Jan Bandouch, Alexandra Kirsch, Alexis Maldonado, Armin Müller, Radu Bogdan Rusu), In Proceedings of the 4th COE Workshop on Human Adaptive Mechatronics (HAM), 2007. [bib] [pdf]

research/cogman.txt · Last modified: 2014/02/19 19:51 by ramirezk