
Representations for object grasping and manipulation

Objectives

Object grasping and manipulation is of significant importance for robot systems interacting with the environment. However, there is no general, widely accepted representation of sensory data that can be used across different applications and different embodiments. Some systems rely only on visual feedback; others integrate visual and haptic feedback, but only in very constrained situations. The goal of this workshop is to bring together researchers from computer vision, machine learning and control to discuss future avenues in the area of sensory-based object grasping and manipulation. Of special interest are systems that build on multiple sensory inputs and have been demonstrated to perform in natural settings with different types of objects.
The increasing demand for robotic applications in dynamic and unstructured environments and novel situations motivates the need for dexterous robot hands and grasping abilities that can cope with the wide variety of tasks and objects encountered in such environments. Thus, we ask: "Where is the robot that fills a shopping bag and empties it at home?" Compared to humans or primates, the abilities of today's robotic grippers and hands are surprisingly limited, and their dexterity cannot be compared to human hand capabilities. Contemporary robotic hands can grasp only a few objects, in constrained poses, with limited grasping postures and positions.
The main objective of this workshop is to discuss what is needed to design a robot system capable of performing grasping and manipulation tasks in open-ended environments, dealing with novelty, uncertainty and unforeseen situations. The design of such a system must take into account three important requirements: i) it has to rest on a solid theoretical basis, ii) it has to be extensively evaluated on a suitable and measurable foundation, iii) thus allowing for self-understanding and self-extension. As an outcome, we expect the participating researchers to provide insight into what is needed for robotic systems to reason about graspable targets, to explore and investigate their physical properties, and finally to make artificial hands grasp any object. In particular, we are interested in discussing mathematical models of uncertainty and sensor integration, as well as plausible machine learning, computer vision and control strategies.
Topics of interest
  • Active vision systems
  • Detection and classification of objects in natural scenes
  • Representations of articulated and deformable objects
  • Grasping and manipulation of natural objects
  • Haptic control
  • Uncertainty in grasping
  • Sensor integration for grasping
  • Error detection and recovery in object grasping
  • Integrated object and action representations
  • Machine learning for grasping
Invited Speakers
  • Peter K. Allen, Columbia University, USA
  • Rod Grupen, University of Massachusetts, USA
  • Jan Peters, Max-Planck-Institute for Biological Cybernetics, Germany
  • Benjamin Kuipers, University of Michigan, USA
  • Nancy Pollard, Carnegie Mellon University, USA
  • Erhan Oztop, ATR, Computational Neuroscience Labs, Japan
  • Matei Ciocarlie and Radu Bogdan Rusu, Willow Garage
Program
9:00 Welcome and Introduction
Danica Kragic, Tamim Asfour and Rüdiger Dillmann
9:05 Data-Driven Grasping (abstract)
Peter K. Allen
9:35 Analysis of Human Grasping Using Self-Organizing Map
Q. Fu, R.P. Wong, J. Si, M. Santello
9:55 Understanding Manifolds of Grasping Actions (abstract)
J. Romero, T. Feix, H. Kjellström and D. Kragic
10:15 Motor synergies in grasping real and virtual objects
B. Bläsing, J. Maycock, T. Bockemühl, H. Ritter and T. Schack
10:35 Break
10:50 How Shall We Learn How to Learn How to Grasp (abstract)
B. Kuipers
11:20 Can we learn from biology about object representation for grasping and manipulation? (abstract)
Erhan Oztop
11:40 Human-inspired manipulation using pre-grasp object interaction (abstract)
L. Chang and N. Pollard
12:00 Combining Perception and Manipulation in ROS (abstract)
R.B. Rusu and M. Ciocarlie
12:20 Humanoid Grasping and Manipulation (abstract)
T. Asfour, A. Ude, N. Krueger, J. Piater, D. Kragic and R. Dillmann
12:40 Lunch
14:00 Learning Approaches for Grasping (abstract)
Jan Peters, Oliver Kroemer, Renaud Detry, Justus Piater
14:30 Learning Motion Dynamics to Catch a Flying Object (abstract)
S. Kim, E. Gribovskaya and A. Billard
14:50 Integrating Tactile Sensors into the Hands of the Humanoid Robot iCub (abstract)
A. Schmitz, M. Maggiali, L. Natale and G. Metta
15:10 Visually and haptically controlled skills for the dextrous manipulation of humanoid robots (abstract)
G. Milighetti and H.-B. Kuntze
15:30 Break
15:50 Definition of actuation and kinematics capabilities of robotic hands for grasping and manipulation of common objects (abstract)
G. Palli, G. Borghesan, C. Melchiorri, G. Berselli and G. Vassura
16:10 Modeling the Role of Passive Dynamics of Hands in Grasping and Manipulation (abstract)
A. Deshpande
16:30 Fast and Reliable Contact Computations for Grasp Planning (abstract)
Y.J. Kim, M. Tang, Z. Xue and D. Manocha
16:50 Recognition and Execution of Manipulations (abstract)
F. Wörgötter
17:20 End
Submission of abstracts
Prospective participants are required to submit a one-page abstract by March 7, 2010. Please send your abstract directly to the workshop organizers at asfour (at) kit.edu.
Organizers
  • Danica Kragic, Royal Institute of Technology (KTH), Sweden
  • Tamim Asfour, Karlsruhe Institute of Technology (KIT), Germany
  • Antonio Morales, Universitat Jaume I, Spain
  • Ville Kyrki, Lappeenranta University of Technology, Finland
  • Markus Vincze, Vienna University of Technology, Austria
  • Rüdiger Dillmann, Karlsruhe Institute of Technology (KIT), Germany
Contact
Tamim Asfour
Karlsruhe Institute of Technology (KIT), Institute for Anthropomatics
Humanoids and Intelligence Systems Lab (IAIM), Prof. Dillmann
Adenauerring 2
76131 Karlsruhe

E-mail: asfour(at)kit.edu