Invitation to the
Hamburg Computer Science Colloquium and Cinacs Colloquium
Monday, 18 January 2010
at 5 p.m. c.t.
Vogt-Kölln-Straße 30
Konrad-Zuse-Hörsaal
Building B
Prof. Dr.-Ing. Rüdiger Dillmann
Institut für Anthropomatik,
Humanoids and Intelligence Systems Lab.,
Karlsruher Institut für Technologie - KIT
Humanoid Robots Learning Sensorimotor Skills and Task Knowledge from Active Observation of Human Demonstration
====================================================================================
Humanoid robot systems are designed to interact with humans by observing human activities and conversing about the task to be done, how to do it, and finally how to execute it. In addition, humanoids have to act in a goal-oriented manner but must also be capable of reacting competently to disturbances or unexpected events, e.g. a context change. Such robot systems operate in dynamic, human-centered scenarios which require capabilities such as adaptivity, perception, categorisation, action and learning. Examples of such systems are humanoids that interact in a human-centered environment and cooperate with humans. The behaviour of such a robot is characterized by active observation processes, fusion of multimodal sensor data, perception, categorisation, interpretation, and deciding how to select appropriate actions and initiate them in a superimposed control system. Here, the learning and shaping of sensory and motor abilities as well as the active observation and interpretation of situations and actions are of major interest.
A proven approach to learning knowledge about actions and sensorimotor abilities is to acquire it by observing human activities, trying to imitate and understand these abilities and to transfer them into the memory of the robot. This requires human motion capture, observation of the physical interaction, tracking of object state transitions, and observation of spatial and physical relations between objects. In this way, it is possible to acquire so-called "skills", situative knowledge as well as task knowledge. New terms, new objects and situations, even new types of motion can be learned with the help of a human tutor, and in addition existing knowledge may be modified interactively via multimodal communication channels. The term multimodality describes communication channels which are intuitive for humans, such as language, gesture and haptics, including physical human-robot contact. The field of programming by demonstration has evolved strongly in response to the need to generate flexible programs for humanoid robots and is largely driven by attempts to model human behaviour and map it onto humanoid robots. It comprises a broad set of observation techniques processing large sets of data from high-speed camera systems, lasers, data gloves and even exoskeleton devices. Some systems operate with precise a priori models; others use statistical approaches to approximate human behaviour from observations. Observation serves to identify objects, motion and action in space and time, as well as interaction with the environment and its effects. From this observation, useful regularities, time histories, relational and situative structures and their interpretation in a given context can be derived. Some efficient systems have been developed which combine active sensing and computational learning techniques supported by multimodal dialogues.
They are capable of enriching and expanding the semantic system level, use episode memorisation techniques, and can map situation-dependent strategies, making use of the learned knowledge to adapt the robot controls to emerging situations. One important paradigm is that object and action representations cannot be separated and form the building blocks of cognitive robot system behaviour. Thus, so-called object-action complexes (OACs) can be derived to unify different sensor, actuator and object representations, including language, and allow the robot to understand its environment.
A "Humanoid Robot" consisting of a mobile platform and a flexible torso equipped with a two-arm system with five-finger hands and a head with visual and acoustic sensors is being implemented to behave similarly to a human. Human-robot cooperation requires the detection of human aims and intentions in order to identify the actual context with reference to already memorized, jointly accomplished actions and to apply this knowledge to the individual situation in the correct way. The status of the Humanoid Robot Project (SFB-588 "Humanoid Robots") is outlined and the achieved results are discussed.
Contact:
Prof. Dr. Jianwei Zhang
zhang@informatik.uni-hamburg.de
Tel. 42883-2431
---------------------------------------------------------------
Schedule at: http://www.informatik.uni-hamburg.de/Info/Kolloquium/
_______________________________________________
Kolloquium mailing list
Kolloquium@mailhost.informatik.uni-hamburg.de
https://mailhost.informatik.uni-hamburg.de/mailman/listinfo/kolloquium