Ha, speaking of Honda!  (They fund this lab)

This place is *great*.  It's in the middle of nowhere, yet they get 
unbelievably good people passing through.

Joanna

---------- Forwarded message ----------
Date: Thu, 24 Apr 2008 13:33:26 +0200
From: Carola Haumann <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
Subject: [robotics-worldwide] 2 Scholarships for Postdocs at CoR-Lab / Bielefeld University

Apologies for multiple postings

2 Scholarships for Postdocs at CoR-Lab / Bielefeld University

The CoR-Lab has been established at Bielefeld University, Germany, as a research
centre for intelligent systems and human-machine interaction. The CoR-Lab forms
a strategic partnership between Bielefeld University and the Honda Research
Institute Europe GmbH, Germany. It pursues fundamental research in the field of
cognitive robots and intelligent systems, where the Honda humanoid robot ASIMO
is available as an advanced technological platform. A particular focus of the
CoR-Lab is the interdisciplinary integration of expertise in engineering,
computer science, brain science, and cognitive sciences, including the
humanities and social sciences.

The Graduate School that is associated with the CoR-Lab provides an exciting and
stimulating environment for enthusiastic and creative postdocs, allowing them to
pursue research in international teams in close collaboration with an industrial
research institute. The CoR-Lab Graduate School offers 2 scholarships for
postdocs. We invite applications from researchers who hold a doctoral degree
(Dr./Ph.D.) and meet the qualifications listed in detail below for either
position. Fluency in English is required.

A complete application should include certificates and transcripts of records of
the completed course of studies, a CV, a cover letter describing the applicant's
qualifications and motivation to do research in the Graduate School, and a short
description of research interests with regard to one of the following two
projects:

***************************************************************

* Implicit semantic transmission in social learning: Analysis and modeling

The social context of learning has gained increasing attention in developmental
psychology, cognitive science and robotics. It has been proposed that an agent -
in order to learn - needs to be grounded in a meaningful embodied activity.
Robotics research has only recently begun to benefit from developmental
approaches: orienting towards 'learning by communicating' offers new learning
paradigms, within which it can be analyzed how semantic information is
transmitted and what effect the mode of transmission has on learning. So far
this paradigm has involved face-to-face scenarios in which a tutor focuses on a
single student. However, this learning situation is not found in every culture.
Instead, developmental research has shown that children can also benefit from
other scenarios. Motivated by animal studies, e.g. Irene Pepperberg's work on
grey parrots trained in a social learning paradigm (the model/rival paradigm),
our goal is to investigate multi-party learning scenarios in which the tutor
does not address the student directly; instead, the student learns while
observing tutoring behaviour directed towards another person. Our assumption is
thus that learning can take place through both direct and indirect teaching.

In this project, we will investigate the behaviour of tutors and students and
study the learning effects achieved in different situations of social learning.
Based on data gathered in psychophysical experiments on both direct and indirect
teaching scenarios, we aim to identify different verbal and non-verbal patterns,
e.g. naming an object or showing an object. Following the identification and
classification of these patterns, we aim to develop a generative model for their
production. The purpose of this model is twofold. Firstly, it will allow us to
set up a virtual tutor, which can be used to create simulated dialogues in which
the virtual tutor replaces the real tutor or tutors and an additional avatar
replaces the child. Secondly, building a generative model of the tutor's
behaviour will allow us to better understand the underlying principles of
learning in a social context, and the insights from the modelling will provide
valuable feedback on the design of the psychophysical experiments.

The results of this research should enable the setup of a social interaction
simulation environment in which reproducible experiments between tutor avatars
and a robotic artefact can be performed. These experiments will make it possible
to test new hypotheses on how social learning takes place.

***************************************************************

* Autonomous Exploration of Manual Interaction Space

We gradually increase our manual competence by exploring the manual interaction
spaces of many different kinds of objects. This is an active process that is
very different from passive perception of "samples". The availability of
humanoid robot hands offers the opportunity to investigate different strategies
for such active exploration in realistic settings. In the present project, such
strategies shall be investigated from the perspective of "multimodal
proprioception": correlating joint angles, partial contact information from
touch sensors and joint torques, as well as visual information about changes in
finger and object position, in such a way as to make predictions about "useful
aspects" for shaping the ongoing interaction.

To make this very ambitious goal approachable within the resource bounds of a
single project, we will focus on an interesting and important special case of
manual interaction spaces: "visually supervised object-in-hand manipulation".
In particular, one could consider rotating an object, e.g. a cube, within the
hand such that certain faces become visible one after the other.

This project crucially involves combining visual information with proprioceptive
feedback as the fingers explore the faces and edges of the object. A major goal
of the project is to implement a "vertical slice" of explorative skills, ranging
from low-level finger control and visual perception within an object category,
through chunking a limited set of action primitives, to planning short action
sequences.

Generic insights are expected on how visual and haptic information must be
combined to drive the exploration process, and on suitable principles for
shaping the exploration, such as reinforcement learning, active learning driven
by information maximization, or imitation of previously learnt episodes (rather
than purely statistical learning).

Research experience in one or more of the following areas is appreciated:
visual perception, robot control, reinforcement learning, active learning, and
neural networks.

***************************************************************

For more information please see:
http://www.cor-lab.de/corlab/html/graduate_school/index.php

Please send your application by 13 May 2008 (preferably in PDF format) to the
Managing Director of the Graduate School:

email: [EMAIL PROTECTED]

Bielefeld University
CoR-Lab Graduate School
Dr. Carola Haumann
33594 Bielefeld
Germany


_______________________________________________
robotics-worldwide mailing list
[EMAIL PROTECTED]
http://duerer.usc.edu/mailman/listinfo/robotics-worldwide




_______________________________________________
BAI mailing list
BAI@lists.cs.bath.ac.uk
http://lists.cs.bath.ac.uk/mailman/listinfo/bai
