Humanoids
16-864
Tuesdays 6-9pm, NSH 3002
Spring 2001


Readings


For Jan. 16: Organizational meeting; no readings.


For Jan. 23: Legged locomotion:

Pratt, Jerry and Pratt, Gill (1999). Exploiting Natural Dynamics in the Control of a 3D Bipedal Walking Simulation. Proceedings of the International Conference on Climbing and Walking Robots (CLAWAR99), Portsmouth, UK, September 1999. Available from www.ai.mit.edu/projects/leglab/publications/publications.html

The Development of Honda Humanoid Robot, K. Hirai, M. Hirose, Y. Haikawa, T. Takenaka, ICRA 1998, pp. 1321-1326. Available as a scanned PDF (in low-resolution IEEE format) or as hardcopy outside Hodgins's office, NSH 4203, on the shelf next to the NSH 4203 sign.


For Jan. 30: Legged locomotion II:

J. J. Kuffner, S. Kagami, M. Inaba, and H. Inoue. Dynamically-stable motion planning for humanoid robots. In Proc. of 1st Int. Conf. on Humanoid Robotics (Humanoids'00), Boston, MA, September 2000. Available from the publications page of www.jsk.t.u-tokyo.ac.jp/~kuffner/humanoid

Waseda Biped Humanoid Robots Realizing Human-like Motion.

Here is another Waseda paper you might want to take a look at: Control to Realize Human-like Walking of a Biped Robot


For Feb. 6: Primitives:

Automated Derivation of Primitives for Movement Classification, Fod, Mataric, and Jenkins. IEEE-RAS International Conference on Humanoid Robotics (Humanoids-2000), MIT, Cambridge, MA, Sep 7-8, 2000. This paper and additional papers are available from this page.

Nonlinear Dynamical Systems as Movement Primitives, Schaal, Kotosaka, and Sternad, IEEE-RAS International Conference on Humanoid Robotics (Humanoids-2000), Cambridge, MA, Sep 7-8, 2000. This paper and additional papers are available from this page. More information on the robot used in this work is available from this page.


For Feb. 20: Motion Capture:

Zoran Popovic and Andy Witkin, "Physically Based Motion Transformation," in Computer Graphics (SIGGRAPH) 1999.


For Feb. 27: Emotions:

We will talk about Kismet. I suggest we discuss both Breazeal, C. and Scassellati, B. (1999), "How to build robots that make friends and influence people," IROS99, Kyongju, Korea, and Breazeal (Ferrell), C. and Scassellati, B. (2000), "Infant-like Social Interactions Between a Robot and a Human Caretaker," to appear in a special issue of Adaptive Behavior on Simulation Models of Social Agents, guest editor Kerstin Dautenhahn. Both papers are available from this page, and there are some relevant abstracts on this page as well.

We will also look at the WAMOEBA Project as a case study. The paper we will read is Emotional Communication Robot: WAMOEBA-2R -- Emotion Model and Evaluation Experiments, Ogata and Sugano, Humanoids 2000


For Mar. 6: Motion Capture:

Playter: Physics-Based Simulations of Running using Motion Capture

Anderson & Pandy: A Dynamic Optimization Solution for Vertical Jumping in Three Dimensions.


For Mar. 13: Faces:

Cohen, M.M., Beskow, J., and Massaro, D.W. (1998). Recent developments in facial animation: An inside view. AVSP '98 (Dec 4-6, 1998, Sydney, Australia).

A handy overview

A handy overview of robot heads

Hara and Kobayashi, movie?

iRobot baby face

Software: Baldi, FaceWorks. Book: Computer Facial Animation, Parke and Waters.


For Mar. 20: Recognizing Human Activity:

Take a look at The Visual Analysis of Human Movement: A Survey, D. M. Gavrila, Computer Vision and Image Understanding, Vol. 73, No. 1, Jan 1999, pp. 82-98. (Search by Author or title on this website)

You can also read Human Motion Analysis: A Review, J. K. Aggarwal, Q. Cai, Computer Vision and Image Understanding, Vol. 73, No. 3, Mar 1999, pp. 428-440. (Search by Author or title on this website)

For an in-depth look at one approach, you can read: Model-Based Estimation of 3D Human Motion, Kakadiaris, I. and Metaxas, D., IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 22, Issue 12, December 2000, pp. 1453-1459. Available from IEEE Xplore.


For Apr. 3: Interaction:

Todd will present:
PRoP: Personal Roving Presence, Eric Paulos and John Canny.
This paper and related papers are available from this link

Chris will present:
Gordon Cheng, Akihiko Nagakubo, Yasuo Kuniyoshi "Continuous Humanoid Interaction: An Integrated Perspective - Gaining Adaptivity, Redundancy, Flexibility - In One" Proc. of First IEEE-RAS International Conference on Humanoid Robots, Boston, USA, Sept 7-8, 2000
This paper and related papers are available from this link


For Apr. 10: Conversation:

Chatterbots
Take a look at www.simonlaven.com. Click on the "Chatterbot Papers" box, and then on the "AIML and Alice" entry. Unfortunately, the official Alice site seems to be down.

Synthetic Interview:
Overview (Word format)
How it works (Word format)
How to make one (Word format)


For Apr. 17: Multi-modal interaction:

Soshi Iba will present "Robot Programming Through Multi-modal Interaction"

A copy of the thesis proposal is available at: http://www.cs.cmu.edu/~iba/research/proposal.pdf

As robots enter the human environment and come in contact with inexperienced users, they need to be able to interact with users in a multi-modal fashion; the keyboard and mouse are no longer acceptable as the only input modalities. Humans should be able to communicate with robots using methods as similar as possible to the concise, rich, and diverse means they use to communicate with one another.

The goal of the proposed thesis is a comprehensive multi-modal human-machine interface that allows non-experts to conveniently compose robot programs. Two key characteristics of this novel programming approach are that the system can interpret the user's intent, and that the user can provide feedback interactively, at any time. The proposed framework takes a three-step approach to the problem: multi-modal recognition, intention interpretation, and prioritized task execution. The multi-modal recognition module translates hand gestures and spontaneous speech into a structured symbolic data stream without abstracting away the user's intent. The intention interpretation module selects the appropriate primitives based on the user input, current state, and robot sensor data. Finally, the prioritized task execution module selects and executes primitives based on current state, sensor input, and the task given by the previous step. Depending on the mode of operation, the system can provide interactive robot control, adjustment and creation of primitives, and composition of robot programs. The proposed research is expected to improve significantly the state-of-the-art in industrial robot programming and interactive personal robotics.
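The abstract above describes a three-stage architecture: multi-modal recognition, intention interpretation, and prioritized task execution. As a rough illustration of how such a pipeline could be organized, here is a minimal Python sketch; all class names, symbol labels, and priorities below are invented for illustration and are not taken from the thesis proposal.

# Illustrative sketch only: a minimal pipeline mirroring the three-step
# structure described in the abstract above. Every name here is hypothetical.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Symbol:
    """A structured symbolic token produced from gesture or speech input."""
    modality: str    # e.g. "gesture" or "speech"
    label: str       # e.g. "point", "stop"
    confidence: float


@dataclass
class Primitive:
    """A robot motion primitive with a priority for task execution."""
    name: str
    priority: int


def recognize(raw_gesture: Optional[str], raw_speech: Optional[str]) -> List[Symbol]:
    """Multi-modal recognition: turn raw input into a symbolic data stream."""
    symbols = []
    if raw_gesture is not None:
        symbols.append(Symbol("gesture", raw_gesture, 0.8))
    if raw_speech is not None:
        symbols.append(Symbol("speech", raw_speech, 0.9))
    return symbols


def interpret(symbols: List[Symbol], robot_state: dict) -> List[Primitive]:
    """Intention interpretation: map symbols and state to candidate primitives."""
    primitives = []
    for s in symbols:
        if s.label == "stop":
            primitives.append(Primitive("halt", priority=10))
        elif s.label == "point":
            primitives.append(Primitive("move_to_target", priority=5))
    return primitives


def execute(primitives: List[Primitive]) -> None:
    """Prioritized task execution: run higher-priority primitives first."""
    for p in sorted(primitives, key=lambda p: p.priority, reverse=True):
        print(f"executing {p.name}")


# Example: a pointing gesture combined with the spoken word "stop";
# the higher-priority "halt" primitive preempts the pointing motion.
execute(interpret(recognize("point", "stop"), robot_state={}))

One appeal of this structure is that the symbolic stream acts as a clean interface between recognition and interpretation, so each stage can be developed and tested on its own.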


For Apr. 24: Reinforcement Learning:

Hierarchical decomposition and min-max strategy for fast and robust reinforcement learning in a real environment

Jun Morimoto

We propose a method for applying reinforcement learning to a high-dimensional state space by introducing a hierarchical architecture into the learning problem. We apply the proposed method to a stand-up task with a 3-link, 2-joint robot: the robot learns to stand up from a lying state to an upright state through trial and error. We also propose a method to realize a robust policy using reinforcement learning; the idea originally comes from H-infinity control theory. We show the robustness of the policy acquired by our proposed method on a single-pendulum swing-up task.

More information is available from www.erato.atr.co.jp/~xmorimo
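For readers unfamiliar with hierarchical reinforcement learning, the following generic Python sketch shows the two-level idea from the abstract: an upper layer chooses subgoals in a coarse state space, and a lower layer learns primitive actions to reach each subgoal. This is not Morimoto's implementation; the environment hooks, subgoal and action names, and reward handling are placeholders chosen for illustration.

# Generic hierarchical Q-learning sketch (not the method from the paper).
import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1


def epsilon_greedy(q, state, actions):
    """Pick a random action with probability EPSILON, else the greedy one."""
    if random.random() < EPSILON:
        return random.choice(actions)
    return max(actions, key=lambda a: q[(state, a)])


def q_update(q, state, action, reward, next_state, actions):
    """Standard one-step Q-learning update."""
    best_next = max(q[(next_state, a)] for a in actions)
    q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])


# Upper level: Q-values over coarse states and subgoals.
# Lower level: Q-values over fine states (paired with the active subgoal)
# and primitive actions.
upper_q = defaultdict(float)
lower_q = defaultdict(float)

SUBGOALS = ["crouch", "rock_forward", "upright"]           # placeholder subgoals
ACTIONS = ["torque_minus", "torque_zero", "torque_plus"]   # placeholder actions


def run_episode(env):
    """env is assumed to expose coarse_state(), fine_state(), step(action),
    reached(subgoal), and done(); these are hypothetical hooks."""
    while not env.done():
        coarse = env.coarse_state()
        subgoal = epsilon_greedy(upper_q, coarse, SUBGOALS)
        upper_reward = 0.0
        while not env.reached(subgoal) and not env.done():
            fine = (env.fine_state(), subgoal)
            action = epsilon_greedy(lower_q, fine, ACTIONS)
            upper_reward += env.step(action)
            # The lower level is rewarded only when it attains the active subgoal.
            sub_reward = 1.0 if env.reached(subgoal) else 0.0
            q_update(lower_q, fine, action, sub_reward,
                     (env.fine_state(), subgoal), ACTIONS)
        q_update(upper_q, coarse, subgoal, upper_reward,
                 env.coarse_state(), SUBGOALS)

The coarse upper-level state space is what keeps learning tractable in high dimensions; the lower level only ever has to solve short-horizon subproblems.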


Syllabus



Instructors:

Chris Atkeson
The best way to contact me is to use email:
cga@cmu.edu

Jessica Hodgins
The best way to contact me is to use email:
jkh@cs.cmu.edu