Gestural Communication
in Collaborative Physical Tasks

Susan R. Fussell, Jie Yang, Jane Siegel, Co-PIs

Human-Computer Interaction Institute, Carnegie Mellon University


As the workforce becomes increasingly distributed across space and time and increasingly mobile, the need to work with remote partners to accomplish shared tasks has risen substantially. Collaboration between two or more people is now supported by a wide variety of technological tools. Most systems to date, however, are designed to support group activities that can be performed without reference to the external spatial environment. Development of systems to support collaborative physical tasks, in which two or more individuals work together to perform actions on concrete objects in the three-dimensional world, has been slower. Such tasks play an important role in many domains, including education, design, industry, and medicine. For example, a remote expert might guide a worker through emergency repairs to an aircraft, a group of students might collaborate to build a science project, or a medical team might work together to save a patient's life. Because the expertise required to perform collaborative physical tasks is becoming increasingly distributed, there is a critical need for technologies to support their remote accomplishment.

Observational studies of physical collaboration suggest that people's speech and actions are intricately related to the position and dynamics of objects, other people, and ongoing activities in the environment. As they speak, people use several types of gestures that can clarify or enhance their messages. Pointing gestures are used to refer to task objects and locations. Representational gestures, such as hand shapes and hand movements, are used to convey the form of task objects and the nature of the actions to be performed on them, respectively. In face-to-face settings, people can make full use of both pointing and representational gestures because they share a physical environment that includes both task objects and other participants. Providing technological support for remote gesture is complicated by the different visual requirements of the two gesture types: pointing gestures require a view of the target object, whereas representational gestures require a view of the speaker's hands. In addition, existing technologies for remote pointing require users to control the pointer with devices such as mice or joysticks, which keeps their hands from performing other gestures.

The focus of our research is on the ways people communicate through speech and gesture as they perform collaborative physical tasks. Our research is guided by the assumption that the success of systems designed to support remote collaboration on physical tasks will be strongly influenced by the extent to which these systems enable people to communicate about their physical world with the same ease as when they are co-located.

The research has three broad aims:

  • to increase our understanding of the types of gestures used in conversations about physical objects in face-to-face settings and the impact of these gestures on communication and task performance;
  • to systematically evaluate the impact of alternative methods for implementing gesture in systems for remote collaboration on physical tasks; and
  • to develop and test a prototype system intended to enable remote communicators to produce and interpret both pointing and representational gestures as easily as they do in face-to-face settings.

A closely interrelated series of laboratory experiments, field studies, and technology development efforts is proposed to meet these aims. Taken together, these research activities should help improve the design of systems for remote collaboration on medical, educational, vehicle repair, and other physical tasks.


This work is funded by the National Science Foundation, grant #0208903. The opinions and findings expressed on this site are those of the investigators and do not necessarily reflect the views of the National Science Foundation.