
Collaborative Research: Teaching Human Motion Tasks at Population Scale: 1822819

Principal Investigator: Devin Balkcom
Co-Principal Investigator(s): Xia Zhou, David Kraemer
Organization: Dartmouth College

Abstract:
The project will develop technology and study methods for teaching motion tasks, with the teaching of sign language as a first application. Simultaneous placement or quick movement of parts of the body is hard to observe, explain, and execute. While tactile-sensing and augmented-reality systems have been developed to enable machine-human communication of physical processes, the focus has largely been on execution, rather than on teaching and learning. The initial focus of the project will be on teaching sign language, but principles and techniques discovered will be generalized to study the learning of increasingly complex physical motions, ranging from simple posing tasks to high-speed fine manipulation tasks. The proposed work is transformative in that it will directly address the scientific question of how to use technology to understand correct or incorrect human motion and provide constructive guidance, leading to a better understanding of human motion learning. The project will disseminate findings and resources through traditional scientific publications. In addition, models, algorithms, and designs for rapidly-prototyped tools for manipulation will be made available online. Results will also be communicated broadly through collaborations with local high schools and museums, and through participation in events such as the USA Science and Engineering Festival.

The task of teaching motion motivates the research of three fundamental challenges. First, closed-loop control is a core feature of cyber-physical systems. With a human participant in the system, how can the loop be closed around slow and low-bandwidth human attention? Actuation that guides the human must be easily communicated and sufficient to stabilize the human-suit system. Second, due to limitations in how much information may be communicated, complex human motions must be broken down, and components taught in isolation. How can these component motions be discovered, taught, and re-integrated? Third, algorithms and systems must be developed to measure the accuracy and retention of the learner during the teaching process, guiding repetition and selection of practice material. The project will design and build a lightweight sensing and guidance system that allows interactive communication about motion between human and computer. This technology will allow the investigators to address fundamental research questions in cyber-learning about how to better teach and learn human motion tasks. Research questions include how to measure and evaluate human motion with respect to the task, how to select sensory input to use as guidance, and how to selectively apply or remove training aids, until the learner can complete the motion task with no assistance.
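As a hedged illustration of the evaluation question raised above, the Python sketch below compares a learner's recorded joint-angle trajectories against a reference demonstration using dynamic time warping, and flags joints whose error exceeds a threshold as candidates for guidance. The joint names, trajectories, threshold, and the choice of dynamic time warping are illustrative assumptions for a minimal prototype, not the project's actual method.

# Illustrative sketch only: one plausible way to score a learner's recorded
# joint trajectories against a reference demonstration. Joint names, data,
# and the 0.15 threshold are assumptions made for this example.

import numpy as np

def dtw_distance(ref: np.ndarray, obs: np.ndarray) -> float:
    """Dynamic-time-warping distance between two 1-D trajectories, so a
    slower or faster learner is not penalized for timing differences alone."""
    n, m = len(ref), len(obs)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(ref[i - 1] - obs[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m] / (n + m)  # normalize by an upper bound on path length

def score_joints(reference: dict, observed: dict) -> dict:
    """Per-joint error scores between reference and observed trajectories."""
    return {joint: dtw_distance(reference[joint], observed[joint])
            for joint in reference if joint in observed}

if __name__ == "__main__":
    t = np.linspace(0, 1, 100)
    # Hypothetical joint-angle trajectories (radians) for a single sign.
    reference = {"wrist_flex": np.sin(2 * np.pi * t),
                 "index_mcp": 0.5 * np.cos(2 * np.pi * t)}
    observed = {"wrist_flex": np.sin(2 * np.pi * t * 0.9),       # slightly slow
                "index_mcp": 0.5 * np.cos(2 * np.pi * t) + 0.3}  # offset error
    for joint, err in score_joints(reference, observed).items():
        flag = "candidate for guidance" if err > 0.15 else "ok"
        print(f"{joint}: error={err:.3f} ({flag})")

In such a scheme, joints scoring below the threshold would be left unassisted while high-error joints receive guidance cues, echoing the goal of selectively applying or removing training aids until the learner completes the motion with no assistance.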

This award reflects NSF’s statutory mission and has been deemed worthy of support through evaluation using the Foundation’s intellectual merit and broader impacts review criteria.
