
EXP: Exploring augmented reality to improve learning by deaf children in planetariums: 1124548

Principal Investigator: Michael Jones
Co-Principal Investigator(s): Fred Mangrubang, Eric Hintz, Ron Proctor
Organization: Brigham Young University

Abstract:
This project is investigating the use of head-mounted augmented reality (AR) to improve learning outcomes for deaf and hard-of-hearing learners in situations that are logistically challenging for them, specifically presentations in which some scene or object must also be attended to visually. The work is being carried out in planetariums, where learners wear a monocle that displays a signer so that the learner can watch both the signed interpretation of the presentation and the scene of interest at the same time. The design of the technology, and the way it is used, is informed by the literature on cognitive load and on multimedia learning theory (Mayer, 2005). The results are applicable to a wide variety of logistically challenging situations for deaf and hard-of-hearing learners, including the kinds of informal learning venues that often excite the passions of hearing learners, and perhaps classrooms as well.

Presentations, even when a signer is available, are often logistically difficult for deaf and hard-of-hearing audiences to benefit from fully. Shifting attention back and forth between the interpreter and the objects or scenes being described makes it hard to follow a presentation and take away everything a hearing person would. This project aims to ameliorate this problem by designing technology that projects the interpreter's signs into the same field of view as the object or scene being discussed, and by learning how to use that technology well.
