
Combining Smartphone Light Detection and Ranging with Augmented Reality to Enhance Position-Based Teaching and Learning in STEM (NSF Award #2114586)

Principal Investigator: Colleen Megowan-Romanowicz
Co-Principal Investigators: Mina Johnson-Glenberg, Rebecca Vieyra, Chrystian Vieyra, Daniel O’Brien
Organization: American Modeling Teachers Association
Understanding how to measure, display, and interpret motion is important for many STEM-related careers, particularly in the physical and data sciences. Educational researchers have advocated numerous approaches to support sense-making with mathematical models of motion, but teachers often struggle to enact them due to limited resources. This project will make high-precision position sensing available to anyone who owns a smartphone by building on light-based mobile sensors (LiDAR) that can detect a user’s distance from objects and location within a space. The educational research will measure the effect of this new technology on student learning and engagement with mathematical models of motion graphs, and the project will produce a classroom-ready application and gamified lessons for teachers and students to use in traditional classrooms as well as at home.

Researchers and educational software developers will create new data visualization technology based on the scanning LiDAR in iOS devices and the time-of-flight depth imaging in Android devices. The proposed innovation will use the back-facing infrared beam array to significantly increase the precision of position measurements and of the placement of augmented reality (AR) visualizations based on users’ movements and environmental data. The project will determine the extent to which LiDAR-aided AR technology can enable high-precision, position-based, real-time data visualization, and will explore how it can provide the cognitive scaffolding and embodied experiences needed to advance the teaching of motion modeling with graphs and vectors. Research in the learning sciences will entail a collaboration with STEM educators to develop and test the effectiveness of exploration scenarios in traditional and remote learning contexts, assessing how full-body movement supports sense-making of motion graphs, with a focus on embodied learning and data visualization literacy.
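Both sensing approaches mentioned above rest on the same time-of-flight principle: the sensor emits an infrared pulse and measures how long it takes to reflect off a surface, and distance follows from the speed of light. The sketch below is a minimal illustration of that principle only, not code from the project's application or from any vendor API:

```python
# Time-of-flight ranging: an emitted infrared pulse reflects off a surface,
# and the measured round-trip time gives the distance via the speed of light.

C = 299_792_458.0  # speed of light in m/s


def tof_distance(round_trip_time_s: float) -> float:
    """Distance (in meters) to a surface, from the round-trip pulse time."""
    # The pulse travels to the surface and back, so divide the path by two.
    return C * round_trip_time_s / 2.0


# A surface 3 m away returns the pulse in roughly 20 nanoseconds:
t = 2 * 3.0 / C
print(round(tof_distance(t), 6))  # 3.0
```

Because the relevant round-trip times are tens of nanoseconds, the precision of position measurements is limited by how finely the sensor can resolve time, which is why dedicated depth hardware is needed rather than an ordinary camera.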

This award reflects NSF’s statutory mission and has been deemed worthy of support through evaluation using the Foundation’s intellectual merit and broader impacts review criteria.
