
DIP: Collaborative Research: Interactive Science Through Technology Enhanced Play (iSTEP): 1628918

Principal Investigator: Joshua Danish
Organization: Indiana University

Abstract:
The iSTEP project addresses a basic research question by exploring the role of the body and physical activity in learning through the design of a new genre of developmentally appropriate learning technologies for young children. There is increasing recognition that the body plays a role in cognition: human beings, especially young children, understand complex concepts in part by relating them to how they move their own bodies. Extending these ideas, the iSTEP project also aims to develop teaching techniques and technological tools that can be used in real classrooms in the near future. Instead of supporting the learning of individual students, as many current technological advances attempt to do, the iSTEP project creates opportunities for entire classrooms of students to engage together in modeling scientific phenomena with their bodies. For instance, a classroom of students can use their own bodies to model how a state of matter such as liquid is made up of many moving particles, and this activity is technologically enhanced to improve learning. The iSTEP project builds upon the already successful STEP mixed reality platform (IIS-1323767) by adding new forms of interaction: the use of gestures and physical props to control the STEP computer simulation. Adding these new forms of interaction allows us to examine their role in supporting learning.

The existing open-source STEP platform uses commercially available vision-based sensors to track the motion of up to 12 children in an 8m x 8m space. The children simply walk into the space, are assigned an avatar (e.g., they become a water particle), and that avatar follows them as they move around the room. The children's avatars are then immersed in a virtual simulation that is programmed to mimic the scientific concept they are learning. In this case, the state of matter of water (solid, liquid, or gas) is determined by how fast the children move and the relative distance between them. This allows the students to discover the laws that govern state changes through their collaborative activity. Students can also use the PLAE interface (IIS-1522945) to annotate the simulation and create representations of their peers' activity, helping them all to reflect on the principles underlying the system. In iSTEP, students will also be able to control the simulation by gesturing, posing with their whole bodies, and manipulating physical objects, in addition to the existing mode of moving their whole bodies through the space. In addition, by using smartwatches, the project will explore alternative forms of feedback: students can feel vibrations, hear sounds, and even see simple images designed to help them explore the simulation. In the first round of experiments, the project will contrast the gesture-and-pose interface with a new interface that uses physical props to see how each contributes to student learning. In the final year, the two interfaces will be integrated to develop deeper insights into how they can best support the design of learning environments that build on mixed reality systems.
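To illustrate how a simulation of this kind might map children's movement onto states of matter, the Python sketch below classifies each tracked participant-particle from its speed and its average distance to the other particles. The thresholds, data structures, and function names here are hypothetical assumptions for illustration only; the abstract does not describe the actual rules or code used by the STEP platform.

import math
from dataclasses import dataclass

# Hypothetical thresholds (assumptions, not taken from the STEP platform).
SPEED_GAS = 1.5        # m/s above which a particle behaves like a gas
SPEED_SOLID = 0.3      # m/s below which a particle can "freeze"
SPACING_SOLID = 1.0    # m; particles must also be tightly packed to be solid

@dataclass
class Particle:
    x: float      # position in the 8m x 8m play space (metres)
    y: float
    speed: float  # current speed of the child controlling this avatar (m/s)

def mean_neighbor_distance(p: Particle, others: list[Particle]) -> float:
    """Average distance from p to every other tracked particle."""
    dists = [math.hypot(p.x - o.x, p.y - o.y) for o in others if o is not p]
    return sum(dists) / len(dists) if dists else float("inf")

def state_of_matter(p: Particle, others: list[Particle]) -> str:
    """Classify a particle's state from its speed and spacing."""
    spacing = mean_neighbor_distance(p, others)
    if p.speed >= SPEED_GAS:
        return "gas"
    if p.speed <= SPEED_SOLID and spacing <= SPACING_SOLID:
        return "solid"
    return "liquid"

# Example: three children moving slowly while standing close together.
children = [Particle(1.0, 1.0, 0.2), Particle(1.5, 1.2, 0.25), Particle(1.2, 1.8, 0.1)]
print([state_of_matter(c, children) for c in children])  # ['solid', 'solid', 'solid']

In this sketch, slow movement combined with tight spacing yields "solid", fast movement yields "gas", and everything in between yields "liquid", which mirrors the abstract's description of state being determined by how fast the children move and how far apart they are.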
