Principal Investigator: Shuchisnigdha Deb
Co-Principal Investigator(s): Yiran Yang, Amanda Olsen
Organization: University of Texas at Arlington
NSF Award Information: Enhancing Active Learning in Additive Manufacturing Using a Bilingual, Assisted Virtual-Reality Platform
Technology is integrated into every aspect of today’s world, so the education system must train students to use it effectively. Virtual reality is one technology that can be incorporated into the curriculum as an instructional delivery system, an instrument to enhance the learning process, and a tool for evaluation. This project focuses on additive manufacturing and will leverage emerging technologies by using virtual reality to develop a bilingual (English/Spanish) immersive learning environment for engineering students. All students, including students with disabilities, will have access to the cutting-edge learning modules within virtual environments. Additive manufacturing technologies will play a critical role in future manufacturing; however, they pose significant safety hazards to workers and the environment. Workers who interact with these technologies, directly or indirectly, must therefore be professionally trained with hands-on experience to gain the specific certified skills required. The learning platform developed during this project will transform traditional approaches to learning and teaching and improve engineering education in additive manufacturing.
The proposed project includes four distinct activities: (i) development of a virtual-reality learning platform, (ii) design of course modules, (iii) development of software to track students’ interactions, and (iv) deployment of the developed software in the target courses. The learning platform will be a virtual additive manufacturing lab equipped with different types of 3D printers, a computer workstation, a hand-tool station, and a personal-protective-equipment station. Students will be assigned operational or safety training projects based on the printer chosen, with instructions to complete tasks in sequence. After the learning platform and course modules are developed, a pilot study will collect data on student interactions with the course modules. Students’ facial expressions and eye movements will be recorded in real time along with their interactions with the learning platform and course materials. Data from this study will inform a model for designing assistive functionalities within the virtual platform. The model will generate useful feedback, offer additional instructions, question students’ selections, and provide assistance in place of direct instructor supervision. The project’s outcome will be a novel bilingual learning and teaching platform with real-time assistance that significantly enhances student engagement and performance in active learning.
This award reflects NSF’s statutory mission and has been deemed worthy of support through evaluation using the Foundation’s intellectual merit and broader impacts review criteria.