Sidney D’Mello, University of Colorado at Boulder, shares more about his NSF AI Institute for Student-AI Teaming (#2019805).
Team: Sidney D’Mello (Principal Investigator), Martha Palmer (Co-Principal Investigator), Tamara Sumner (Co-Principal Investigator), Sadhana Puntambekar (Co-Principal Investigator), Peter Foltz (Executive Director)
The AI Institute for Student-AI Teaming (iSAT) will develop, deploy, and study AI Partners that interact naturally with students and teachers through speech, gesture, gaze, and facial expression in real-world classrooms. These AI Partners will observe, participate in, and support small groups of students as they engage in learning conversations, while assisting teachers in orchestrating effective collaborative learning experiences. The AI technology aims to help students develop STEM competencies and the 21st-century skills of collaborative problem-solving and critical thinking. The focal content domain of the AI-enabled curricula will be AI literacy, with an emphasis on supporting teachers in integrating AI education within existing STEM and literacy standards. Critically, iSAT engages diverse stakeholders—researchers, students, parents, and community leaders—in the co-design of ethical and equitable AI technologies.
What inspired this institute?
The AI in education (AIEd) and computer-supported collaborative learning (CSCL) communities have similar aims but different foci. In contrast to CSCL's collaborative learning model, the AIEd community has mostly followed the intelligent tutoring and personalization models. We also noticed that many excellent technologies from the AIEd space weren't making it into collaborative learning work, and that advanced AI wasn't being fully leveraged within the collaborative learning space. Therefore, when thinking through our institute, we asked ourselves what we could do to reimagine AI in education within the collaborative learning space.
Another inspiration for the Institute was understanding how people communicate and interact. We wanted to shift our focus from one-on-one, unimodal interactions to studying multimodal, multi-party, multi-curricular interactions by leveraging research on how people coordinate in small groups. When thinking about our vision of AI in Ed, we wanted to go beyond a one-on-one, silent, and mundane human-computer interaction focused on enhancing domain knowledge alone. Instead, we were excited by a vision of the classroom as an amazingly interactive and engaging space where people communicate with each other and experience the camaraderie of collaborative learning. But it's hard to achieve that with a single teacher, who can't be omnipresent to orchestrate effective collaborative learning. This led us to think about how AI can support teachers, and ultimately to our Institute's idea of an AI Partner that listens in to, facilitates, and participates in these small group conversations to help students: 1) have more productive discussions, 2) collaboratively solve problems together, and 3) actually have fun and be engaged in the learning experience. To do that, you need to support multimodal (much of speech is grounded in gesture), multi-party (multiple people talking), and multi-curricular collaborative discourse. Doing this in a noisy classroom environment—with multiparty chatter, people moving around, technology considerations, and more—poses both technological challenges and opportunities.
How will your institute impact students?
Collaboration is a critical skill for the future workforce. Like other skills, collaboration must be learned and developed. Therefore, one focus of the Institute is on how to get teams to work together and how to intentionally help students develop collaborative skills, and more specifically, collaborative problem-solving. The Institute supports collaboration not only as a means to achieve learning outcomes but also as an end in itself: developing collaborative skills, widely recognized as a critical 21st-century competency.
This project will have a direct impact through its AI pathways for students. AI is all around us, and this project will provide students with deep AI literacy—an understanding of the power and risks of AI, as well as students' own sense of power through knowledge. This will also leave them better equipped to thrive in the future AI-driven workforce. To achieve this, the project uses a pipeline model, engaging students in highly diverse districts beginning in middle school through the co-design of curricula and AI technology, immersive learning experiences, and research internships.
Lastly, the Institute is intentionally developing low-cost curricular materials and technology and will make anything developed through the Institute widely available to students nationwide. All our technology is open-source for non-commercial use.
Tell us a little bit about your partnerships.
Interdisciplinarity and deep expertise at the intersections of foundational AI, learning sciences, and team science are necessary for achieving the Institute's goals and developing the AI Partner. Nine universities are working on this Institute, spanning 15 research areas ranging from computational linguistics and distributed cognition to curriculum development in the classroom. When we formed our team, we asked ourselves whom we had loved working with in the past. Core project team members then brought in collaborators they were familiar with. This familiarity was helpful for the team, especially during COVID, when work was completely remote.
One of the most exciting things has been examining how you get these folks to talk together and even develop shared value systems. How do you get somebody who primarily works with curated datasets to work with humans in the field? We're experimenting with a variety of mechanisms—such as design sprints, data jams, and conjecture mapping workshops—as concrete ways for people to do this together. Additionally, bringing everyone together in person to work collaboratively has been critical to designing and developing the curriculum and technology.
Educator and Student Partnerships
We were able to utilize an existing research-practice partnership (RPP) with Denver Public Schools (DPS) and a new partnership with St. Vrain Valley School District (SVVSD). These relationships have been nurtured and cultivated by team members through other current and prior NSF awards. We also strategically partnered with the inquiryHub, a project that helps develop curriculum promoting inquiry and computational thinking. We weren't able to collect data in schools during our first year (2020) because of COVID, but through our strategic partnership, we were able to leverage an existing curriculum unit called Sensor Immersion that was being implemented by DPS teachers and reached over 1,000 students. Fortunately, in 2021, we worked with teachers over the summer to co-design a new curriculum unit on AI in games and conducted professional learning sessions around the implementation of a version of Sensor Immersion enhanced for collaborative learning. Then, throughout the 2021 school year, more than 20 iSAT staff, students, and volunteers supported teachers in implementing the curriculum while collecting hundreds of hours of classroom audiovisual data from about 1,350 students. This work was completed during the ongoing pandemic, and we owe our success to these valiant efforts.
The Institute also works with and collects input from teachers on the design of the AI technology through our Teacher Advisory Board and on our AI-enabled curriculum through intensive week-long co-design sessions, which also include students.
Critically, student voice plays a central role in our work through the Learning Futures Workshops, where youth are engaged to adopt an expansive vision for AI in classrooms both within and beyond the grammar of schooling. A few key things we've learned from youth in these workshops: 1) youth have a need and a desire for affirming interactions with AI; 2) they prefer that AI help them form strong relationships with their peers rather than serve as a learning guide; 3) in terms of data privacy, they would like the ability to turn the AI off and to negotiate what information is shared with the teacher; 4) they're willing to trade off the information they share for features they value. This input is informing the design of the AI Partners, resulting in novel designs including a Community Builder (CoBi), which helps students and teachers develop and adhere to mutually agreed collaboration agreements (e.g., respect, equity, advancing learning); a collaborative co-pilot, which supports individual student groups; and an augmenter, which aggregates and distills information from the groups to support teacher orchestration. We plan to test prototypes of these technologies in the 2022/2023 school year.