The NSF AI Institute for Adult Learning and Online Education (AI-ALOE)

We spoke with Chris Dede, AI-ALOE Co-Principal Investigator and Senior Research Fellow at the Harvard Graduate School of Education, and summarize below our conversation with him about the vision for the recently funded NSF AI Institute for Adult Learning and Online Education (AI-ALOE) (NSF Award #2112532).

PI Team: Myk Garn, Principal Investigator, University System of Georgia; Ashok Goel, Director and Co-Principal Investigator, Georgia Institute of Technology; Scott Crossley, Co-Principal Investigator, Georgia State University; Alexander Endert, Co-Principal Investigator, Georgia Institute of Technology; and Chris Dede, Co-Principal Investigator, Harvard Graduate School of Education

What is the big idea of your project?

The NSF AI Institute for Adult Learning and Online Education (AI-ALOE) aims to advance the workforce through higher education, continuing education, and adult education programs. Central to our work on personalized learning technology are research and data collection that let us explore new AI theories, techniques, and models of lifelong learning while we evaluate effectiveness, track results, and study outcomes. No two adult learners (age 24 and up) process information the same way, and different learners may find different techniques or styles useful, which is why the work of AI-ALOE is crucial. Our work will create AI teaching assistants to support adult learners.

Personally, I (Dede) have long been interested in adaptive learning, not just in the sense of an intelligent tutoring system, but personalized so that learners have voice and choice, and that choice can include learning alone or learning in a social setting with other learners and teachers. Voice and choice can be very broad, and we need to leverage learning analytics to really personalize online learning. As mentioned before, the institute is thinking about teaching assistants, and since we are in the adult learning space, where self-directed learning occurs, a teaching assistant could become an assistant to the self-directed learner.

Another component the Institute is exploring is mutual theory of mind, led by the Georgia Tech Design & Intelligence Lab. Humans have the ability to develop a model of what another person might be thinking given what they know about that person. A person might adjust their interactions with another to help a situation work more smoothly, but how does an AI system know how to work with people? Increased understanding in this area is fundamental to creating the specific tools the institute wants to develop.

Tell us a little bit about your partnerships.

This research initiative involves a cross-sector, cross-disciplinary collaboration in order to represent the many different types of adult learners and the kinds of learning they need to do. Currently, our team is actively engaging with learners and instructors at Georgia Tech, Georgia State University, and the Technical College System of Georgia.

The institute has a powerful partnership with the Technical College System of Georgia, an advanced vocational and technical institution, which helps us understand not only who adult learners are but also what they need. The system enrolls many students in our target population, many of whom have been marginalized over the years, and we can learn what they need to do and how new tools can support what they need and want to learn. Our partners are taking existing courses and applying AI tools to the current content of their curriculum.

Furthermore, these academic institutions are not only the homes of the researchers involved but also sites where evaluation of the work can take place. Our industry partners will also serve as test beds: IBM and Boeing employ adult learners, and publishing companies like Wiley design products to help adult learners. We look forward to learning from and collaborating with our current and future partners. Through these collaborative partnerships, we aspire to avoid reinventing the wheel and to address the different facets of motivation and the learning platforms available to adults.

We would also encourage the other NSF AI Institutes, the AI in Education community, other related research groups, and potential partners to learn more about using AI in different situations. Making connections between the work we are all doing with AI systems will help the field grow.

List of partners:

Non-profit organizations

  • Georgia Research Alliance
  • 1EdTech

Academic

  • Georgia Institute of Technology
  • Georgia State University
  • Technical College System of Georgia
  • Harvard University
  • University of North Carolina at Greensboro
  • Arizona State University
  • Vanderbilt University

Industrial

  • Boeing
  • IBM
  • Wiley

*Accenture, a multinational consulting company, has partnered with the NSF to fund the Institute.

What do you envision is the impact your project will have on teaching and learning with AI?

The institute is focused not only on improving AI in education but also on doing things better, such as emphasizing guided learning by doing with less emphasis on teaching by telling; we need to empower self-directed learning. We are learning a great deal from Dr. Ruth Kanfer’s work on how adult learners change through life and what is lost, but also what is gained, as one gets older. We also believe that instead of teaching adults only a skill they need “tomorrow,” we should give them learning that continues throughout their entire career and life. We want to give adult learners an understanding of what they bring to a partnership with AI, and how important they are to humanizing decision-making in machine learning. In this way, the institute hopes to shift the discourse from “AI taking over” to one of human judgment, in which humans are doing the work. There is a lot of potential for AI to empower and augment human abilities, and that would have a huge impact on the future of teaching and learning. We are committed to an Ethics Plan that ensures all AI technologies developed by the ALOE Institute are designed in a way that takes the well-being of users, unintended users, and stakeholders into account.

Lastly, the core of our work comes not only from our partnerships, research, data, and programs but also from participatory design, which helps us design better systems. Privacy, security, and bias are all potential weaknesses of AI, so our thinking is that one way to mitigate those risks is very rich interaction with the people who will be giving consent for their data.

Thank you, Chris Dede, for speaking with CIRCLS. We look forward to hearing more from you and your colleagues as the Institute progresses.