Considering Techquity in the Classroom

By Merijke Coenraad

Merijke Coenraad is a PhD Candidate in the Department of Teaching & Learning, Policy & Leadership in the College of Education at the University of Maryland. She is a former middle school teacher. Her research focuses on the intersections of educational technology and equity, including the creation of materials, platforms, and experiences in partnership with teachers and youth through participatory design methods.

Flashback to a Spanish Classroom (2016)

Chromebooks out. Hushed silence. Each student leans over their computer. Tension in the air. I yell, “GO!” and with one word, the room erupts as groups hurriedly work together to identify vocabulary words before their classmates. In loud whispers, students ask their partners for words: “Calcetines, who has socks?” One mistake and the group will have to start over; the stakes are high, and no star student can single-handedly win the game for their peers.

Quizlet transformed flashcards, a time-consuming (and often lost or forgotten) physical learning tool, into a digital learning experience. My students practiced their vocabulary words through drills and games all week, and on Fridays we played Quizlet Live.

When I was still in the classroom, I loved to bring new technology into my social studies and Spanish lessons. I got excited discovering tools like EdPuzzle and Padlet when they were first breaking onto the education stage. With 1:1 Chromebooks in my middle school classroom, there was hardly a class period when students were not somehow connected to technology, and each new tool meant creating a new account. Looking back, I realize that I was naïve while teaching. As I brought tool after tool to my students, I didn’t think deeply about the ramifications of data collection or the way that the very tools that could enhance learning might be treating my students inequitably and perpetuating the structural racism and human biases I worked each day to dismantle. The educational technology I brought into my classroom had positive effects, but it also had hidden consequences, most of which I may never know.

Four years after I left the classroom to begin my PhD, my work focuses on one thing: Techquity, or the intersection of technology and equity. This focus is driven by the students I taught and the many times I saw technology act as both an access point and a barrier to their education. Even though I wasn’t thinking about data collection, algorithmic bias, and the effects of AI for the students in my classroom, I was still focused on how technology helped and hindered my students’ education. But those barriers and hindrances go beyond the devices and internet access I have long considered. In the last year, I have learned a lot about forces within and around technology that cause inequities. I have learned about the Coded Gaze of AI technologies from Joy Buolamwini and the New Jim Code from Ruha Benjamin. I’ve learned about the biases inherent in the very design of technologies from Sara Wachter-Boettcher, and how algorithms can be Weapons of Math Destruction from Cathy O’Neil. This learning has led me to focus not only on how I can be more cognizant of the biases of technology, but also on how I can teach students about them.

Techquity: Co-designing with Kids

To learn more about what kids think about Techquity concerns, I partnered with a youth design team to hear what they had to say about Techquity and to learn which Techquity concerns interested them most. I find kids’ insight critical whenever I am exploring new topics to teach students. The team consisted of seven Black youth between the ages of 8 and 13 who met twice a week to design technologies and learn about being designers.

Let’s take a look at what the kids had to say about Techquity.

While they didn’t have the vocabulary to name algorithmic bias or biases in voice recognition technology, the kids quickly began offering examples of how technologies can be good and bad, and how even a single technology can have good and bad sides. For example, one group identified Siri as helpful because “she” can give information without typing, but they were also worried that Siri doesn’t always understand them and that “SIRI CAN LISTEN TO US!!!!” While the AI in their phones allowed the students to access all sorts of information, they were not immune to considerations of what it means for a device to always be listening for “Hey Siri…”

As our conversation turned and I introduced the kids to some common examples of Techquity concerns such as data collection, targeted advertising, misidentification by AI, and non-diverse tech design teams, the kids continued to describe their own examples. They could recollect times when they received targeted advertising based on location or a recent website visit.
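For teachers who want to make the mechanism behind those targeted ads concrete, here is a minimal sketch in Python. It is not how any real ad network works; the profile fields, ad inventory, and matching rule are all invented for illustration.

```python
# Hypothetical sketch of profile-based ad targeting. No real ad network
# works exactly like this; all names and data here are invented.

profile = {
    "recent_sites": ["sneaker-store.example", "homework-help.example"],
    "location": "Silver Spring, MD",
}

ads = [
    {"text": "Sale on running shoes!", "keyword": "sneaker"},
    {"text": "Tutoring near Silver Spring", "keyword": "Silver Spring"},
    {"text": "Discount airfare to Miami", "keyword": "travel"},
]

def pick_ads(profile, ads):
    """Return ads whose keyword matches anything the tracker collected."""
    collected = " ".join(profile["recent_sites"]) + " " + profile["location"]
    return [ad["text"] for ad in ads if ad["keyword"].lower() in collected.lower()]

print(pick_ads(profile, ads))
# ['Sale on running shoes!', 'Tutoring near Silver Spring']
```

Even this toy version shows why a single recent website visit or location reading is enough to change which ads a kid sees.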

Techquity Concerns

Among the common Techquity concerns we discussed were:

  • Algorithms (computer programs) don’t treat everyone fairly
  • Technology development teams are frequently not diverse
  • Alexa, Google Home, and Siri are always listening to me
  • I get personalized ads based on data companies collect about me
  • Technology is not always accessible for individuals with disabilities
  • Companies sell my data
  • Sensors and systems like Alexa, Google Home, and Siri get confused about how I look or what I say
  • People don’t understand how technology works
  • Machine learning and facial recognition aren’t trained well enough to recognize everyone

The kids each ranked these Techquity concerns from “very important to me” to “not very important to me.” The two most highly ranked ideas were algorithmic bias and non-diverse tech companies. The kids were especially concerned that individuals who looked like them were not represented on design teams even though they themselves were designers, and they wondered what this meant for the technologies being designed.

As their final design task, the kids designed ways to teach other kids about Techquity by drawing their ideas on an online platform mimicking paper and pencil. Interestingly, the kids didn’t want to move away from technology just because it could be biased; they wanted it to be created in more equitable ways and to be used to teach others. Their teaching ideas often included advanced algorithms and even AI. They designed scenarios using robots and adaptive software to let other kids experience obvious Techquity concerns and learn from those experiences. One girl, Persinna, explicitly described the three-member design team shown in her game as having two girls and one boy because “that is Techquity.” Kabede felt very strongly that data collection by tech companies was a big concern. He made connections to actual tools he knew, such as DuckDuckGo, a search engine that does not profile users and focuses on user privacy.

What I Would Consider Now If I Were Still a Teacher

I’d start from what these kids already know about Techquity and how algorithms and AI are affecting their lives, and build on that. I would educate students about the biases inherent in Google searches, which rank results not simply by the popularity of links, as is commonly assumed, but also based on user profiles and advertising. I would follow Kabede’s recommendation and have students use a search engine like DuckDuckGo to prevent tracking and allow for private searches. I would challenge students to think about where algorithms, AI, and technology design are already affecting their lives and how technologies might work better for some individuals than for others. We would talk about the sensors in automatic sinks, paper towel dispensers, and medical devices: how those sensors rely on reflected light and often work better for people with lighter skin. We would discuss Joy Buolamwini’s experiences and work, and talk about how machine learning training sets are often inadequate for identifying all people well and how this has direct consequences for the use of AI in policing and surveillance.
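To make the sensor and training-set examples tangible in a lesson, here is a minimal, invented simulation in Python. It is not how any real sensor or face-recognition system is built; every number is made up so that the effect of a skewed training set is easy to see.

```python
import random

random.seed(0)

# Invented numbers: a light-based sensor reads higher "reflectance" for
# lighter skin (group A) than for darker skin (group B).
def reading(group):
    base = 0.8 if group == "A" else 0.4
    return base + random.gauss(0, 0.1)

# A training set skewed toward group A: 90 samples from A, only 10 from B.
train = [reading("A") for _ in range(90)] + [reading("B") for _ in range(10)]

# "Calibrate" the sensor: pick a threshold that detects 95% of training readings.
threshold = sorted(train)[int(0.05 * len(train))]

# Test the calibrated sensor on fresh readings from each group.
for group in ("A", "B"):
    tests = [reading(group) for _ in range(1000)]
    detected = sum(r >= threshold for r in tests) / len(tests)
    print(f"Group {group}: detected {detected:.0%} of the time")
```

Because 90% of the training readings come from group A, the learned threshold sits where it almost never misses group A but fails for group B roughly half the time, the same pattern as a soap dispenser that ignores darker hands.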

While the students in my classroom wouldn’t be the ones causing the technology bias, I would make sure they were aware of it and of its direct implications for their lives. Most of all, I would ground these discussions in students’ lived experiences. Just like the kids on the design team, my students almost certainly experienced technology bias; they just might not have had words for it or known why it was happening. The more I could teach my students and bring Techquity concerns to their attention, the more they could protect themselves (and their communities) and make educated decisions about their lives with technology. I know that my middle school students wouldn’t give up their technology, and knowing about the biases held by the designers of that technology probably wouldn’t change their opinion that technology is, as Joshua said in the design session, “the best thing ever.” But knowing more about their digital footprint and how companies use their information would give them a small advantage. In this case, knowledge of Techquity concerns could give them power over their data and their technology use.
