iSAT Curriculum Series: Games Unit

By Monica Ko

Monica is an Assistant Research Professor at the Institute of Cognitive Science at CU Boulder. At iSAT, she investigates how inclusive co-design processes can empower teachers and students with diverse identities to better understand how AI learning technologies can be used for good in their schools and communities.

At iSAT, part of who we are and what we do is dedicated to creating engaging STEM and science curriculum units that highlight the power of collaborative learning. We then incorporate our AI Partners to enrich these units further. We are excited to present a series of blog posts showcasing these curriculum units, starting with a spotlight on the Moderation unit.

What is the AI Moderation unit? 

The Moderation unit is a 2-3 week instructional unit that invites middle school students to figure out the sources of bias and racism that emerge within a video game and its gaming community, and to envision how humans and AI might work together to create new kinds of gaming communities. The unit focuses on Minecraft and opens with a story about a teenager who has vastly different experiences playing Minecraft on two different servers. After reading the teenager’s story, students generate questions about AI, game design, servers, and moderation. These questions are then organized on a question board that drives the direction of the unit. In the following lessons, students investigate the kinds of moderation rules that exist across different games and gaming communities, and they read about how humans and AI systems are used to moderate behavior. The curriculum encourages students to consider moderation strategies that not only ban “bad” behavior but also recognize positive behaviors. These experiences lead students to think about the ideologies that underlie these moderation systems, as well as the limits and affordances of AI-only, human + AI, and human-only approaches to moderation.
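To make that comparison concrete, here is a minimal, hypothetical sketch (in Python) of the kind of rule-based moderation students examine: one rule only flags “bad” behavior, while another also recognizes positive behavior. The phrase lists and chat messages are invented for illustration and are not part of the unit’s materials.

```python
# Hypothetical rule-based moderation sketch: flags "bad" behavior and,
# unlike ban-only systems, also recognizes positive community behavior.
BANNED_PHRASES = ["noob", "trash player", "get lost"]     # invented examples
POSITIVE_PHRASES = ["nice build", "good game", "thanks"]  # invented examples

def moderate(message: str) -> str:
    """Return a moderation decision for a single chat message."""
    text = message.lower()
    if any(phrase in text for phrase in BANNED_PHRASES):
        return "flag: possible harassment"
    if any(phrase in text for phrase in POSITIVE_PHRASES):
        return "praise: positive community behavior"
    return "allow"

if __name__ == "__main__":
    for msg in ["Nice build on that castle!",
                "You are such a noob",
                "Anyone have spare iron?"]:
        print(f"{msg!r} -> {moderate(msg)}")
```

A rule set like this makes the unit’s central tension visible: keyword rules miss context, so the same phrase can be friendly teasing on one server and harassment on another.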

Students also learn about sentiment analysis, a natural language processing (NLP) technique used to gauge a player’s affective state during gameplay. To better understand what this involves computationally, students build their own sentiment bots and discuss how their lived experiences and the volume of training data influence the models’ predictive power. Finally, students apply these ideas to their own gameplay by creating rules for moderation and enacting them during a multiplayer game of Minecraft. From this experience, they come to see how challenging the work of moderation is, how much context and interpretability matter in making these decisions, and how AI and humans can work together to create more just moderation systems.
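As a rough illustration of what a sentiment bot might look like under the hood, the sketch below trains a tiny bag-of-words classifier with scikit-learn. The training messages and labels are invented, and this is not the unit’s actual tool; the point is that with so little data, the model’s predictions hinge on which words happened to appear in training, which is exactly the limitation students discuss.

```python
# Hypothetical sentiment-bot sketch: a tiny text classifier whose
# predictions depend heavily on how much (and whose) labeled chat data
# it was trained on. All messages below are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_messages = [
    "nice build, that castle is awesome",    # positive
    "good game everyone, thanks",            # positive
    "you are so bad at this, just quit",     # negative
    "stop griefing my base, that is mean",   # negative
]
train_labels = ["positive", "positive", "negative", "negative"]

# Bag-of-words features + Naive Bayes: simple, and very sensitive to
# the exact vocabulary of the (tiny) training set.
sentiment_bot = make_pipeline(CountVectorizer(), MultinomialNB())
sentiment_bot.fit(train_messages, train_labels)

# Wording the model has never seen (slang, sarcasm, friendly teasing)
# is easy to misread -- which is why context and broader training data
# matter so much.
for msg in ["gg, nice one", "wow, great job breaking my build"]:
    print(msg, "->", sentiment_bot.predict([msg])[0])
```

Adding more and more varied labeled messages generally improves the bot’s predictions, which mirrors the classroom discussion about how the volume and diversity of training data shape a model’s predictive power.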

How it鈥檚 Helping Students in the Classroom 

Students are really excited to bring their expertise with video games and online communities into this unit! It is a hallmark feature of the unit and reflects iSAT’s design principle of both soliciting and building from students’ everyday knowledge. When students engage with the Moderation unit, they draw on their knowledge about servers, their positive and negative experiences of gameplay, and the importance of context and relationships in deciding whether players are joking or being insulting during gameplay. We know that students arrive with some ideas about AI and its role in online communities, and this unit deepens that knowledge by helping students understand how moderation works, how it influences human behavior, and what to pay attention to when developing these systems if our goal is to create more just futures.

One of the most powerful supports for this kind of learning is our AI Partner, CoBi! Students are introduced to CoBi, short for Community Builder, at the beginning of the lesson, and CoBi is used multiple times to monitor how they collaborate during the unit. In essence, students are not only learning about moderation, they are also experiencing AI moderation at the same time! This “meta” experience allows them to more deeply understand how AI systems work, their potential fallibility at various points of their inception, and how humans can become co-constructors (and not merely users) of more just moderation systems.

Furthering Our Mission of Developing AI Partners

The enactment of the Moderation unit addresses three of iSAT’s goals. First, it helps us better understand what and how students can learn about AI systems at the upper middle and lower school grade levels. There are currently no standards for AI learning in US classrooms, and this unit is one way for us to gather empirical data about what students are ready to explore on this topic. Second, the integration of CoBi into the Moderation unit directly supports the generalizability of the models that power CoBi’s analytical pipeline. Providing CoBi with training data across all of our Strand 3 units ensures that its impact is not specific to one curricular unit, but that it can be used more widely across classrooms enacting different content. Third, having students simultaneously investigate and experience moderation makes the learning come alive in classrooms! Embedding CoBi within the AI Moderation unit creates unique opportunities for students to analyze and critique where CoBi is doing well and where it needs additional support and training data. This positions students as partners who provide critical feedback on the development and refinement of our AI Partners.

 

Minecraft Games Unit