AI Literacy Happens at SCI: New Course Offers Foundations for the Informed Use of AI

December 15, 2025

"People often assume artificial intelligence (AI) refers only to tools like ChatGPT or other generative technologies. That is simply not the case. AI is underlying many of the technologies we use every day," said Angela Stewart, an assistant professor with SCI's Department of Informatics and Networked Systems (DINS).

Assistant Professor Angela Stewart

As AI becomes more pervasive in everyday tools, DINS is introducing a new course in the spring 2026 semester. "AI Literacy: Foundations for Critical Thinking and Informed Use," developed by Stewart and DINS department chair Daqing He, explores AI's diverse applications in modern society and aims to cultivate students' ethical and analytical understanding through discussion-based learning.

According to He, the rapid growth of AI has created an urgent need for broader public understanding of how these systems shape daily life, decisions, and institutions.

"At DINS, our mission is to explore the intersections of information, networks, and human behavior, and AI sits squarely at that crossroad," he said. "We recognize that many students encounter AI constantly, yet often without a clear understanding of what it is, how it works, or what values and assumptions guide its use. This course reflects our belief that AI literacy is now a core competency for informed citizenship - not just a technical skill."

The course was born out of a need for students not only to understand the technology they interact with and the ways their data feeds these systems, but also to develop a "moral compass" for navigating a technology-saturated world. As explained by Stewart, who designed the curriculum and will be teaching the class, "I am not approaching this course from a pro-AI or anti-AI perspective. Instead, my goal is for students to learn about the real societal harms that these technologies can cause, while also seeing the power and usefulness of these technologies."

Through real-world examples and discussions about algorithmic impact, persuasive design, and students' own digital concerns, the course offers a clearer picture of how AI functions and why understanding these processes is essential to informed citizenship today.

"In one of my lectures, we are going to peel back the layers of what is happening on the TikTok 'for you' page," Explained Stewart. Using TikTok and other algorithmic applications as case studies, the class examines how platforms draw on large datasets to tailor content and influence user behavior.

Building on these explorations, Stewart's course adopts a value-based, discussion-driven approach that encourages students to reflect on what humans uniquely contribute, what AI systems do best, and how to strike a thoughtful balance between the two. Stewart and He invite students to consider how these distinctions shape their academic, personal, and professional lives, as well as broader societal outcomes.

"When you put values at the forefront, it allows students to now make decisions about what they deem to be helpful and harmful," said Stewart. "I can't prescribe that moral compass for them, but I can encourage their critical thinking to find that moral compass for themselves."

Unlike traditional AI-focused courses, AI Literacy does not center on building or programming new technologies. Instead, it examines AI through a people- and society-centered lens, focusing on its influence on work, education, creativity, and everyday decision-making. By confronting the ethical, social, and cultural questions raised by emerging technologies throughout the semester, students develop the critical awareness needed to navigate an increasingly algorithmic world.

Elizabeth Nielsen (A&S '27)