AI Isn’t a Ticket to an A

I teach a freshman seminar on kids and society. For their first project, my students create an infographic that aims to clarify a common misconception about “kids these days.” For reasons I’ll outline in more detail below, I strongly prefer that my students not use AI (artificial intelligence) tools to create their projects. So, I made my own infographic explaining why. And I spend a class period showing them how I made it with very basic tools (in this case, PowerPoint) and without using AI.
Here’s a PDF of the infographic that you’re welcome to use/share/adapt for your students: AI Isn’t a Ticket to an A
As for why I’d prefer that students not use AI tools to produce assignments for my class, here’s the more detailed explanation that I include in my syllabus:
It can be tempting for students to turn to large language models (LLMs) and generative “artificial intelligence” (“AI”) bots—like ChatGPT or Google Gemini—for assistance with coursework. However, for reasons I will outline below, I strongly discourage you from using these tools to complete assignments in this class.
1. **Generative “AI” chatbots produce work that is, at best, mediocre and, at worst, blatantly wrong.** These chatbots are trained on information scraped from across the internet—from online books and articles to social media posts and comments on online forums. What they produce is generally not vetted for quality, accuracy, or potential bias. As a result, AI-generated information is frequently riddled with blatant factual errors, fabricated references, plagiarized passages, and messages that reinforce stereotypes and biases against stigmatized and marginalized groups. Moreover, AI-generated writing tends to lack the creativity and critical thinking that professors typically expect from college students.
So, before using these tools for your assignments, ask yourself: _Am I a better or worse writer than the average person on the internet?_ And: _Do I know more about this topic than the average source online?_
If you’re better than average, using AI will probably produce an assignment that is lower in quality and accuracy than what you could have produced on your own. And if you’re not better than average, then the best you’ll likely produce with AI is a mediocre assignment. And in that case, you’re probably better off getting writing help from a human, such as by visiting the Writing Center for support (https://writing.wisc.edu/) or by making an appointment to chat with me during office hours.
2. **Relying on generative “AI” chatbots can weaken your critical thinking skills and even harm your mental health.** Before smartphones, people used maps and compasses to navigate when driving, and they regularly memorized long lists of phone numbers for family members and friends. Today, our smartphones—with their GPS technologies and contact lists—can do those things for us. And so, many of us no longer develop or practice those skills. That can put us in serious danger if we find ourselves lost or in trouble in a place where our data coverage doesn’t reach.
In a similar way, generative “AI” chatbots endanger our critical thinking skills by offering to carry some of the cognitive burden that we face in our day-to-day lives. Research shows, for example, that people—and especially young people—who use AI more frequently are less able to: 1) detect misleading information, 2) evaluate the strength of an argument, 3) determine whether information confirms or refutes a hypothesis, 4) correctly estimate how common something is, and 5) use information effectively to solve problems and make sound decisions. By weakening people’s critical thinking skills, generative “AI” chatbots also appear to make users of these technologies more dependent on them over time. In the process, these technologies may also put people at risk of “AI Psychosis,” a condition that results when frequent prolonged use of generative “AI” chatbots reinforces and exacerbates delusional thinking.
3. **Generative “AI” chatbots exploit creators and environmental resources.** To train their generative “AI” chatbots, companies have relied heavily on copyrighted materials, which they have used without permission, effectively stealing them from writers, artists, and creators—including me. For example, all four of the books that I’ve written, along with most of my academic articles, are part of the LibGen database of pirated materials that Meta—the parent company of Facebook and Instagram—used to train its generative AI tools. The companies that produce AI tools also rely on underpaid labor performed by precarious workers in low-income countries around the world.
Meanwhile, generative “AI” chatbots are also damaging the environment. Because of the intensive computing involved, these technologies use huge amounts of energy. In 2024, for example, ChatGPT’s daily energy usage was equivalent to that of 180,000 US households. Growing use of these technologies is also projected to substantially increase demand for electricity, fresh water (which is used to prevent the technology from overheating), and rare elements and minerals (which are often unsustainably mined). Together, these trends threaten to exacerbate climate change and related environmental crises, with a disproportionate impact on already vulnerable and marginalized communities.
…
Despite these problems, I will not subject your assignments to “AI detection” software. Such software is prone to errors. It has known biases, including against those students whose first language is not English. And it is also being used to fuel a culture of surveillance and punishment that disproportionately impacts students from racially marginalized groups and neurodiverse students.
That said, and as I noted in point one above, please know that using LLMs and generative “AI” chatbots for class-related purposes may lead you to produce work that falls short of my expectations for students and may thereby negatively impact your grade.

http://www.jessicacalarco.com/teaching-resources-1/2025/9/25/ai-isnt-a-ticket-to-an-a