AI Is Ruining My Education
I'm a university student in Ontario, and the shortcut culture is loud. It started as jokes and haikus in late 2022. Within weeks, friends were using ChatGPT to finish job applications in minutes. The message was clear: why wrestle with ideas when a bot can spit out something passable?
I tested it once for a history prompt. It produced 800 clean words in seconds: neat structure, confident tone, zero insight. No sources. No quotes. It looked like learning without the learning. That was enough for me to back away.
Meanwhile, AI footprints were everywhere. Em dashes in texts from people who never used them. Discussion posts opening with "Happy to help! Here are a few ideas," as if classmates were customer support. Bibliographies with academic-looking sources that didn't exist. The ghostwriter was easy to spot once you started looking, and hard to unsee.
What stung most wasn't the tech. It was the isolation. While others outsourced, I stared down the page. With a paper due that Tuesday, I turned down a cheap movie night to keep writing. "Just use ChatGPT," a friend said. I didn't. It felt like effort had become optional.
Weekly reflections for a political science course were supposed to push us. "What is freedom?" "What is democracy?" Instead, the board filled with the same generic answers, rephrased a dozen ways. The work became formatting, not thinking. That's the part that scares me.
Policies are catching up. Some instructors allow AI for brainstorming. Others ban it. Detection tools get tossed around, then students run their output through "humanizers," and everything gets scanned again. Software grading software while we pretend it's education.
There's another cost: overreliance dulls the effort that cements learning. Cognitive offloading is real: when a tool handles the retrieval and synthesis, the brain stops practicing them. AI turns the struggle into a paste-and-submit routine. The result is shallow recall and weaker judgment over time.
For educators: what's working right now
If your goal is real learning, not performative compliance, these practices help. They also reduce the loneliness felt by students who are still doing the work.
- Make work personal, local, and verifiable. Tie prompts to campus events, local data, or student-collected interviews. Require photos of notes, page snapshots, or timestamped fieldwork. AI can guess; it can't attend your town hall.
- Grade the process, not just the product. Collect proposal → outline → annotated bibliography → draft → reflection. Ask for version history or short voice notes explaining changes. Weight the reflection heavily.
- Spot-check sources in the open. Randomly choose two citations per student to verify in class. Require working links/DOIs and page numbers. Have students submit a 60-second clip explaining why each source is credible.
- Add short oral defenses. Five minutes. Two questions: what's your claim, and where did it change? The goal isn't interrogation; it's ownership.
- Use in-class writing sprints. Low-stakes, timed, with devices banned or locked to a restricted mode. Pair with post-class expansions so students build on their own voice.
- Be explicit about allowed vs. banned AI use. If brainstorming or editing tools are permitted, require a disclosure note: tool used, prompts, where it helped, where it failed. No disclosure, no credit.
- Stop leaning on detectors as truth. They miss plenty and flag genuine writing. If you need a primer on their limitations, Stanford HAI's overview "The challenge of detecting AI-generated text" is useful.
- Shift from take-home essays to studio formats. Workshops, peer critique, and public showcases push students to build work in community. It's harder to outsource identity.
- Reduce busywork and tighten feedback loops. Students dodge when the work feels pointless. Clear goals, quick feedback, and fewer, better assignments beat volume.
Teach AI literacy without hollowing out thinking
Students don't need scare tactics. They need clarity. Where does AI help, and where does it hurt?
- Show the trade-offs. Ask students to complete a task with and without AI, then compare reasoning depth, source quality, and what they actually remember a week later. Discuss cognitive offloading.
- Model fact-checking. Have the class audit an AI-written paragraph: locate claims, verify sources, and rewrite with real citations. Emphasize retrieval over "sounds right."
- Codify ethics. Treat disclosure like citation. Using AI without attribution is misrepresentation. Using it with reflection can be responsible.
- Provide guided upskilling. If your program includes AI, give students guardrails and structured practice. Curated catalogs can help faculty assemble skill paths by role.
Rebuild the feeling of school
AI didn't just bend assignments. It stripped away the shared struggle that makes learning feel alive. Bring that back.
- Host quiet hours and writing rooms. Phones away. Light structure. Collective momentum matters.
- Publish student work. Newsletters, events, public critiques. Visibility drives effort more than hidden grades.
- Celebrate drafts, not perfection. Share messy starts and improvements. Normalize revision as the point.
Policy, communicated like adults
State your stance in plain language: where AI is welcomed, where it's banned, how disclosure works, and what happens if trust is broken. Keep it consistent across sections when possible. Pair consequences with a path back: redo options, integrity workshops, and conversations about goals.
This isn't anti-technology
It's pro-learning. Good tools should sharpen thinking, not replace it. If you want a reference point for guardrails at the system level, UNESCO's "Guidance for generative AI in education" is a solid start.
Students are capable. Many still want to do the hard work. Give us assignments worth doing, communities worth joining, and standards worth upholding. The gap between easy and meaningful has never been wider. Let's make school feel like school again.