AI in Higher Education: Tool or Crutch?
AI is everywhere on campus now. It helps people move faster, but it also tempts them to skip the work that builds real skill. Used carelessly, it chips away at credibility and weakens critical thinking. That's the core issue for educators and writers: where's the line between useful assistance and intellectual atrophy?
A Student and Artist's Take
For writing-heavy courses like Creative Writing and Composition, outsourcing to AI defeats the point. You can't get an honest grade on your ability if a model did the heavy lifting. It also puts your academic trust at risk: once that's questioned, your previous work comes under doubt as well.
Some careers, like librarianship, require depth and reliability, and relying on AI for core tasks undercuts both. On the art side, image generators rarely help the creative process; the results feel derivative and seldom work as solid references.
There's a broader concern here: many students are heading into roles that affect lives, such as educators, nurses, technicians, and researchers. Do you want your future doctor training their thinking on summaries and shortcuts?
What Faculty Are Seeing at Parkland
Eric Sizemore, a librarian in the Learning Commons, experiments with AI to keep pace with new tools and points students to research aids like ResearchRabbit. His caution is simple: reading a summary is not the same as reading the paper, and letting AI assemble your response means your brain didn't do the work. Like any muscle, the one you don't use weakens.
He also flags accuracy. AI can fabricate or pull from flawed sources. That means students must cross-check facts and quotes, every time. Even model providers warn that outputs can be wrong or misleading; it's on the user to verify (source).
Nikki O'Brien, who teaches First Year Experience and coordinates inclusive learning, shared a practical use: she used AI image generation to create a logo when decision fatigue and time pressure were high. But in the classroom, her concern mirrors Sizemore's: if students let AI do their thinking, they miss the chance to develop their own voice, values, and judgment.
The Line: Assistance vs. Atrophy
There's a clear pattern in these perspectives. AI can help with speed and options. It can also quietly replace the work that builds critical reading, reasoning, and style: the exact skills higher education exists to develop.
Practical Guidelines for Educators and Writers
Use AI for
- Administrative work: schedules, checklists, rubrics, lesson outlines.
- Idea expansion: prompts, angles, headlines, questions to explore.
- Draft polish: clarity passes, tone suggestions, alternative phrasings.
- Research wayfinding: finding papers and authors to investigate (then read the originals).
Avoid AI for
- Graded core work: essays, creative pieces, problem solutions.
- Literature reviews without reading the sources yourself.
- Final judgments: grading, feedback, or recommendations without human oversight.
- Art and design decisions that should express your taste and process.
Build guardrails
- Require process evidence: notes, outlines, drafts, citations, and reflections.
- Assign oral defenses, in-class writes, and source check-ins.
- Use personal, local, or experiential prompts that AI can't fake well.
- Mandate verification: cross-check facts, quotes, data, and links.
- Teach "AI transparency": disclose what was assisted and how.
If You're Introducing AI, Do It Deliberately
Start with non-essential tasks so you remove busywork, not thinking. Keep AI on a short leash: let it summarize a paper as a preview, then go read the full study. Let it suggest angles, but make the final call yourself. The work that forms your judgment stays yours.
If you want structured ways to teach or learn responsible AI use for your specific role, explore curated paths by job at Complete AI Training.
The Core Principle
Develop your own skills first. Use AI to save time around the work, not to skip the work. That's how you protect credibility, maintain standards, and leave college with a brain that's stronger than when you started.