AI Slop: College Crisis
AI has moved into classrooms faster than policy, pedagogy and assessment can keep up. That gap is creating tension on campus: between what's possible and what's wise, between convenience and scholarship.
Two clear positions are emerging. Some faculty treat AI as a support tool with guardrails. Others see it as a direct threat to learning and academic integrity. Both camps want the same outcome: students who can think.
Two Views From the Classroom
Economics professor Luis Eduardo Azmitia Pardo sees AI as useful, provided it stays in a supportive role. Students can use it to brainstorm, organize writing or clarify concepts. But the core work still has to be theirs: critical thinking, analysis and reflection.
He draws hard lines, too. For deeply personal work, such as phenomenological or contemplative reflections, AI is off-limits. The point is to help students learn, not outsource the struggle that builds skill.
On the other side, English instructor Cara DiGirolamo is skeptical about AI in academia, especially large language models like ChatGPT, Gemini and Grok. "If you are tempted to cheat, AI makes it easy to cheat. If you don't like to struggle, AI makes it possible to avoid struggling. Choosing not to use AI is possibly the best thing someone can do for their moral fibre."
DiGirolamo argues that if universities formally embrace AI, they must rebuild the environment to keep thinking at the center: smaller classes (15 max), oral exams without aids and even asking students to build a simple neural net to see how and why AI can fail at research.
What This Means for Your Course or Program
You don't need perfect consensus to move. You need clear intent, transparent rules and assessments that reward thought over shortcuts. Here's a practical starting point for educators and academic leaders.
- Publish a short "AI use" policy per course. Specify what's allowed (idea generation, outlining, language support) and what's prohibited (personal reflections, original analysis, literature synthesis unless cited). Require students to disclose and cite any AI assistance.
- Assess the process, not just the product. Use versioned drafts, planning artifacts, research notes and revision memos. It's harder to fake a learning trail.
- Shift key assessments to formats that surface thinking. In-class writing, oral exams, viva-style defenses and whiteboard walkthroughs reduce dependence on AI outputs.
- Design "AI-contrast" assignments. Pair an AI-assisted task (with documentation) with a no-AI, in-class follow-up. Compare depth, accuracy and insight.
- Protect reflective and experiential work. Keep personal narratives, field notes and phenomenological reflections AI-free to preserve authenticity.
- Use AI for access, not avoidance. For multilingual learners and students who struggle with structure, allow AI as a scaffold, then grade the human reasoning.
- Right-size where you can. Smaller seminars, more oral checks and lab-style coaching help keep thinking visible. If class sizes can't drop, add rotating oral "spot checks."
- Talk openly about reliability. Show how AI hallucinates sources and fabricates facts. A simple demo beats a policy memo.
- Mind privacy and data ethics. Clarify which tools are approved, how data is used and what stays off-limits. Favor institutionally vetted tools for any graded work.
- Invest in faculty upskilling. Run short workshops: prompt clarity, AI citation norms, process-focused assessment, and discipline-specific use cases.
A Simple Course Policy Template
- Allowed: Brainstorming, outlining, grammar and clarity support; code stubs with documentation; study guides.
- Required: AI disclosure in a footnote or appendix: tool, prompts used, and how outputs were edited.
- Prohibited: Full-draft generation; fabricated citations; personal reflections; any uncredited AI text or analysis.
- Assessment: Process artifacts count for X%; oral verification may be required for any submission.
- Academic integrity: Suspected misuse triggers an oral check. Inability to explain work is grounds for conduct review.
If Your Institution Chooses to Lean In
Keep the intellectual bar high. If adopting AI at scale, pair access with structure: smaller discussion groups, more oral defense, clear AI documentation norms and program-level audits of assessment quality.
For departments with technical capacity, consider DiGirolamo's challenge: have students build a simple neural net or run a toy language model. It demystifies the tool and exposes failure modes that matter for research.
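That exercise can be surprisingly small. Below is a minimal sketch of a 2-4-1 neural network trained on XOR with plain backpropagation, in pure Python with no libraries. The XOR task, learning rate, and the out-of-distribution probe at (10, 10) are illustrative choices for a classroom demo, not details from the article; the point is the final lines, where the trained network answers a question far outside its training data with the same fluent confidence it shows on familiar inputs.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    # Clamp to avoid math.exp overflow on extreme inputs.
    z = max(-60.0, min(60.0, z))
    return 1.0 / (1.0 + math.exp(-z))

# XOR: the classic task a single neuron cannot solve.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

H = 4  # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(H)]
    y = sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2)
    return h, y

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in DATA)

err_before = total_error()

lr = 1.0
for _ in range(10000):
    for x, t in DATA:
        h, y = forward(x)
        dy = (y - t) * y * (1 - y)  # output-layer gradient
        for j in range(H):
            # Compute the hidden gradient before touching w2[j].
            dh = dy * w2[j] * h[j] * (1 - h[j])
            w2[j] -= lr * dy * h[j]
            w1[j][0] -= lr * dh * x[0]
            w1[j][1] -= lr * dh * x[1]
            b1[j] -= lr * dh
        b2 -= lr * dy

err_after = total_error()
print("error before:", round(err_before, 3), "after:", round(err_after, 3))

# The lesson for research use: probe far outside the training data
# and the network still returns a confident-looking number, with no
# signal that it is extrapolating blindly.
_, ood = forward((10, 10))
print("prediction for (10, 10):", round(ood, 3))
```

Forty lines of code make the failure mode concrete in a way a policy memo cannot: the model never says "I don't know," which is precisely the property that makes uncritical reliance on AI output risky in research.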
Policy and Guidance You Can Borrow
If you're drafting or revising policy, start with established frameworks and adapt locally.
- UNESCO guidance on generative AI in education - clear principles for safe, equitable use.
- EDUCAUSE resources on AI in teaching and learning - practical briefs for higher ed leaders.
Faculty Development and Next Steps
Most programs don't need more tools; they need shared standards and habits. Start with one course, publish the policy, test two assessments and review outcomes in a month.
AI isn't a verdict on higher education. It's a mirror. The choices we make about what we permit, what we protect and what we assess signal what we value. Decide that clearly, and the tools fall into place.