AI in Higher Education: Teach Judgment, Not Just Tool Use
In a recent conversation with MPR News, Thomas Feeney, associate professor of philosophy and director of the Master of Arts in Artificial Intelligence Leadership Program at the University of St. Thomas, made a clear case: colleges should prepare students to use AI with purpose and to think through its consequences.
His message is simple: move beyond button-clicking. Ask students to create something they couldn't have produced without AI, and to show how they shared it with others.
What Feeney Saw in the Classroom
Feeney described an AI ethics course where students were allowed to use AI for the first assignment. Many pasted the prompt into a tool, uploaded the result, and called it done.
"Rather than initiate a sort of disciplinary oppositional setting, I tried to show them, look, what you what you've produced is kind of generic … and this gave the students a chance to recognize that they weren't there in their own work. This opened the floodgates," he said.
His takeaway: the goal isn't punishment; it's reflection, ownership, and better work.
Redefine the Outcome
Feeney's guidance cuts through the noise: "I think the focus should be less on learning how to work with the interfaces we have right now and more on just graduate with a story about how you did something with AI that you couldn't have done without it. And then, crucially, how you shared it with someone else."
That shift reframes assessment around process, authorship, and impact, skills that transfer across tools and semesters.
What This Means for Educators
- Require a process log: prompts, iterations, model settings, drafts, and edits. Grade it.
- Ask for a "voice pass": students explain where their thinking shows up beyond the tool's output.
- Build in critique: students identify what the AI missed, got wrong, or made generic, and explain how they improved it.
- Make sharing part of the brief: present to a peer group, community partner, or campus office; document the feedback.
- Set clear use policies: what's allowed, what must be cited, and what must be disclosed in the process log.
- Assess originality and impact over pristine prose: reward problem framing, method, and useful outcomes.
Sample Assignment You Can Use Next Week
- Problem: Identify a real campus or community need (advising, accessibility, outreach, sustainability, etc.).
- Make: Use AI to draft, test, and refine a solution (guide, chatbot flow, outreach plan, dataset cleanup, rubric, or prototype).
- Document: Submit prompts, drafts, model details, and a short reflection on your decisions.
- Share: Present to a relevant audience; include their feedback and your revisions.
- Reflect: What could you do with AI that you couldn't do before? What remains your work? What would you improve?
Suggested Course Outcomes
- Students disclose and critique their AI use with evidence.
- Students improve AI outputs with domain knowledge, ethics, and style.
- Students produce an artifact that serves a real user and survives outside the classroom.
- Students communicate tradeoffs: accuracy, bias, privacy, cost, and time.
Policy and Ethics Resources
When you need scaffolding for risk and responsible use, align classroom practice with your institution's existing policies and published guidance so expectations stay consistent across courses.
Next Step
If you're curating options for faculty or student development, see AI courses by job for role-specific learning paths you can plug into advising or professional development.