Learning cannot be outsourced to AI: how it's actually changing university teaching
Generative AI tools like Copilot and ChatGPT are everywhere on campus. They can speed up parts of academic work, but they don't replace the core act of learning. Reading, writing, and original thinking still sit at the center. Offloading those to a model weakens the very skills higher education exists to build.
Teach responsible AI skills first
Professors Kalle Juuti and Anette Alén use generative AI in their teaching, but with guardrails. In first-year group teaching, Alén has students compare a legal case with CurreChat's responses and reflect on AI use in legal information retrieval. In an advanced course, Juuti invites students to use CurreChat at defined stages (brainstorming, feedback, and iteration) so they experience where it helps and where it falls short.
The point is skill, not shortcuts. Students learn to question outputs, cite sources, and document how AI contributed to their work.
Where AI fits, and where it doesn't
AI is useful, but not everywhere. Juuti moved assessment in an education basics course to in-person essays written in the lecture room, so students demonstrate their own reasoning on foundational concepts, with no second-guessing about who wrote what.
Alén highlights the need to identify, at degree level, where controlled assessments (like invigilated exams) are necessary. She also notes there are contexts where AI use can't be fully controlled; in those cases, set clear limits and guide its use rather than pretending it won't appear.
Be explicit: policy, permissions, and reflection
E-learning Specialist Sanna-Katja Parikka recommends adding an AI clause to the course description. Spell out what is allowed, what is not, and how to disclose AI use in each assignment. Removing ambiguity up front means fewer academic integrity disputes later.
Parikka also suggests building critical reflection into any AI-assisted task. Students should evaluate the tool's output, note biases or gaps, and explain how they verified facts. That reflection deepens learning and makes the process visible.
For broader context, see UNESCO's guidance on generative AI in education and research.
Guidelines and safe tools matter
The University has published guidelines on AI in teaching since 2023, including a ban on language models in maturity tests. The Faculty of Law has added thesis-specific guidance: students may use generative AI to support writing, but must disclose and plan that use openly in advance.
Both Juuti and Alén value having CurreChat available through University credentials. It lets students work openly without turning to questionable free tools or risking data security.
Practical moves you can apply this semester
- Map assessments to learning outcomes. Decide where you must see unaided skills (e.g., in-person essays, oral defenses) and where AI support is appropriate.
- Add a clear AI clause to your syllabus: permitted uses, disclosure format, citation rules, and consequences for misuse.
- Integrate AI into the process, not the final product: brainstorming, outlining, critique, and iteration checkpoints with documentation.
- Require a reflection memo for AI-assisted work: prompts used, outputs received, verification steps taken, and what the student changed.
- Use institutionally supported tools for security and transparency. Avoid external tools when data sensitivity is high.
- Co-create standards within your discipline. Agree on where AI is welcome and where independent performance is essential.
- Budget time for redesign. Good assessment takes work; treat this as core teaching development, not an add-on.
Training and low-risk experiments
Alén recommends taking University training on AI and testing CurreChat with topics you already know well, which makes it easy to judge answer quality and set boundaries for students. In her course, she piloted an AI chatbot representing a client and had students run the same legal discussion at the start and end of the course: clear before-and-after evidence of learning and AI literacy.
Juuti underscores the value of department-level collaboration to define where AI helps and where human-only demonstration is required. That conversation takes time, and it's time well spent.
The bottom line
Use AI to speed up the busywork and improve feedback loops. Keep the core learning (reading, writing, reasoning) squarely in the student's hands. That balance builds durable skills and preserves academic integrity without ignoring the tools students will face in real work.