How AI will transform higher education in the next two years
Generative AI arrived on campus fast. A new study from Chalmers University of Technology uses scenario-based storytelling, grounded in interviews with students and workshops with university staff, to sketch six near-future possibilities for teaching and learning.
These scenarios aren't predictions. They're tools to help leaders decide what future they want to build, and what to avoid.
What the research did
Researchers gathered student perspectives through interviews, then convened teachers, postdocs, and educational developers to turn those insights into short, data-informed stories. The method, called "informed educational fiction," helps surface practical consequences and decision points that a policy memo can miss.
Read the study in Learning, Media and Technology: DOI 10.1080/17439884.2025.2562405.
What could change
- Student learning: More AI co-writing, drafting, and feedback. A stronger focus is needed on problem framing, critique, and iteration.
- Assessment: Traditional take-home essays become less reliable. Authentic, oral, and process-based assessment gains value.
- Teacher roles: More coaching, less content delivery. Course design and feedback quality matter more than ever.
- Support systems: Demand grows for guidance, ethics, and practical workflows. Ad hoc experimentation no longer scales.
- Campus culture: Clear norms on acceptable AI use reduce uncertainty and conflict across departments.
Risks to avoid
- Fragmentation: Every course sets its own rules; students get mixed signals; confusion spreads.
- Inequity: Students with better tools or skills get an unfair edge; support isn't evenly available.
- Overreliance: Outputs look polished while understanding stays shallow; critical thinking erodes.
- Policy lag: Integrity policies, accessibility, and data use remain unclear; disputes escalate.
- Faculty burnout: Constant tool-chasing without time, training, or recognition.
If we get it right
With coordination and support, AI can drive real renewal: better feedback loops, richer projects, and more time for high-value teaching. Without that coordination, expect confusion, conflict, and stalled progress.
Practical moves for the next 12-24 months
- Set clear principles: Define acceptable AI use by course level and purpose. Put it in every syllabus.
- Redesign assessment: Move toward authentic tasks, drafts with version history, oral defenses, and in-class creation.
- Skill up faculty: Offer short, paid micro-trainings on prompt strategy, feedback workflows, and assessment redesign.
- Appoint owners: Name a cross-functional team (teaching and learning center, IT, legal, ethics) to keep guidance current.
- Run small pilots: Test AI-supported activities in a few courses per department; measure learning and workload.
- Partner with students: Co-create course guidelines; gather feedback on what's fair and what actually helps learning.
- Upgrade integrity policies: Be explicit about allowed tools, disclosure, citation of AI assistance, and consequence tiers.
- Provide access: Ensure equitable, privacy-conscious tools for all students and staff.
- Mind data and privacy: Set rules for model choice, data handling, and prompt hygiene; keep sensitive data out of public models.
- Share templates: Publish ready-to-use prompts, rubrics, and workflow checklists so teachers don't start from scratch.
Use scenarios to make better decisions
Bring faculty and students into a short workshop. Sketch a few plausible stories for your institution. Stress-test current courses against each story: What breaks? What thrives? Decide in advance how you'll teach, assess, and support under those conditions.
Resources
- Learning, Media and Technology study (2025)
- UNESCO guidance on generative AI in education
The takeaway: pick a direction, set clear rules, and support your people. The tools will keep changing; your principles and practices shouldn't drift with them.