FutureU at UZH: Trust, AI, and the University to Come

AI is moving faster than universities can respond; trust, clear norms, and human oversight point the way. Use it with intent, require disclosure, and protect independent judgment.

Published on: Dec 16, 2025

Universities Under Pressure: AI, Trust, and a Practical Path Forward

Claudia Witt and Karsten Donnay are clear-eyed about where higher education stands: AI is moving fast, faster than most institutions can respond, and the default posture has been reaction. Their message to educators is simple: create space to think, set clear norms, and lead with trust and rigor.

Fast Change, Low Clarity

Much of AI works in the background, which means people only notice the surface while missing the system beneath it. That gap breeds uncertainty and gives the sense that technology sets the agenda.

Surveys show broad support for digitalization in general, but AI triggers skepticism, less about the tools themselves than about their speed and opacity. Policy responses lag, reinforcing the feeling that we're chasing developments instead of setting the direction.

Fear of Losing Control Is Real

When tools arrive overnight in workflows, classrooms, and common apps, people feel exposed. If you don't understand how the system works, it's hard to trust it.

The pace leaves little time to adjust. Expect unease. Plan for it.

Medicine Is the Proof Point, and the Warning

AI already reads radiology images at a level comparable to specialists, and it drafts reports and documentation that can cut busywork. That is meaningful relief, provided the systems are reliable and well integrated.

The model that works: human-in-the-loop. AI assists; clinicians review and sign off. The lesson for educators is the same: use AI to support judgment, not replace it.

Students Are Using AI. Pretending Otherwise Costs You Credibility.

At the University of Zurich (UZH), the stance is pragmatic: disclosure over prohibition. Students state how they used AI (proofreading, analysis, drafting) so the process stays transparent. Faculty test tools to prepare materials, but humans remain accountable.

If you lead a course or program, establish this now:

  • Publish an "AI use" statement on every syllabus: what's permitted, what requires disclosure, what's off-limits.
  • Require a short "AI methods" note with assignments that used AI (tool, version, prompts, what the student kept, edited, or rejected).
  • Do not rely on AI detectors. Instead, assess process evidence: drafts, references, notes, and an oral check when needed.
  • Keep a human review step for any AI-produced output used in official records or feedback.
  • Teach data ethics: consent, bias, privacy, provenance, and citation of datasets and models.
  • Redesign some assessments for in-class, oral, or project-based evaluation.
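The "AI methods" note described above can be made concrete with a lightweight template. The sketch below is illustrative only; the field names are hypothetical, not a UZH standard or an official disclosure format.

```python
from dataclasses import dataclass

@dataclass
class AIMethodsNote:
    """Illustrative structure for a student's AI-use disclosure.
    Field names are hypothetical, not an official UZH format."""
    tool: str                 # which assistant was used
    version: str              # model or app version, if known
    uses: list[str]           # e.g. proofreading, analysis, drafting
    kept: str = ""            # what AI output was kept as-is
    edited: str = ""          # what was revised by the student
    rejected: str = ""        # what was discarded, and why

    def summary(self) -> str:
        """One-line summary an instructor can scan quickly."""
        return f"{self.tool} ({self.version}): {', '.join(self.uses)}"

# Example disclosure attached to an assignment
note = AIMethodsNote(
    tool="generic chat assistant",
    version="2025-10",
    uses=["proofreading", "outline feedback"],
    edited="Rewrote all suggested phrasings in my own words.",
)
print(note.summary())
```

Keeping the note structured this way makes the process evidence (kept, edited, rejected) easy to review alongside drafts during an oral check.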

Use AI With Intent, Not Because It's There

Donnay's point is blunt: if we use tools just because they exist, we hand over agency. The fix is purpose. Define where AI actually improves learning, access, or quality, and where it introduces noise or dependency.

  • Map one to three specific gains per course (e.g., faster feedback, multilingual support, formative tutoring).
  • Map the risks (e.g., hallucinations, over-reliance, privacy) and set guardrails for each.
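The mapping exercise above can be kept as a simple per-course record that pairs each gain and risk with its guardrail. The entries below are made-up examples for illustration, not prescribed policy.

```python
# Hypothetical per-course AI plan: specific gains and risks,
# each paired with a concrete guardrail. Entries are examples only.
course_ai_plan = {
    "gains": {
        "faster feedback": "AI drafts comments; instructor reviews before release",
        "multilingual support": "AI-translated materials, spot-checked by staff",
    },
    "risks": {
        "hallucinations": "require sources; verify claims before they count",
        "over-reliance": "schedule AI-free, in-class checkpoints",
        "privacy": "no student data in external tools without consent",
    },
}

# The guidance above suggests one to three specific gains per course.
assert 1 <= len(course_ai_plan["gains"]) <= 3
```

Writing the plan down this way forces the "purpose first" discipline: a tool with no entry under "gains" has no reason to be in the course.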

FutureU at UZH: A Space to Think Ahead

UZH's Digital Strategy Board launched FutureU to look beyond day-to-day demands. The aim: build scenarios, stress-test assumptions, and set priorities that keep education valuable in an AI-saturated era.

That requires new habits inside the university: time for reflection, ethical guardrails, and enough freedom to experiment so rules don't suffocate progress.

What Universities Offer That Industry Doesn't

Companies may move more quickly and command more resources, but universities carry depth, critique, and public trust. Degrees and certifications still matter, though hiring may tilt toward demonstrable skills over time.

The moat is breadth. Interdisciplinary and transdisciplinary work at UZH, supported by structures like the School for Transdisciplinary Studies, helps people tackle complex problems from multiple angles and learn each other's "language."

Looking to 2050: Ubiquitous Tech, New Work, Higher Stakes for Judgment

Expect AI mentors for every learner, immersive environments as the norm, and robotics in care, industry, and research. Routine tasks shrink; new roles emerge.

Here is the constant: critical, reflective thinking. If research is pulled too close to commercial goals, the public interest gets sidelined. Universities need protected space to pursue questions that don't pay off next quarter.

Trust Is the Core Asset

Trust makes teaching and research possible. People still rate universities as more credible than industry because the incentives are different and diverse views are welcome.

Openness earns that trust. Open methods, data, and results make work verifiable and reusable, habits that raise quality over time. For context on policy and ethics, see UNESCO's guidance on AI in education.

Work With Big Tech-Without Losing Independence

Cooperation is sensible; capture the benefits without ceding your agenda. People will move between academia and industry, and that knowledge flow helps both sides.

But the mission remains public: education that develops judgment, ethical responsibility, and the ability to question tools and incentives. That's how graduates carry value into business, policy, and society.

A Practical Playbook for Education Leaders

  • Publish institution-wide AI norms: disclosure, privacy, data retention, accessibility, and academic integrity.
  • Invest in staff development: hands-on workshops with live classroom use cases and human oversight patterns.
  • Refit assessments to capture thought process and application, not just polished output.
  • Create cross-disciplinary studios for real problems (health, policy, climate, civic tech) with AI as an assistive layer.
  • Set an approval path for AI tools (security, bias checks, local data rules), with a fast track for pilots.
  • Protect independent research time and funding outside commercial timelines.
  • Offer AI literacy for students and faculty, including prompt strategy, verification, and citation.
  • Measure outcomes (workload saved, learning gains, quality of feedback, equity of access) and adjust accordingly.

Outlook

UZH's recent experience, including the Digital Society Initiative with over 1,400 researchers, shows that cultural change is possible when openness and shared goals are real. The path is clear: keep independence, move with intent, and reinforce the habits that earn trust.

Do that, and universities stay both relevant and credible well into 2050.

