Teaching With AI, Not Replaced by It: 200 University Leaders Set Ground Rules for Higher Education

University leaders agreed: treat AI as a core skill, set clear guardrails, and upskill faculty. The goal: boost access and quality without letting bias or shallow thinking take root.

Published on: Nov 10, 2025

Higher Education in the AI Era: What University Leaders Agreed On, and What to Do Next

Nearly 200 university leaders from 40 countries met at Zhejiang University to answer a blunt question: how should higher education respond to AI right now? The consensus: treat AI as a core skill, build guardrails for ethical use, and invest in faculty capability so students don't get replaced by tools they don't understand.

Leaders emphasized two realities. First, AI can help close gaps in access, raise teaching quality, and accelerate research collaboration. Second, without clear policy and training, it can deepen inequity and weaken critical thinking.

Why the urgency

Colin Bailey, President, Queen Mary University of London: "It's incredibly urgent... CEOs are investing millions into AI systems to increase productivity." If industry moves faster than universities, graduates will feel the gap on day one.

Wu Fei, Dean of Undergraduate School, Zhejiang University: "We want every student to understand AI, use AI and create AI." That's the bar.

Set the direction: Human + machine, not either/or

Educators at the forum were aligned: students should use AI to think better, not think less. The job is to keep human judgment at the center while using AI for speed, breadth, and iteration.

Joanne Wright, Deputy Vice-Chancellor (Education), University of Sydney: "We allow our students to use AI, but they have to tell us. We also couple that with some assessments where they are not allowed to use AI." Clear, practical, enforceable.

What universities can implement this academic year

  • AI literacy for all disciplines. Offer a short core module for every student: how modern models work, prompt strategies, verification, bias, privacy, and citation.
  • Tiered use policy. Default to "allowed with disclosure" plus clearly defined "no-AI" assessments. Require method notes when AI is used: tool, version, prompts, outputs, and human edits.
  • Assessment redesign. Mix oral defenses, in-class writing, personal data sets, process logs, and applied projects. Grade the thinking, not just the final text.
  • Faculty enablement. Fund micro-upskilling, peer labs, and release time to redesign courses. Start with 2-3 priority departments and expand with templates.
  • Ethics and bias practices. Teach model limitations, dataset provenance, and fairness checks. Require students to compare AI outputs with trusted sources and document disagreements.
  • Access and equity. Provide institution-wide AI tools so access doesn't depend on a student's wallet. Pair access with training and clear conduct expectations.
  • Research collaboration. Create shared infrastructure, data agreements, and cross-institution labs. Build incentives for reproducibility and data stewardship.

Policy guardrails that actually work

  • Allowed uses: ideation, outlines, code review, data cleaning, and study aids, when disclosed.
  • Prohibited uses: undisclosed ghostwriting, fabricated sources, privacy breaches, unsafe code.
  • Disclosure standard: name the tool, version, prompts, and what was kept/edited. Attach raw AI output in an appendix when feasible.
  • Integrity strategy: focus on assessment design and process evidence; detectors are unreliable and should not be the primary control.
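
To make the disclosure standard concrete, a short method note attached to each submission is one workable shape. The template below is purely illustrative; the tool name, dates, and wording are hypothetical examples, not a mandated format:

```
AI Use Disclosure (attach to submission)
Tool & version:  ChatGPT (GPT-4o), accessed 2025-11-03
Prompts used:    "Outline the main arguments for ..." (full list in appendix)
AI output kept:  opening outline; two paragraph drafts (marked in appendix)
Human edits:     restructured sections 2-3; rewrote the conclusion;
                 verified every citation against the original source
```

Keeping the note to a handful of labeled fields makes disclosure fast for students and auditable for graders without turning it into paperwork.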

Infrastructure checklist

  • Secure, institution-managed AI workspace with logging and privacy controls.
  • Model diversity (general, code, vision) to fit different disciplines and tasks.
  • Data protection by default; student consent flows and clear retention periods.
  • Accessibility support and GPU quotas aligned to course needs.
  • Usage analytics for curriculum improvement, not surveillance.
  • Faculty helpdesk and quick-start guides embedded in LMS.

30-60 day plan for faculty capability

  • Week 1-2: short workshops on prompts, verification, and ethical use. Share a common rubric and policy language.
  • Week 3-4: convert one assignment per course to "AI-aware" or "AI-free" with a rationale. Pilot in two courses per department.
  • Week 5-8: run weekly show-and-tell sessions; publish winning examples and templates to the LMS.

If you need structured options for staff upskilling, explore practical paths via Complete AI Training: Courses by Job or browse the latest AI courses.

Keep ethics front and center

Colin Bailey underscored the point: AI can be biased and wrong. Students must learn to test outputs, cite sources, and make the final call as humans, every time.

Bottom line

AI is now a baseline competency, not a bonus skill. Universities that set clear policies, train their people, and redesign assessments will graduate students who can use AI well, and think even better without it.
