AI in higher ed at Colorado School of Mines - experts, ethics, and classroom practice

AI is already part of classroom and administrative work; Mines experts share what helps now and which pitfalls to avoid. Expect quick wins, clear policies, and simple steps that save time without losing rigor.

Categorized in: AI News, Education
Published on: Oct 28, 2025

AI in Higher Education: Practical guidance from Colorado School of Mines experts

AI is now part of daily work in classrooms and departments. Tools like ChatGPT, Copilot, and Gemini can save hours and spark better learning - and they also raise new questions about ethics, assessment, and policy.

Below is a clear, no-hype summary of what's working on campus, who's pushing the conversation forward, and how you can put their ideas to work this term.

What AI can actually do for teaching and learning right now

  • Draft lesson plans, rubrics, quiz banks, and first-pass explanations that you refine.
  • Generate quick, domain-specific examples and code prototypes to illustrate tricky concepts.
  • Create formative practice with instant feedback, including gamified activities.
  • Provide 24/7 Q&A on institutional knowledge (policies, course logistics, student support).
  • Support accessibility with summaries, alternative explanations, and multilingual help.

Experts at Colorado School of Mines

Estelle Smith, assistant professor in computer science

Focus: Ethical issues in AI and human-centered approaches to building reliable, safe, trustworthy systems. Smith is running a longitudinal study on how GenAI is adopted at Mines and teaches a graduate seminar on AI Ethics and Human-AI Interaction.

  • Action: Publish a simple, course-level AI use policy and require disclosure of AI assistance.
  • Action: Evaluate tools against risk areas like bias, safety, privacy, and transparency. The NIST AI Risk Management Framework is a practical starting point.

Zibo Wang, teaching assistant professor in computer science

Focus: Machine learning and GenAI agents that help faculty produce quality learning materials while cutting repetitive work. Also explores biometric authentication and robotics.

  • Action: Use an agent to generate draft lecture outlines, examples at multiple difficulty levels, and varied quiz items - then edit for accuracy and fit.
  • Action: Standardize prompts into reusable templates for fast, consistent prep across sections (see the sketch below).
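
A minimal sketch of that template idea, in Python, is below. The course name, the prompt wording, and the build_quiz_prompt helper are illustrative assumptions rather than Wang's actual tooling; the point is simply that the prompt lives in one place and only the parameters change between sections.

    from string import Template

    # Reusable prompt template: course-specific details are filled in per section.
    QUIZ_PROMPT = Template(
        "You are helping prepare materials for $course. "
        "Write $n_items multiple-choice questions on '$topic' at a $difficulty level. "
        "For each question, give four options, mark the correct answer, "
        "and add a one-sentence explanation the instructor can verify."
    )

    def build_quiz_prompt(course: str, topic: str, difficulty: str, n_items: int = 5) -> str:
        """Return a filled-in prompt, ready to paste into an approved AI tool."""
        return QUIZ_PROMPT.substitute(
            course=course, topic=topic, difficulty=difficulty, n_items=n_items
        )

    # Same template, three difficulty levels: consistent prep across sections.
    for level in ("introductory", "intermediate", "advanced"):
        print(build_quiz_prompt("CSCI 101", "recursion", level))

Because the wording lives in one versioned file, an accuracy or fit edit made once carries over to every section that uses the template.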

Kathleen Kelly, teaching associate professor in computer science

Focus: Computer science education using gamification and generative AI. Students use an AI assistant to build stories and analogies around course topics, turning review into a competitive, low-stakes game.

  • Action: Have students prompt an AI to create analogies for a new concept, then quiz peers on which analogy explains it best.
  • Action: Rotate student "prompt designers" to create practice rounds that keep engagement high.

Bo Wu, associate professor in computer science

Focus: Systems for training and optimizing machine learning models. Founder of HiTA AI, a startup that builds conversational assistants on a university's knowledge base to help students and lighten faculty and staff workloads. Deployed at Mines and piloted at Cornell.

  • Action: Start with a campus FAQ chatbot (syllabi norms, registrar, tutoring, office hours). Expand to course-specific help with curated, vetted sources (see the sketch after this list).
  • Action: Set up governance: source-of-truth docs, content owners, and an escalation path for edge cases.
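
HiTA's internals aren't described in this piece, so the sketch below only illustrates the general pattern behind the first action: answer from a small set of vetted, owned sources and escalate everything else. The topics, answer text, keyword matching, and escalation address are placeholders, not the actual deployment.

    # FAQ-bot pattern: vetted source-of-truth answers plus an escalation path.
    VETTED_SOURCES = {
        "registrar": "Registration deadlines and add/drop dates are listed on the registrar page.",
        "tutoring": "Free drop-in tutoring runs Monday through Friday in the learning center.",
        "office hours": "Office hours for each course are posted on the syllabus and course site.",
    }

    ESCALATION_CONTACT = "advising@example.edu"  # placeholder content owner for edge cases

    def answer(question: str) -> str:
        """Return a vetted answer when a known topic appears in the question, else escalate."""
        q = question.lower()
        for topic, text in VETTED_SOURCES.items():
            if topic in q:
                return text
        return f"I don't have a vetted answer for that yet. Please contact {ESCALATION_CONTACT}."

    print(answer("When is tutoring available?"))
    print(answer("Can I park overnight on campus?"))  # no vetted source, so it escalates

The governance pieces in the second action map directly onto this structure: each vetted entry has a content owner, and anything outside the list routes to a person instead of a guess.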

Carter Moulton, faculty developer, Trefny Innovative Instruction Center

Focus: Leading GenAI work that supports teaching and learning - including campus guidelines, a field guide, and a prompt library - plus communities of practice and professional development. Also launched Analog Inspiration, a card deck that keeps human values front and center.

  • Action: Build a shared faculty prompt library tied to learning outcomes and discipline needs.
  • Action: Run short, repeatable PD sprints where instructors pilot, measure, and refine one AI use case.

Gabe Fierro, assistant professor of computer science

Focus: Data management and AI/ML for domain-specific applications like smart buildings and water treatment systems. Uses GenAI for quick prototyping and ad-hoc examples that make abstract ideas concrete.

  • Action: Ask an AI to produce small, realistic datasets and example code; then use them to demonstrate core concepts and edge cases in class (see the sketch after this list).
  • Action: Compare multiple AI-generated approaches and have students critique trade-offs.
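
To give a flavor of those ad-hoc examples, here is a small sketch in the spirit of the smart-buildings work: a handful of synthetic temperature readings with one deliberately missing value, used to show why a naive average breaks on real sensor data. The sensor names and numbers are made up for illustration, not drawn from Fierro's projects.

    # Tiny synthetic sensor dataset with a realistic wrinkle: one reading is missing.
    readings = [
        {"sensor": "zone_1_temp", "value_c": 21.4},
        {"sensor": "zone_2_temp", "value_c": 22.1},
        {"sensor": "zone_3_temp", "value_c": None},  # dropped sample, common in real buildings
        {"sensor": "zone_4_temp", "value_c": 20.8},
    ]

    def naive_mean(rows):
        """Breaks on missing data: None cannot be summed with floats."""
        return sum(r["value_c"] for r in rows) / len(rows)

    def robust_mean(rows):
        """Edge-case aware: skip missing readings and report how many were used."""
        values = [r["value_c"] for r in rows if r["value_c"] is not None]
        return sum(values) / len(values), len(values)

    avg, n = robust_mean(readings)
    print(f"Mean zone temperature: {avg:.1f} C across {n} reporting sensors")
    try:
        naive_mean(readings)
    except TypeError as err:
        print(f"Naive version fails as expected: {err}")

Asking students to explain why the naive version fails, and what the robust version silently assumes, is one concrete way to run the critique exercise in the second action.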

Justin Shaffer, associate dean of undergraduate studies and teaching professor of chemical and biological engineering

Focus: Course design and the effectiveness of teaching strategies. Open to integrating AI where it helps, and cautious where it muddies learning or assessment.

  • Action: Align AI use with learning objectives; design assessments that reward process, reflection, and original work.
  • Action: Pilot with a single section, measure outcomes, and only then scale.

Policy and practice that stand up under scrutiny

  • Make expectations explicit: what's allowed, what must be disclosed, and what is off-limits.
  • Teach verification: require students to check claims, cite sources, and reflect on how AI shaped their work.
  • Protect data: avoid feeding sensitive or student-identifiable information into external tools.
  • Plan for equity: ensure access, provide non-AI pathways, and offer training for different skill levels.
  • Assess what matters: more process artifacts (drafts, thinking steps) and more oral checks where needed.

Quick-start checklist for your next term

  • Pick two use cases: one for your prep (materials), one for student practice (feedback or analogies).
  • Choose one tool per use case and write a 3-5 sentence policy for your syllabus.
  • Create two prompt templates and a short verification rubric students can follow.
  • Pilot, gather student feedback, compare outcomes, and keep what clearly saves time or improves learning.

Bottom line: start small, publish your rules, and measure impact. AI should reduce busywork and make learning clearer - if it doesn't, change the setup or move on.

