CETLA at GCU: Reducing AI Anxiety and Raising Literacy in Academia
Logging into an AI tool for the first time can feel uncertain. Grand Canyon University addressed that head-on by launching the virtual Center for Educational Technology and Learning Advancement (CETLA) and embedding AI champions inside each college.
In the College of Natural Sciences, Professor Dr. Mark Wireman and instructor Scott Rex carry that banner. CETLA is co-directed by Rick Holbeck and Dr. Jean Mandernach, who set clear, ethical guidelines that colleges adapt to their courses and disciplines.
Clear policy lowers anxiety
Holbeck keeps the definition simple: AI is machine learning that ingests massive amounts of data, and its outputs reflect human inputs, programming, and bias. That's why policy matters.
Wireman shared a student's view from home: "This new policy gave her … anxiety because she uses AI for brainstorming on her essays." The solution was clarity. Faculty can permit AI use with disclosure, and students know what's allowed.
After CETLA's rollout, colleges defined where AI fits. In some math and science courses, GCU offers MOSAIC, a ChatGPT-based model tuned for class use.
Transparency first: "Use it, but tell me you used it"
Rex's classroom rule is simple and practical. "You can't use it to write your paper, but you can use it to help rewrite your paper," he said. "If you do, just tell me that you used it to rewrite your paper."
That transparency builds trust and helps students think about process, not shortcuts.
From classroom to career: show how AI is used on the job
Policy is only useful if it reflects real work. In medicine, AI "listens" to provider-patient conversations and drafts clinical notes. "I had a meeting with a speaker and a bunch of students. AI created a summary that was spot on … All I had to do was reformat it," said Wireman. That mirrors how many clinics reduce paperwork today.
Rex brought similar realism to forensic science. He used AI to play the role of an attorney and question students under pressure. Court cases can hinge on expert testimony; practice matters. Forensic science students, including Genesis Vera and Christian Merryman, called the AI-attorney experience eye-opening.
Teach verification, not blind trust
Mandernach notes what every educator has seen: AI can hallucinate. Wireman's classroom cue is direct: "Double-check it. Focus on what makes sense."
Rex pushes source quality: "You could end up with some really cool-sounding facts that are made up, and you won't know if you don't verify afterward. I make them cite their sources. Is it Frank's toxicology blog or the National Institutes of Health?" For a model of credible references, point students to the National Institutes of Health.
For program-level guardrails, consider the NIST AI Risk Management Framework as a reference when updating academic policies.
A secular tool that can test and strengthen faith
Rex sees another dimension of learning: AI's secular lens can challenge and build convictions. "What would happen if you asked it very pointed questions about the Bible and asked it to analyze it from an AI religious perspective?"
Wireman agrees: let students see a secular analysis, get a little shaken, and then work through it. The friction can refine beliefs and sharpen reasoning.
Tech shifts fast, so learn together
Wireman points out the pace: last year, image generation felt rough; this year, it's smoother; video is catching up next. The takeaway isn't to chase every feature; it's to keep learning in public.
"I'm learning this like everyone else," said Rex. That shared mindset of learning, questioning, and iterating has done more to ease AI anxiety than any single tool.
Practical steps you can apply this term
- Publish a short AI policy in every syllabus: what's allowed, what's banned, and how to disclose use.
- Differentiate drafting from revising: permit AI for outlines, rewrites, or feedback, but not full authorship.
- Require citations and a post-check: students list sources and verify key claims.
- Bring in real workflows: note-taking assistants, meeting summaries, coding helpers, or literature screening.
- Use AI as a simulator: oral exams, cross-examination practice, mock interviews.
- Create a faculty "AI hour": share prompts, pitfalls, and wins once a month.
- Adopt campus-approved models (like MOSAIC) when available to keep data and policy aligned.
If you want structured upskilling for your role, see curated AI learning paths by job at Complete AI Training.
How AI was used in the preparation of this article
As in part one, ChatGPT was used to summarize and identify relevant quotes, then create an outline that continued the story from the completed part one. The outline was condensed from nine sections to four, and ChatGPT was asked to pull relevant quotes from uploaded transcripts to flesh out each section.
The writer used or paraphrased those quotes within the narrative bridges. Comments by the forensic sciences students were pulled from two interviews for a future story on a College of Natural Sciences program. AI wrote no content; its ethical and responsible use enabled a completely human-written story, which was then edited by a human.