Boost or Crutch? AI's Tug-of-War Over Critical Thinking at Chapman

AI is now a classroom staple, boosting speed while raising privacy and overuse concerns. Students and faculty aim for balance: build skills and keep thinking muscles strong.

Published on: Oct 14, 2025

Manufactured thinking: The effects of AI in the classroom

Generative AI moved from curiosity to classroom staple in just a few years. Since ChatGPT's 2022 launch, adoption has surged and the market is projected to hit $1.3 trillion by 2032. Teen usage doubled year over year, and more than a third of college students report regular use. As new tools from major tech firms roll out, that trend won't slow down.

On campus: opportunity and unease

With Chapman's rollout of Panther AI to boost security and protect intellectual property, students are weighing benefits against privacy risks. In a legal writing course, senior Rowan Eiselman saw AI framed as a complement, not a replacement. She uses it to stress-test research questions, spot gaps, and refine wording. Still, she's noticed a pull to default to AI and is actively resisting that habit.

"Since taking courses that encourage its use, I have noticed urges to refer to it in other aspects, which is not what I initially wanted. It is a current issue that will only continue to grow, and I want to work on ensuring I do not let it become a first instinct," Eiselman said.

Speed vs. thinking

Freshman film and television production major Amy Andujar leans on AI to simplify dense readings for Film History. It helps her remove fluff and get to the point faster. Her concern: overuse could weaken the very skills higher education is built to develop.

She points to research indicating that writing without assistance increases cognitive load and executive control, while writing with AI reduces overall neural connectivity. "Many of my professors don't encourage using AI to do our assignments, and they encourage us to reach out to them with questions and help if we feel tempted to use it," Andujar said.

Faculty responses: keep the thinking alive

Political science professor Ronald Steiner has reworked testing and assignments to reduce overreliance on tools like ChatGPT. Lockdown browsers help, but he knows workarounds exist. He now assigns more in-class writing to see students' authentic thinking and baseline skills.

"It's more spontaneous. They only have five minutes or so in class to do it, and I use it to get a reality check on what (their) writing looks like when I'm asking (them) to do it spur of the moment," said Steiner.

Careers won't wait

Students need both AI literacy and strong cognition. One study estimates that 30% of U.S. jobs could be automated by 2030, with entry-level roles most exposed and salaries under pressure as AI spreads. Steiner compares AI fluency to basic literacy.

"To me, it is maybe the 21st century version of being able to read and write with competency. It's just a part of the skill set that you have to have. So we have to be open to that. To deny it (and) to tell students they can't use it would be a disservice to the students."

At the same time: "You can't learn critical thinking if you don't exercise those muscles. If you had a robot around the house that did all of your heavy lifting for you, yeah, that would be easy, convenient and efficient, but your muscles would atrophy," he said.

Practical guardrails for students

  • Use AI to clarify, not to think for you. Ask for outlines, counterarguments, and checklists; then write the draft yourself.
  • Start with your thesis and key points before prompting. Compare your plan with AI suggestions and decide what to keep or cut.
  • Set "no-AI" windows for early drafts. Bring AI in for editing, structure, and clarity after your ideas are on the page.
  • Verify facts from primary or trusted sources. Treat AI output as a lead, not a source of record.
  • Protect your data. Avoid pasting proprietary or personal information into public models.
  • Keep a process log. Note where AI helped and where you made the final decisions; this is useful for reflection and academic integrity.

Practical guardrails for faculty

  • Be explicit in your syllabus about allowed and prohibited use, plus citation expectations for AI assistance.
  • Design tiered assignments: idea generation in class, drafting outside class, and short in-class reflections or defenses.
  • Increase low-stakes, timed writing to monitor authentic voice and progress across the term.
  • Require process artifacts: outlines, drafts, prompt history, and revision notes.
  • Assess reasoning, sources, and originality more than polish. AI can polish; students must think.
  • Offer office hours and discussion channels as the first line of help so students don't default to AI.

A simple workflow that keeps you in charge

  • Frame the problem: write your question, thesis, and constraints in 3-5 bullet points.
  • Use AI for options: ask for 3 approaches, 5 risks, and 5 counterarguments.
  • Decide: pick the best pieces and build your outline. You own the structure.
  • Draft without AI for one focused pass. Then use AI for editing, examples, or citations you will verify.
  • Finalize in your voice. Disclose where AI helped if required.


Bottom line

AI can speed up busywork and expand your options. It can also dull your thinking if you let it. The winners will learn the tools, keep their cognitive "reps," and build judgment that no model can replace. Education at Chapman, and everywhere else, will be defined by that balance.

