Human Creativity Still Sets the Bar in an AI-Saturated Era
Generative AI is in classrooms, studios, and offices. The promise sounds big: faster drafts, more ideas, fewer blank pages. The reality, according to educational psychologist James C. Kaufman, is sharper: AI can help, but human creativity and judgment still separate good from great.
Across a new two-part study and an edited volume on generative AI and creativity, Kaufman argues AI acts less like a shortcut and more like an amplifier. Those with stronger skills get more out of it. Those without them often cap out at what the system can produce.
What the research actually found
In the study, participants completed storytelling tasks either solo or with a large language model. Researchers measured creativity, intelligence, and performance across both setups. The study has not yet been peer-reviewed.
"What we found is that creativity and intelligence still matter," Kaufman says. "Participants who were more creative without AI also tended to perform better when collaborating with AI." In short: AI didn't flatten differences; it made them more visible.
Why the skilled get more from AI
Idea generation is cheap; evaluation is everything. AI can spit out options, but it's less reliable at judging what's original, useful, or aligned to a goal. That step requires experience, taste, and metacognition.
Kaufman's framing is simple: if AI produces work at a B/B-plus level, someone already operating at an A level can use it selectively and still deliver excellent work. Someone working below that level risks capping their output at the model's average.
What this means for classrooms
"The goal of an assignment isn't the final product. The goal is learning how to do the work." If students rely on AI for essays or problem sets, they may turn in something acceptable while skipping the mental reps needed to build lasting skill.
Several studies suggest that when AI assistance is removed, gains in creativity and learning often vanish. Students also tend to overestimate how much they "collaborate" with AI, even when usage logs show heavy copying and pasting.
Equity isn't automatic
Creative potential is already widespread across gender, culture, and socioeconomic status, but AI doesn't automatically democratize outcomes. As paid versions improve and free versions lag, access matters, and gaps can widen.
For context on responsible classroom use, see guidance from the OECD on AI in education.
Creative work is shifting
Entry-level creative roles, such as caption writing, concept art, and freelance illustration, are being reduced or replaced by AI. These roles are the training ground where people build portfolios and craft. Remove them, and the pipeline narrows.
The likely outcome: a split between hobby creatives and elite, well-funded productions. Fewer rungs on the ladder mean fewer people climbing it.
Education must lead, not just adopt
In a new edited book, Generative Artificial Intelligence and Creativity: Precautions, Perspectives, and Possibilities, Kaufman and Matthew J. Worwood argue AI is a tool, not a replacement. Responsible, intentional use starts with teachers and learning experts, not just technologists.
Worwood calls for transparency and alignment with learning objectives. Early grades need clear guardrails. Over time, students can earn more autonomy, but guidance remains the bridge.
Practical playbook for educators
- Define the learning objective first, then decide if AI supports or distracts from it.
- Force the evaluation step: require students to critique, compare, and justify AI outputs.
- Separate process from product: grade outlines, drafts, and reflections, not just final work.
- Be explicit about allowed uses and document them (prompts used, edits made, what was kept or rejected).
- Plan for access differences: if tools are required, provide institution-approved options or alternatives.
Practical playbook for creatives
- Use AI for volume, not verdicts: idea lists, rough comps, alt headlines. Then apply your judgment.
- Keep a taste library: reference boards, swipe files, and principles you trust to evaluate outputs.
- Build unfair advantages: domain expertise, a distinct voice, and a repeatable critique process.
- Protect the pipeline: keep doing low-stakes projects to sharpen skills AI can't replicate, such as taste, strategy, and storytelling.
Guardrails worth adopting
- Transparency: disclose how AI was used, and where human decisions shaped the result.
- Skill tracking: test without AI at intervals to confirm actual learning and creative growth.
- Ethics and IP: verify originality and permissions; don't outsource this to a model.
- Quality standards: set baselines above "AI average" and enforce them.
The bottom line
AI isn't good or bad. It's powerful. The difference comes down to who uses it, for what, and with which guardrails.
Human creativity still sets the ceiling. AI can speed the draft. You set the direction, the standards, and the taste.
Further learning
- Explore courses and tools for educators and creatives at Complete AI Training - Courses by Job.
- Improve prompting and evaluation skills with resources under Prompt Engineering.