Can AI make us more creative? A large study says yes, if you use it the right way
AI is often treated like a shortcut. Do the task faster. Get to the answer quicker. But new research from Swansea University suggests a different role: AI as a creative sparring partner that keeps you exploring instead of settling.
In one of the largest tests of human-AI collaboration in design, more than 800 people used an AI-assisted system to build virtual cars. When the system surfaced diverse suggestions (good, weird, and even deliberately flawed), people stuck with the work longer, produced better designs, and felt more engaged.
How the system sparked ideas
The tool used a method called MAP-Elites to generate visual galleries spanning a wide spread of possibilities. Instead of optimizing behind the scenes, it showed options in plain sight: high performers, oddballs, and "bad" ideas that still pushed thinking.
That variety mattered. It prevented early fixation and nudged people to explore beyond their first hunch. As Dr. Sean Walton, one of the study's leads, put it, the value wasn't speed; it was the creativity that came from structured diversity.
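To make the idea concrete, here is a minimal sketch of a MAP-Elites-style loop: it keeps the best design in each cell of a feature grid, so the archive stays varied rather than collapsing onto one "best" answer. This is an illustration of the general technique, not the study's system; the fitness and feature functions are placeholders.

```python
import random

# Toy MAP-Elites sketch: evolve two-parameter "designs" and keep the best
# performer in each cell of a feature grid, so the archive stays diverse.
# The fitness and feature functions below are placeholders, not the study's.

GRID = 10          # number of bins per feature dimension
ITERATIONS = 5000

def random_design():
    # A design is just two numbers in [0, 1], e.g. body length and wheel size.
    return [random.random(), random.random()]

def mutate(design):
    # Small Gaussian tweak, clamped back into [0, 1].
    return [min(1.0, max(0.0, x + random.gauss(0, 0.1))) for x in design]

def fitness(design):
    # Placeholder score: rewards one particular balance of the two parameters.
    return 1.0 - abs(design[0] - 0.7) - abs(design[1] - 0.3)

def features(design):
    # Map the design to a grid cell; each cell holds at most one elite.
    return (min(GRID - 1, int(design[0] * GRID)),
            min(GRID - 1, int(design[1] * GRID)))

archive = {}  # cell -> (score, design)

for _ in range(ITERATIONS):
    if archive and random.random() < 0.9:
        # Usually: pick an existing elite and mutate it.
        _, parent = random.choice(list(archive.values()))
        candidate = mutate(parent)
    else:
        # Occasionally: inject a completely random design.
        candidate = random_design()

    score = fitness(candidate)
    cell = features(candidate)
    # Keep the candidate only if its cell is empty or it beats the incumbent.
    if cell not in archive or score > archive[cell][0]:
        archive[cell] = (score, candidate)

# The archive is now a "gallery": varied designs, each the best of its kind,
# including low-scoring niches that can still spark ideas.
print(f"{len(archive)} distinct design niches filled")
```

The key design choice is that nothing is hidden: low-scoring niches stay in the gallery, which mirrors the study's decision to show "bad" ideas alongside strong ones.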
What changed for participants
- More time on task, by choice rather than obligation.
- Higher-quality outcomes, as measured by the design game's performance metrics.
- Deeper engagement: people reported feeling more involved and curious.
- Broader exploration: even "bad" ideas helped break assumptions and open new directions.
Why this matters for creatives
Most tools optimize for efficiency. Creative work thrives on variety. If your AI workflow only chases the "best" answer, you cut off the paths that lead to original work.
The study also challenges how we judge AI. Clicks and copy rates miss the human side of creativity: how something makes you think, feel, and explore. That's where the real value shows up.
Make AI your creative sparring partner
- Go wide before you go deep: Ask for breadth first. "Give me 20 very different directions, including 5 that likely won't work and why."
- Force contrast: Request extremes: minimal vs. maximal, safe vs. risky, conservative vs. experimental.
- Keep a "useful failures" column: Save flawed ideas that reveal constraints, edges, or new styles.
- Time-box exploration: Spend 10-15 minutes scanning a gallery; pick 3 directions to push.
- Alternate generate → critique: produce options, score them against your criteria, then generate again (see the sketch after this list).
- Track surprise: If nothing surprises you, change the prompt to increase diversity or randomness.
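If you want to automate part of that generate → critique rhythm, the loop below is one way to structure it. `generate_options` and `score_against_criteria` are hypothetical placeholders, not a real API; in practice they would wrap your model call and your own rubric.

```python
import random

# Illustrative generate -> critique loop. generate_options and
# score_against_criteria are hypothetical stand-ins: wire them to your
# own model call and your own scoring rubric.

def generate_options(brief, n=10, diversity_hint=""):
    # Placeholder: pretend each option is a labelled variation on the brief.
    return [f"{brief} / direction {i} ({diversity_hint or 'no hint'})"
            for i in range(n)]

def score_against_criteria(option, criteria):
    # Placeholder: scores randomly; replace with your rubric or a model critique.
    return random.random()

def explore(brief, criteria, rounds=3, keep=3):
    shortlist = []
    hint = "make the options as different from each other as possible"
    for _ in range(rounds):
        options = generate_options(brief, n=10, diversity_hint=hint)
        ranked = sorted(options,
                        key=lambda o: score_against_criteria(o, criteria),
                        reverse=True)
        # Keep a few winners, then steer the next round away from them
        # so exploration keeps widening instead of settling too early.
        shortlist = ranked[:keep]
        hint = "avoid resembling: " + "; ".join(shortlist)
    return shortlist

print(explore("poster concept for a jazz festival", criteria=["bold", "legible"]))
```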
Rethink how you measure progress
Replace shallow metrics with signals that matter to creative output (a simple way to log them follows this list):
- Divergence count: How many distinct directions did you explore before choosing one?
- Iteration depth: How many refine cycles did your top ideas get?
- Risk score: Did you pursue at least one uncomfortable direction?
- Engagement: Time on task, voluntary returns to the work, and your subjective energy level.
- Outcome quality: External feedback, performance metrics, or user tests, whichever fits your craft.
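If it helps to make these signals tangible, here is one hypothetical way to log them per session; the field names and thresholds are suggestions, not a standard instrument.

```python
from dataclasses import dataclass, field

# Hypothetical per-session log for the signals above. The fields and the
# simple divergence check are suggestions, not a validated measure.

@dataclass
class SessionLog:
    divergence_count: int = 0      # distinct directions explored before choosing
    iteration_depth: int = 0       # refine cycles given to the top ideas
    took_a_risk: bool = False      # pursued at least one uncomfortable direction
    minutes_on_task: float = 0.0   # engagement proxy
    voluntary_returns: int = 0     # times you came back without needing to
    notes: list[str] = field(default_factory=list)  # feedback, test results

    def needs_more_divergence(self, minimum: int = 5) -> bool:
        # If you converged after too few directions, widen the next session.
        return self.divergence_count < minimum

log = SessionLog(divergence_count=8, iteration_depth=3, took_a_risk=True,
                 minutes_on_task=45, voluntary_returns=2,
                 notes=["peer review: strongest of the three options"])
print(log.needs_more_divergence())
```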
What this means for your workflow
Don't ask AI for the "best" idea first. Ask for a gallery that maps the space. Use contrast to spot what's promising, then converge with intention. The goal isn't faster answers; it's better questions and braver bets.
Read the research
The study is published in ACM Transactions on Interactive Intelligent Systems. You can find it here: From Metrics to Meaning: Time to Rethink Evaluation in Human-AI Collaborative Design.
Curious about the method behind those diverse galleries? See quality diversity algorithms (MAP-Elites).
Build your AI stack for creative work
If you want curated tools and courses by craft, explore our picks for visual creators: Top AI tools for generative art.