AI ads perform as well as human ads, provided they don't look AI-made
Fresh field data spanning 500 million impressions and 3 million clicks shows AI-generated ads can match human creative performance. On average, AI ads posted a 0.76% CTR vs. 0.65% for human-made ads, roughly a 17% relative lift. When researchers compared "sibling" ads within the same campaigns and days, performance was statistically equivalent.
The takeaway for creatives is simple: AI is not the bottleneck; perception is. If the ad looks artificial, it underperforms. If it reads and feels human, it competes.
What the study actually measured
Researchers from Columbia, Harvard, the Technical University of Munich, and Carnegie Mellon analyzed live campaigns on Taboola's Realize platform. They compared matched pairs of AI and human ads built by the same advertiser, for the same campaign, on the same day. That setup controls for timing, targeting, and landing pages, so the only real variable is how the creative was made.
The result: AI creative kept pace with human work without hurting downstream conversions. Higher (or equal) CTR did not come with worse conversion rates.
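To make the arithmetic concrete, here is a minimal sketch of the CTR/CVR math. The impression and click figures mirror the averages reported above; the conversion counts are hypothetical, chosen only to illustrate what conversion parity looks like.

```python
# Minimal CTR/CVR sketch. Impression/click figures mirror the averages
# reported in the study; conversion counts are hypothetical.

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a fraction."""
    return clicks / impressions

def cvr(conversions: int, clicks: int) -> float:
    """Conversion rate per click as a fraction."""
    return conversions / clicks

# Hypothetical sibling variants within one campaign/day.
ai = {"impressions": 1_000_000, "clicks": 7_600, "conversions": 152}
human = {"impressions": 1_000_000, "clicks": 6_500, "conversions": 130}

ai_ctr = ctr(ai["clicks"], ai["impressions"])           # 0.76%
human_ctr = ctr(human["clicks"], human["impressions"])  # 0.65%
relative_lift = (ai_ctr - human_ctr) / human_ctr        # ~16.9%

print(f"AI CTR {ai_ctr:.2%} vs human CTR {human_ctr:.2%} "
      f"(relative lift {relative_lift:.1%})")
print(f"AI CVR {cvr(ai['conversions'], ai['clicks']):.2%} vs "
      f"human CVR {cvr(human['conversions'], human['clicks']):.2%}")
```

In this toy example both variants convert at 2.00% of clicks, which is the parity pattern the study describes: the CTR edge does not come at the expense of down-funnel quality.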
Perception drives performance
Participants in a separate perception study labeled whether an ad looked AI-made or human-made. Ads perceived as AI-generated underperformed, regardless of their true origin. That means the audience is punishing "artificial vibes," not the tool used.
Nearly half of AI ads were perceived as human-made. Those ads outperformed both human-made ads and AI ads that looked synthetic.
What makes an ad "look AI"
- Overly polished, stylized visuals that feel unreal
- Heavy saturation and hyper-clean color treatments
- Strong, unnatural symmetry and perfect geometry
The strongest signal of "human-made" was a large, clear human face. Human warmth beat AI polish across both AI and human creatives.
Practical playbook for creatives
- Lead with faces. Use clear, authentic human imagery with eye contact and natural lighting.
- Dial back the "too perfect." Reduce saturation, avoid mirrored symmetry, leave some texture.
- Run sibling tests. Ship AI and human variants in the same campaign and day to isolate impact (see the significance-test sketch after this list).
- Track both CTR and CVR. Don't let curiosity clicks flatter the numbers; verify conversion parity in your analytics.
- Tune copy for warmth and specificity. Simple benefits, everyday language, and a clear next step.
- Pretest perception. Quick panels or internal reviews: "Does this look AI?" If yes, adjust.
- Bank winning patterns. Faces, natural color, mild imperfection, and grounded headlines.
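A sibling test only tells you something if the CTR gap clears noise. Below is a minimal sketch of a two-proportion z-test on one matched AI/human pair, using only the standard library. The counts are hypothetical; in practice you would run this per pair and correct for multiple comparisons across pairs.

```python
import math

def two_proportion_ztest(clicks_a: int, imps_a: int,
                         clicks_b: int, imps_b: int) -> tuple[float, float]:
    """Two-sided z-test for a difference in CTR between two ad variants.

    Returns (z statistic, p-value). Uses the pooled-proportion standard
    error, which is adequate when both variants have plenty of clicks.
    """
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal: P(|Z| > |z|) = erfc(|z|/sqrt(2)).
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical sibling pair from one advertiser/campaign/day.
z, p = two_proportion_ztest(clicks_a=76, imps_a=10_000,
                            clicks_b=65, imps_b=10_000)
print(f"z = {z:.2f}, p = {p:.2f}")  # z ≈ 0.93, p ≈ 0.35: gap could be noise
```

At this volume a 0.76% vs. 0.65% CTR split is not distinguishable from chance, which is consistent with the study's "statistically equivalent" sibling finding; larger samples are what let a real difference surface.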
Category patterns worth noting
Food and drink and personal finance showed strong early results with AI creative. Education was more muted. Industry context matters, so test locally rather than importing rules.
Tools that shaped outcomes
Ads made with Taboola's GenAI Ad Maker included prominent faces more often than human-made work, reflecting Creative Shop best practices. Taboola also reported Predictive Audiences delivering large conversion lifts for early adopters, and an AI assistant (Abby) that speeds campaign setup.
Across the ad tech stack, platforms continue adding creative analytics and AI tooling. Features that break down performance by creative element help you iterate faster without guesswork.
What this means for your process
- Use AI to scale variants, not to crank out sterile visuals.
- Bake in "human signals" by default: faces, micro-imperfections, real texture.
- Set a perception gate before launch: if it looks synthetic, it likely performs worse (a minimal gate sketch follows this list).
- Measure quality down-funnel. Keep CTR honest with conversion checks.
- Document what "looks human" for your brand and make it a checklist.
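One way to operationalize the perception gate: poll a small panel with the "Does this look AI?" question from the pretest step and block the creative if too many reviewers flag it. The panel size and 30% threshold below are assumptions for illustration, not values from the study.

```python
# Minimal perception-gate sketch. The reviewer poll and the 30% threshold
# are assumptions, not prescriptions from the study.

def passes_perception_gate(looks_ai_votes: int, panel_size: int,
                           max_ai_share: float = 0.30) -> bool:
    """Return True if the ad clears the 'does this look AI?' review."""
    return (looks_ai_votes / panel_size) <= max_ai_share

# Example: 4 of 10 reviewers flag the creative as AI-looking -> blocked.
if not passes_perception_gate(looks_ai_votes=4, panel_size=10):
    print("Blocked: creative reads as synthetic. Soften polish, add a face.")
```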
Bottom line
AI creative can pull its weight. The winning ads don't hide that they were made with AI; they just don't look artificial. Keep the human cues visible, ship matched tests, and let the metrics decide.
Further reading
Columbia Business School
AI tools for copywriting: practical picks