Instagram's chief says AI images look real now. Creatives: here's how to keep trust
Instagram chief Adam Mosseri acknowledged what many of us feel in our feeds: it's getting hard to tell real photos from AI. He appreciates the creativity, even the playful "AI slop" trend, but warned that we need watermark-like tags so people know what they're looking at.
If you create visuals for clients or an audience, this isn't a small tweak. It's about trust, disclosure, and staying ahead of policy shifts that are coming to social platforms and brand contracts.
What Mosseri is pushing for
Clear, persistent signals that an image is synthetic: think watermark-like tags or embedded content credentials that travel with the file. The goal is simple: give viewers context without killing creativity. That means platform-level labels plus creator-side disclosure.
Practical steps you can take right now
- Label your work on-platform. Add "AI-generated" or "Composite (AI + photo)" in captions and alt text. Keep it short and consistent across posts and reels.
- Embed provenance. Export assets with C2PA content credentials so edits, sources, and tools stay traceable beyond Instagram (see the manifest sketch after this list).
- Keep your source files. Save prompts, models, seeds, and reference photos. Clients will start asking for proof of process; treat it like a receipt (a simple sidecar record like the one sketched below works).
- Use mixed media wisely. Blend real photos or video with AI to keep texture and believability. Small real-world details reduce the "plastic" look.
- Get consent and likeness rights in writing. If your piece resembles a real person, secure releases. Deepfake laws and platform rules are tightening.
- Create a "real vs AI" portfolio split. Keep separate galleries on your site and separate highlights on social so buyers can filter quickly.
- Note client preferences in briefs. Add a checkbox: "Permitted: AI generation, Composites, Retouch only." Avoid surprises later.
- Test detection tools, but don't rely on them. Treat them as a hint, not a verdict. Your best defense is clear disclosure and embedded credentials.
- Document behind-the-scenes. Short screen recordings or before/after frames build trust and make for great promo content.
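To make the "embed provenance" step concrete, here is a minimal sketch of the kind of manifest you might hand to a signing tool such as c2patool. The field names, the claim_generator value, and the command in the comments are assumptions based on the c2patool manifest format and may differ by tool version; verify them against the current C2PA and c2patool documentation before relying on them.

```python
import json

# Minimal sketch of a C2PA-style manifest declaring AI-generated content.
# Field names approximate the c2patool manifest format; check the current
# docs before using this in production.
manifest = {
    "claim_generator": "my-studio-pipeline/0.1",  # hypothetical app name
    "assertions": [
        {
            "label": "c2pa.actions",
            "data": {
                "actions": [
                    {
                        "action": "c2pa.created",
                        # IPTC digital source type for AI-generated media
                        "digitalSourceType": (
                            "http://cv.iptc.org/newscodes/digitalsourcetype/"
                            "trainedAlgorithmicMedia"
                        ),
                    }
                ]
            },
        }
    ],
}

with open("manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)

# Signing happens outside this script, e.g. with c2patool
# (command shape is an assumption; confirm with `c2patool --help`):
#   c2patool final.jpg -m manifest.json -o final-signed.jpg
```

The point is not the exact tool: any exporter that attaches content credentials at the end of your pipeline keeps the disclosure attached to the file, not just the caption.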
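For "keep your source files," a lightweight option is a JSON sidecar written next to each export. This is only a sketch under stated assumptions: the function name, fields, and example paths are hypothetical placeholders, not any platform's required format.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Hash a file so the record proves which exact asset it describes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def write_process_record(asset: Path, prompt: str, model: str, seed: int,
                         references: list[Path]) -> Path:
    """Write a JSON sidecar next to the exported asset (fields are illustrative)."""
    record = {
        "asset": asset.name,
        "asset_sha256": sha256_of(asset),
        "created_utc": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "model": model,
        "seed": seed,
        "references": [{"file": r.name, "sha256": sha256_of(r)} for r in references],
    }
    sidecar = asset.with_suffix(asset.suffix + ".process.json")
    sidecar.write_text(json.dumps(record, indent=2))
    return sidecar


# Example call (paths and values are hypothetical):
# write_process_record(Path("final.jpg"), "moody product shot, window light",
#                      "example-image-model-v3", 421337,
#                      [Path("reference-photo.jpg")])
```

A record like this costs seconds per delivery and doubles as the "receipt" clients will increasingly ask for.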
What to expect next
More automatic labels on social, stronger default settings for AI tags, and wider adoption of open standards like C2PA. Brands will add disclosure clauses to contracts, and repeat offenders who avoid labeling will see reach or monetization take a hit.
How this shifts your creative workflow
- Plan for disclosure up front. Treat labels and credentials as part of the export process, not an afterthought.
- Price for transparency. Quote for ideation, generation, cleanup, and credentialing. Line items make your value obvious.
- Design with authenticity cues. Real light falloff, believable hands, natural skin, environmental noise: these details matter more as models get cleaner.
Useful resources
- C2PA: Open standard for content credentials
- Content Authenticity Initiative
- Curated AI tools for generative art (Complete AI Training)
Bottom line: creativity isn't the issue; clarity is. Label your work, embed credentials, and make proof of process part of your brand. You'll protect trust now and save yourself rework as platforms roll out stricter rules.