UnbAIsed: Giving AI Better Memories to Break Gendered Stereotypes

UnbAIsed is a gender-neutral AI image library that breaks tired nurse/CEO stereotypes. Use and contribute neutral visuals so teams build fresher, fairer work.

Categorized in: AI News Creatives
Published on: Mar 10, 2026

UnbAIsed: Fixing gender bias in AI images so creatives can build better worlds

Ask an image model for a "nurse" and you'll often get women. Ask for a "CEO" and you'll often get men. That's not creativity. That's yesterday on repeat.

21N78E Creative Labs launched UnbAIsed to challenge that loop. It's a gender-neutral AI image library built to give creators a cleaner starting point, and to give models better examples to learn from.

The quiet bias hiding in your prompts

AI isn't neutral. It learns from the internet, and the internet reflects history, bias included. That's why stereotypes show up in outputs, even when you don't ask for them.

For agencies and studios, this matters. The images we choose don't just tell stories; they set expectations. If our tools keep recycling old roles, our ideas hit the same ceiling.

What UnbAIsed is building

At the center is a growing, gender-neutral image library: diverse people across professions, roles, and everyday moments, clearly tagged and free to use. It's open by design, inviting students, creators, and brands to contribute their own neutral, bias-aware images.

The aim is simple: provide alternative visual references that nudge models away from stereotypes. Over time, better examples can lead to better defaults.

Why this matters for creative teams

  • Pitch decks and moodboards stop defaulting to clichés.
  • Casting, storyboards, and social visuals reflect your audience without tokenism.
  • Brand safety improves when your visuals don't reinforce dated roles.
  • Your team ships work that's fresh, accurate, and culturally aware.

How it's built

UnbAIsed uses Gemini for the architecture and open-source generation models to seed the library: a starting point for a cleaner, more diverse dataset. The long game: give future models better "memories" than the messy crawls they were raised on.

How to contribute (quick guide)

  • Generate images without gendered terms. Use "person," "professional," "leader," "doctor," "engineer," etc.
  • Mix age, body type, skin tone, and settings so a single "default" never takes over.
  • Avoid gender-coded cues in prompts (e.g., "strong, assertive CEO in a tailored men's suit").
  • Tag consistently: role, activity, environment, and neutrality (e.g., "role: engineer," "gender: neutral").
  • Upload with transparent usage notes. Keep logos and real identities out unless cleared.
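The checklist above can be sketched as a small validator. The record schema and field names here are illustrative assumptions, not UnbAIsed's actual metadata format or API:

```python
# Sketch of checking a candidate library entry against the contribution
# guide: required tags present, neutrality tagged, no gendered prompt terms.
# The schema ("role", "activity", "environment", "gender") is hypothetical.

GENDERED_TERMS = {"man", "woman", "male", "female", "he", "she", "his", "her"}

def validate_entry(entry: dict) -> list[str]:
    """Return a list of problems with a candidate entry (empty = passes)."""
    problems = []
    for field in ("role", "activity", "environment", "gender"):
        if field not in entry:
            problems.append(f"missing tag: {field}")
    if entry.get("gender") != "neutral":
        problems.append("gender tag must be 'neutral'")
    leaked = set(entry.get("prompt", "").lower().split()) & GENDERED_TERMS
    if leaked:
        problems.append(f"gendered terms in prompt: {sorted(leaked)}")
    return problems

entry = {
    "prompt": "a person leading a team meeting",
    "role": "engineer",
    "activity": "presenting",
    "environment": "office",
    "gender": "neutral",
}
print(validate_entry(entry))  # prints [] when the entry passes the checklist
```

A word-level check like this is deliberately crude; it won't catch gender-coded styling cues ("tailored men's suit"), so a human review step still matters.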

Practical prompt and tagging ideas

  • Job roles: "a person leading a team meeting," "a professional presenting quarterly results," "a healthcare worker checking patient charts."
  • Visual balance: vary lighting, camera angles, and wardrobe so "leadership" and "care" aren't tied to a single look.
  • Tags to include: role, setting, emotion, action, neutrality, diversity attributes (without stereotyping).
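One way to keep "leadership" and "care" from collapsing into a single look is to generate prompts combinatorially. A minimal sketch, using the article's own role phrasings; the lighting and angle lists are example vocabulary, not an official one:

```python
import itertools
import random

# Mix roles, lighting, and camera angles so no single visual "default"
# dominates the generated set. Word lists beyond the roles are assumptions.

ROLES = [
    "a person leading a team meeting",
    "a professional presenting quarterly results",
    "a healthcare worker checking patient charts",
]
LIGHTING = ["soft window light", "overhead office lighting", "golden-hour light"]
ANGLES = ["eye-level shot", "slight low angle", "over-the-shoulder view"]

def neutral_prompts(n: int, seed: int = 0) -> list[str]:
    """Return n varied, gender-neutral prompts, reproducible via seed."""
    combos = list(itertools.product(ROLES, LIGHTING, ANGLES))
    random.Random(seed).shuffle(combos)
    return [f"{role}, {light}, {angle}" for role, light, angle in combos[:n]]

for prompt in neutral_prompts(3):
    print(prompt)
```

Seeding the shuffle keeps a batch reproducible, which helps when you later tag and audit what was generated.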

Voices from 21N78E Creative Labs

Neeraj Rajeev, senior copywriter: "Even though we were aware of the bias, our understanding of it took a sharp turn when we saw its extent in image generation. That's when we realised the impact of what we had thought of was far wider. From that moment onwards, it became our sole mission to kickstart this initiative."

Sudhir Nair, founder and CEO: "At 21N78E, we've always believed that technology should be a mirror of our progress, not our prejudices. AI is an incredible tool, but it lacks the lived experience to know when it's repeating an old mistake. With UnbAIsed, we aren't just building a library; we're attempting to give AI a better set of memories to learn from. It's our way of ensuring that the digital future remains as diverse and nuanced as the real world we live in."

Viren Mahendra, national creative director: "As creators, we use images to build worlds, but if our AI tools only show us a world of the past, we're limited in what we can imagine for the future. UnbAIsed is our way of adding more inclusive, honest colors to that digital palette. It's about ensuring that when we look into the AI mirror, we see a reflection that is as diverse and nuanced as the reality we live in every day."

Nikhil Shahane, COO: "The challenge with generative AI isn't just the output; it's the data loops that reinforce it. By building UnbAIsed, we're moving from passive users to active contributors in the model-training ecosystem. We've utilised Gemini for the architecture and OSS generation models to seed the library, but the goal is to create a cleaner, more diverse dataset that 'un-teaches' the systemic biases found in older, unrefined crawls. It's about leveraging the right tech stack to ensure the visual intelligence of tomorrow is built on a more accurate representation of today."

How creatives can use it today

  • Replace stock stand-ins with neutral images from the library for briefs and pitch decks.
  • Build internal style guides that define "neutral defaults" for common roles.
  • Run a quick bias audit: test your top 10 prompts and swap in neutral variants.
  • Share a small, curated pack with partners so your whole pipeline stays consistent.
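The "quick bias audit" above can start as a simple scan of your most-used prompts. This sketch flags gendered phrasings and suggests neutral swaps; the swap table is an illustrative assumption you'd extend for your own prompt library:

```python
# Minimal bias-audit sketch: scan frequently used prompts for gendered
# terms and propose neutral rewrites. The swap table is an example only.

NEUTRAL_SWAPS = {
    "businessman": "business professional",
    "businesswoman": "business professional",
    "chairman": "chairperson",
    "salesman": "salesperson",
    "saleswoman": "salesperson",
    "female nurse": "nurse",
    "male nurse": "nurse",
}

def audit(prompts: list[str]) -> dict[str, str]:
    """Map each flagged prompt to a suggested neutral rewrite."""
    report = {}
    for prompt in prompts:
        fixed = prompt.lower()
        for gendered, neutral in NEUTRAL_SWAPS.items():
            fixed = fixed.replace(gendered, neutral)
        if fixed != prompt.lower():
            report[prompt] = fixed
    return report

top_prompts = ["A businessman reviewing a contract", "A nurse checking charts"]
print(audit(top_prompts))
```

Run it over your top ten prompts, swap in the neutral variants, and regenerate; comparing the two output sets makes the model's defaults visible in minutes.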


Bias hides in defaults. UnbAIsed gives you better ones. Use it, contribute to it, and push your work-and the industry-forward.

