Liza Minnelli joins artist-approved AI album - what it means for creatives
ElevenLabs just released "The Eleven Album," a project built with explicit permission from 13 human artists - including Liza Minnelli, Art Garfunkel and Michael Feinstein. The pitch: artists keep control, blend their sound with AI, and publish work they authorize.
Minnelli put it simply: music is about connection and emotional truth. She signed on because the project used AI as a tool - not a replacement - and respected voice, choice and ownership.
What's actually new here
Consent is front and center. ElevenLabs runs a marketplace where public figures can license AI versions of their voices. Brands and creators must request approval before using those voices in campaigns or creative work.
The company says its music model is trained on licensed music, not scraped data. It also embeds a "sonic fingerprint" - a watermark that identifies a voice as generated by ElevenLabs. And according to the company, creators who make music on the platform keep ownership of their output.
Why this matters right now
Trust is the battleground. Creatives have watched high-profile flare-ups: Scarlett Johansson objecting to ChatGPT's "Sky" voice for sounding like hers, Drake pulling "Taylor Made Freestyle" after the Tupac Shakur estate threatened legal action, and debate around SAG-AFTRA's deal with Replica Studios to license replicated voices for games.
At the same time, big music labels are cutting deals with AI studios. Universal Music Group and Warner Music Group reached licensing agreements with Stability AI and Udio, and major labels also announced AI licensing with Klay. Legal fights are turning into contracts.
Streaming is already flooded with AI
AI acts can rack up real numbers before anyone notices. The Velvet Sundown drew over a million Spotify plays before admitting the project was fully AI-generated. Other suspected AI artists have gathered millions of monthly listeners.
Listeners want clarity. Thousands voted for Spotify to label AI tracks and give users a filter. Spotify has said it's rolling out clearer disclosures in song credits and stronger impersonation rules.
Consent and safety aren't solved - but they're improving
Consumer Reports found that leading voice-cloning tools had safeguards that could be bypassed - often just a checkbox claiming "authorization." ElevenLabs says its pro cloning now requires identity verification and a timed voice prompt match, plus watermarking to tag generated voices. The scrutiny helps: pressure leads to better guardrails.
How to use this moment as a creative
- Get your consent layer in writing. If a tool uses your voice or likeness, insist on written terms: scope, duration, revocation rights, and revenue splits.
- Use platforms with verification and watermarking. Favor tools that require identity checks and embed detectable fingerprints.
- Label your AI involvement. In liner notes and credits, state how AI was used. It builds trust with fans and collaborators.
- Protect your voice prints and stems. Store originals securely; monitor for impersonations and file takedowns fast.
- Price your voice and style. If you license your voice, set clear rates (usage, geography, exclusivity) like you would for sync or session work.
- Keep a human editorial layer. AI can generate options; you make the final call on taste, edits, and release.
- Audit datasets when possible. Ask vendors: what training data is licensed, and how is rights clearance handled?
If you want to experiment without burning your brand
- Start with co-writes, not full automation. Draft melodies, harmonies, or arrangements with AI; keep performance and direction human-led.
- Create a separate project name for heavy AI work. Avoid confusing your core discography; tell fans what they're hearing.
- Use a release checklist: rights cleared, watermark on, disclosures in credits, contracts signed, distribution notes updated.
- Test small audiences first. Share snippets with trusted listeners; ship the versions that pass the vibe test.
- Share upside with collaborators. If an AI-assisted track lands, split fairly with writers, producers, and featured voices.
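If you like working from the release checklist above, it can even live as a small script in your project folder. This is a minimal sketch, not a platform requirement - the item names are illustrative, invented here for the example:

```python
# Minimal sketch of a pre-release checklist for AI-assisted tracks.
# Item names are illustrative placeholders, not an industry standard.
CHECKLIST = [
    "rights_cleared",      # licenses and consent in writing
    "watermark_embedded",  # platform fingerprint on generated audio
    "credits_disclosed",   # AI involvement stated in liner notes
    "contracts_signed",    # splits agreed with collaborators
    "distribution_notes",  # metadata updated for distributors
]

def ready_to_release(status):
    """Return (ok, missing_items) for a track's checklist status dict."""
    missing = [item for item in CHECKLIST if not status.get(item, False)]
    return (not missing, missing)

if __name__ == "__main__":
    track = {item: True for item in CHECKLIST}
    track["credits_disclosed"] = False  # forgot the AI disclosure
    ok, missing = ready_to_release(track)
    print(ok, missing)  # False ['credits_disclosed']
```

The point isn't automation for its own sake; it's that a release only ships when every box is affirmatively checked, with missing items named before upload.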
Artist perspectives
Minnelli joined because the process respected the artist. Feinstein echoed that the tool is only as good as the direction behind it - AI offers options; creators make the choices.
Bottom line
AI isn't replacing creative judgment. It's adding more drafts, faster. The projects that win will pair clear consent and labeling with strong taste and curation.