AI-Generated Songs Appearing Under Deceased Artists on Spotify
The use of artificial intelligence in music has sparked debate, but a recent issue on Spotify has brought new concerns to light. Reports show that AI-generated songs have been uploaded to the official Spotify pages of late artists, including country legends Blaze Foley and Guy Clark. Many see this as both misleading and disrespectful.
Fake AI Tracks Mimicking Iconic Artists
Blaze Foley, a respected country singer-songwriter who died in 1989, had a song titled “Together” uploaded to his official Spotify profile. The AI-created track mimicked elements of Foley’s style, featuring male vocals, piano, and electric guitar. The cover art was also AI-generated and bore no resemblance to Foley.
Craig McDonald, owner of Lost Art Records, which manages Foley’s music rights, called the track “an AI schlock bot” and said it lacked any real connection to the artist. Spotify removed the track, citing violations of its content policies. The platform confirmed it prohibits impersonation intended to mislead, including copying a creator’s name, image, or profile in a deceptive way.
Guy Clark, a Grammy-winning songwriter who passed away in 2016, was similarly affected. An AI-generated song appeared under his name, accompanied by a fabricated image that did not resemble the late artist.
Spotify’s Reaction and User Concerns
Spotify promised to act against licensors and distributors who fail to prevent this type of fraud, warning of permanent bans for repeat offenders. While some praised Spotify’s quick removal of the unauthorized tracks, many users remain uneasy. One Redditor called for a filter to screen out AI-generated content, saying multiple AI songs had appeared in their “Discover Weekly” playlist.
Currently, Spotify has no tagging system for AI-generated music and has not publicly explained how it identifies these tracks. Requests for clarification from Spotify have so far gone unanswered.
Legal and Ethical Questions
This incident highlights broader issues around regulation and transparency in the music industry. Streaming services are not legally required to label AI-generated music, but calls for change are growing.
Sophie Jones, Chief Strategy Officer at the British Phonographic Industry, urged the UK government to protect copyrights and introduce transparency rules for AI companies. She emphasized the need for clear labeling of AI-generated content. This aligns with demands from fans and professionals for stricter standards, especially when AI risks impersonating real artists, whether living or deceased.
What This Means for Creatives
The appearance of unauthorized AI-generated music under the names of late artists exposes a gap in how digital platforms handle synthetic content. Spotify’s response is a start, but the situation points to a need for transparency and possibly regulation.
For those in creative fields, this raises important questions about authenticity and rights. Knowing whether music is human-made or AI-generated matters — not just for fans, but for everyone involved in creating, managing, and distributing music.
As AI tools become more accessible, creatives should stay informed about how these technologies are used and advocate for clear labeling and fair practices. For those interested in learning more about AI and its impact on creative work, exploring educational resources can be valuable. For example, Complete AI Training offers courses that cover AI’s role in various industries, including music and creative arts.