Chart-topping AI country act Breaking Rust hit with twin rip-off claims after brief Spotify takedown

Breaking Rust scored a hit, then got accused by two artists of stealing voice, style, and image. It's a preview of messy AI disputes: one release can copy many artists at once, and the proof lives in a gray zone.

Categorized in: AI News, Creatives
Published on: Dec 05, 2025

Two artists say AI act Breaking Rust ripped them off. They won't be the last.

Breaking Rust, an AI-generated "outlaw country" act, landed a semi-hit with Walk My Walk: 7 million+ Spotify streams and a No. 1 on Billboard's Country Digital Song Sales chart. Then the track briefly vanished from Spotify after an impersonation claim, before getting reinstated.

Two artists say the track crosses a line: one points to voice and style, the other to branding and image. The case exposes a bigger problem for creatives: AI releases can mimic multiple artists at once, and proving who was copied isn't straightforward.

1) Bryan Elijah Smith: "stealing elements of my music, my style, and even my image"

Independent country artist Bryan Elijah Smith says Breaking Rust (and other AI acts) copied niche genre language, branding, and visuals he's built over 17 years. He also alleges these acts present fully AI-generated songs as original work.

Smith says he filed a rights/impersonation claim against Walk My Walk, which he believes triggered its brief removal. Spotify's platform rules allow action against impersonation and misleading content; reported tracks can be removed and later reinstated after review or appeal. See Spotify's Platform Rules.

His broader point: anonymous AI creators can imitate real artists, flood release pipelines, and pull from the same limited revenue pool, all without disclosure or accountability. He's pushing for full account removals, not just takedowns.

2) Blanco Brown says the song mimics his sound, and responds with a cover

Grammy-nominated artist Blanco Brown says he learned about the track from fans who heard his vocal style in it. His manager says Brown responded by covering the AI track to show the difference between lived experience and a generated formula.

Brown also released a reworked version with new lyrics and arrangement: "If someone is going to sing like me, it should be me." For him, this is cultural as well as legal: he's spent years crossing genres and still faces barriers on country radio, so imitation without credit or consent hits harder.

3) The credited songwriter link raises new questions

All 10 tracks on Breaking Rust's Spotify page, including Walk My Walk, credit Aubierre Rivaldo Taylor as songwriter. Reporting has tied Taylor to Defbeatsai, a viral AI "X-rated" country project, and further connected that to a former Blanco Brown collaborator, Abraham Abushmais.

Brown says no one told him about any link between the AI track and a past collaborator, and he hasn't been able to reach him. The chain highlights how fast AI projects can spin up, change names, and move beyond traditional accountability.

This isn't isolated-and that's the real issue

We're seeing more cases. A viral dance track was allegedly made using a label artist's voice via AI, leading to removals and calls for mandatory labeling. Another major artist flagged an AI act with a near-identical name and uncanny sonic overlap.

What's unique here: two different artists claim the same AI act ripped them off in different ways, voice for one, brand and identity for the other. Expect more multi-artist disputes per track, murky training data trails, and repeated takedown ping-pong.

Want to track mainstream reporting on AI music disputes? Check the Associated Press for ongoing coverage: AP News.

Why this matters for creatives

  • Style theft scales with AI. One prompt can approximate your voice, tone, brand language, and visual cues.
  • Proof is hard. Even if a track sounds like you, showing you were the training source can be tricky.
  • Platforms act, then reverse. Content can be removed, appealed, and reinstated. Process takes time.
  • Algorithms don't care. If AI tracks fit the right tags and vibes, they siphon attention and payouts.

What to do now (practical steps)

  • Lock your identity: register trademarks for your artist name, key marks, and unique brand phrases where possible.
  • Watermark and document: keep dated session files, stems, and drafts. Consider subtle audio watermarks for signature elements.
  • Publish clear policies: add a public statement banning unauthorized AI training or vocal cloning of your work.
  • Monitor and move fast: set alerts for your name, song titles, and signature phrases. Save evidence (links, uploads, metadata, screenshots).
  • Use platform tools: file impersonation or takedown requests with specific timestamps and comparisons. Reference platform rules when you submit.
  • Own your voice: if an AI track is gaining steam off your likeness, consider a sanctioned "real" version to reclaim narrative and discovery.
  • Label your use of AI: if you use AI in your own process, disclose it. Transparency builds trust and inoculates against future disputes.
  • Stay educated: laws and platform policies are changing. If you work with teams, align on a standard response playbook.
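The "monitor and move fast" step can be partially automated. Below is a minimal Python sketch of the idea, using only the standard library: flag catalog entries whose artist name is suspiciously close to yours but not an exact match. The catalog data here is made up for illustration; a real monitor would pull fresh results from a platform search API on a schedule and feed them into the same check.

```python
# Sketch: flag artist names that nearly match yours (possible impersonation).
# The catalog below is hypothetical; in practice you would populate it from
# a streaming platform's search results on a recurring schedule.
from difflib import SequenceMatcher


def similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio between two names (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def flag_lookalikes(my_name, tracks, threshold=0.8):
    """Return tracks whose artist name is at least `threshold` similar to
    `my_name` without being an exact match: candidates worth reviewing."""
    flagged = []
    for t in tracks:
        score = similarity(my_name, t["artist"])
        if score >= threshold and t["artist"].lower() != my_name.lower():
            flagged.append({**t, "score": round(score, 2)})
    # Highest-similarity candidates first.
    return sorted(flagged, key=lambda t: t["score"], reverse=True)


# Made-up catalog entries for demonstration:
catalog = [
    {"artist": "Breaking Rust", "track": "Walk My Walk"},
    {"artist": "Breaking Rus", "track": "Ride On"},
    {"artist": "Totally Different", "track": "Other Song"},
]

print(flag_lookalikes("Breaking Rust", catalog))
```

The same pattern extends to song titles and signature phrases; the useful part is keeping a dated log of every flagged hit, which doubles as the evidence trail recommended above.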

Want structured learning on AI for your craft?

Explore practical training made for different roles and workflows: AI courses by job.

The takeaway

AI music isn't a fringe experiment anymore; it's competing for charts, streams, and your audience. The Breaking Rust situation shows how fast things get messy when style, voice, and identity blur.

Protect your name, build receipts, and be ready to move. The artists who combine strong brand control with smart AI literacy will keep their edge as these cases pile up.

