Musicians Voice Frustration Over AI Clones

Summary
– AI-generated music clones are proliferating, targeting artists from Beyoncé to King Gizzard, causing anger and resignation within the music industry.
– Streaming platforms like Spotify are struggling to contain the problem, with services like Deezer ingesting tens of thousands of AI tracks daily due to lax distribution screening.
– AI-generated songs are achieving misleading chart success, as seen with “Breaking Rust” and “Solomon Ray,” sparking backlash over their lack of human artistry and spirit.
– The situation has led to exploitation attempts, such as a producer falsely claiming an AI track was by Jorja Smith, prompting legal demands for royalties.
– Industry responses are divided, with some labels warming to AI, while artists and allies like iHeartRadio advocate for human creativity and push for regulatory action to protect musicians.
The music industry is facing a relentless new challenge as artificial intelligence clones and fake songs flood streaming platforms. While AI-generated impersonations are not entirely new, the scale and brazenness of these incursions have escalated dramatically, leaving artists feeling both furious and helpless. From global superstars to niche experimental composers, no one seems immune. The situation highlights a critical vulnerability in the digital music ecosystem, where distribution pipelines and platform policies struggle to keep pace with rapidly evolving synthetic media.
The problem has intensified over the last two years, moving beyond isolated novelties like the AI Drake tracks of 2023. Recently, artists ranging from Beyoncé to ambient composer William Basinski have discovered fraudulent songs, likely AI-generated, listed under their names on major services. This week, the psychedelic rock band King Gizzard and the Lizard Wizard became the latest target, with frontman Stu Mackenzie expressing a mix of anger and grim resignation. Platforms are attempting to respond; Spotify has formalized a policy against impersonation and removed tens of millions of spam tracks. However, the sheer volume is staggering. Competing service Deezer reports that over 50,000 AI-generated tracks are uploaded to its library daily, accounting for more than a third of all new music it processes.
A significant loophole exists because music is typically uploaded through third-party distribution services like DistroKid, not directly to streaming platforms. It remains unclear what verification, if any, these distributors perform to confirm an uploader’s identity. This gap allowed a seemingly AI-generated reggaeton track to appear on the Spotify page of William Basinski, an artist known for ethereal soundscapes. Basinski dismissed the incident as “total bullshit,” relying on his label to monitor such “idiocies.” Other artists have expressed similar dismay. Luke Temple of the dormant band Here We Go Magic saw his group’s Spotify page reactivated by AI impostors, calling the fake track “so awful.” When a song called “Name This Night” appeared on legendary band Toto’s page, guitarist Steve Lukather condemned it as “shameless.”
While some fakes may not use AI, the technology makes producing convincing forgeries faster and easier than ever. Tools like Suno, though designed to ignore prompts for specific artists, can still generate complete songs from minimal input, lowering the barrier for bad actors. The issue isn’t limited to direct impersonation. Artist Blanco Brown accused the creators behind the track “Breaking Rust” of using AI to model his distinctive vocal style, calling it an attempt to create a “white version” of him. His manager emphasized that AI cannot replicate a lifetime of human experience, emotion, and artistic conviction.
“Breaking Rust” gained notoriety by reaching the top of Billboard’s Country Digital Song Sales chart, sparking misleading headlines about AI dominating country music. However, that chart measures iTunes purchases, a market where very few sales are needed to chart highly; the song reached number one with roughly 3,000 purchases, suggesting possible manipulation. Another AI act, a gospel creation called Solomon Ray, also found chart success, prompting backlash from the Christian music community. Critics argued that AI lacks soul and spirit, with one artist stating it “does not have the Holy Spirit inside of it.” A real singer who shares the name Solomon Ray expressed concern, questioning how much heart can be in music generated without human input.
Some opportunists are attempting to capitalize on the controversy itself. A producer known as Haven went viral by implying a track with AI-manipulated vocals was an unreleased song by Jorja Smith. After the fake was removed from streaming services, the producer tried to monetize the attention by rerecording the song and even approaching Smith for a remix. In response, Smith and her label are now pursuing royalties, stating that creators have become “collateral damage” in the race for AI dominance.
Organized labor is taking a firm stand. The United Musicians and Allied Workers union has labeled AI music “exploitation,” arguing that AI gives platforms and labels a tool to cut human artists and their royalties out of the equation entirely. Union organizer Joey La Neve DeFrancesco points to existing deals between major labels and AI companies as evidence of this troubling shift. In contrast, some industry players are siding with artists. iHeartRadio has publicly pledged to never play AI-generated music with synthetic vocalists pretending to be human, nor use AI-generated on-air personalities, declaring it is “on the side of humans.”
Even artists who embrace AI technology are sounding alarms. Holly Herndon, who has used AI extensively in her own work, has warned fellow musicians about exploitation. When she raised concerns about training data and artists’ rights, she found many AI companies surprisingly dismissive and unprepared for the ethical backlash. This underscores the urgent need for structural solutions. Advocates are pushing for regulations that would force streaming services to clearly identify AI content and exclude it from royalty pools shared with human artists. Legislation like the proposed Living Wage for Musicians Act aims to create a new royalty stream paid directly by platforms, one that would exclusively benefit human creators.
For the moment, the burden of vigilance falls heavily on artists and their fans. In an era where audio can be as easily fabricated as photos and video, a healthy dose of skepticism is becoming essential for anyone navigating the modern music landscape.
(Source: The Verge)
