This article is based on reporting originally published by The Guardian on 3 June 2025, written by Eamonn Forde.
Amid mounting concerns over the integrity of the digital music economy, Spotify, Apple Music, and other major platforms are facing criticism over a surge in streaming fraud, a crisis increasingly impacting independent artists. As AI-generated music, bot farms, and opaque moderation systems grow more sophisticated, legitimate musicians are being caught in the crossfire, raising fundamental questions about fairness, accountability, and the future of streaming.
The issue, first spotlighted by industry insiders and artists themselves, involves a shadow economy of fake tracks – often generated by AI – being uploaded en masse to streaming services. According to Deezer, more than 20,000 fully AI-created tracks were added to its platform daily in April 2025 — a near twofold increase since January. These fake songs are then propped up by bots or low-wage click-farm workers in schemes designed to game royalty systems and siphon funds intended for legitimate creators.
Compounding the issue is a new, subtler trend: fraudulent uploaders placing fake songs directly onto real artists’ profiles, allowing them to harvest the associated royalties while confusing fans and triggering automated takedowns of actual content.
In statements issued in response to rising scrutiny, Spotify claims it devotes “significant engineering resources and research” to tackling artificial streaming, while Apple Music says that “less than 1% of all streams are manipulated.” But in a global streaming market valued at $20.4 billion (according to the IFPI), even marginal percentages equate to hundreds of millions of dollars at stake — much of it potentially misdirected.
The fallout is already being felt. Artists whose music sees sudden, unexplained spikes, whether due to viral success or AI manipulation, are finding their work removed, often without notice or clear explanation.
Darren Hemmings, founder of music marketing agency Motive Unknown and a musician himself, experienced this firsthand. After one of his tracks jumped from a handful of daily plays to over 1,000, his distributor flagged it for manipulation. “I wouldn’t blame them for drawing that conclusion,” Hemmings said, “but it’s very judge, jury, executioner. I didn’t do anything — and I couldn’t figure out who did.”
The same happened to Northern Irish rock band Final Thirteen, who suspect a BBC Radio 1 play led to a spike in listens. But instead of celebrating a breakthrough moment, they were met with takedowns. “[Distributors] take it down and that’s it,” said drummer Doobes. “It’s really hard for any artist to prove that they didn’t [manipulate streams], but it’s even harder for Spotify to prove that they did.”
Similar incidents have impacted artists like Adam J Morgan, known as Naked & Baked, who had a track taken down after hitting 10,000 streams in a week — likely due to exposure via TikTok. His distributor, RouteNote, flagged it, but Spotify said it had no issues. RouteNote did not respond to interview requests.
Takedowns can derail entire release campaigns. For Matthew Whiteside, artistic director of The Night With…, an experimental classical label, three albums were pulled after being added, without his knowledge, to suspicious playlists. Resubmitting them cost $40 per album, with no guarantee the same wouldn’t happen again. “Streaming in general is geared against the smaller and the niche,” he said.
While Deezer claims to be leading in fraud detection through manual verification and pattern analysis, other services rely heavily on automation — leaving little recourse for those wrongly flagged. “With streaming services, it’s almost impossible to [appeal] through them,” said Levina, a pop artist and chair of the Featured Artists Coalition’s artist council. Her music was removed after she was mistaken for another artist with the same name. “You fill out a form, but it leaves you quite powerless.”
In an attempt to restore fairness, the Featured Artists Coalition is developing minimum standards for distributors, including a traffic-light system for flagging suspicious activity, giving artists a chance to respond before punitive action is taken.
Behind the scenes, platforms and distributors now admit that the battle is about managing fraud, not eradicating it. Darren Owen, COO of distribution company Fuga, says identifying fraud has become 50% of his daily workload. His team uses AI to spot “non-human listening patterns” and assigns “severity scores” to streaming behavior. “You’re not going to listen to the same song at the same time across multiple devices,” Owen noted, pointing to hotspots like India, Vietnam, Thailand, and parts of Eastern Europe.
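Fuga's actual detection system is proprietary, but the kind of signal Owen describes (one account apparently playing the same song at the same time on several devices) can be illustrated with a toy heuristic. The sketch below is purely hypothetical: the function, field names, and threshold are invented for illustration and do not reflect how any platform actually scores streams.

```python
from collections import defaultdict

def severity_score(events):
    """Toy heuristic for 'non-human listening patterns'.

    `events` is a list of (account_id, track_id, minute, device_id)
    tuples. Returns a dict mapping each account to a crude severity
    score: the largest number of distinct devices seen playing the
    same track in the same minute. A human listener scores 1; higher
    scores suggest bot-like simultaneous playback and would warrant
    manual review rather than an automatic takedown.
    """
    # Group device IDs by (account, track, minute).
    devices = defaultdict(set)
    for account, track, minute, device in events:
        devices[(account, track, minute)].add(device)

    # An account's score is its worst (most simultaneous) moment.
    scores = defaultdict(int)
    for (account, _track, _minute), devs in devices.items():
        scores[account] = max(scores[account], len(devs))
    return dict(scores)

# A legitimate listener vs. a bot-like account:
events = [
    ("fan", "song_a", 0, "phone"),
    ("fan", "song_a", 4, "phone"),
    ("bot", "song_a", 0, "d1"),
    ("bot", "song_a", 0, "d2"),
    ("bot", "song_a", 0, "d3"),
    ("bot", "song_a", 0, "d4"),
]
print(severity_score(events))  # {'fan': 1, 'bot': 4}
```

Real systems weigh many more signals (geography, play duration, playlist provenance), but the principle is the same: score the behaviour, then decide how aggressively to act on the score.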
And as tactics evolve, fraudsters are now boosting thousands of tracks by small margins to avoid detection – a strategy that could widen the divide between major label artists and independent musicians.
For many in the indie world, the frustration is leading to a hard reassessment of the streaming model altogether. “This could provoke a conclusion among large swathes of the independent music community that they’re just better off focusing on other ways to make money,” said Hemmings.
Whether that future involves Bandcamp, direct-to-fan models, or something entirely new, one thing is clear: trust in the current streaming infrastructure is eroding, and creators are demanding more transparency, more control, and a seat at the table in defining what comes next.