July 1

How AI-Generated Music Hurts Real Artists

AI, Finance, Musicians, Spotify

AI-generated music is probably among the most convincing forms of today's AI slop. And it's in a lot more places than you may realize, whether it's underscoring videos or taking up a spot on your weekly playlist. The thing is: with how streaming companies pay artists, it's not all just fun and games. When this music is released via platforms like Spotify, it directly hurts real artists.

A New "Breakthrough Artist" Is (Probably) AI

If you haven't heard by now, a probably AI-generated band has racked up more than half a million monthly listeners on Spotify. Dubbed "The Velvet Sundown," everything about the project screams AI. Aside from a series of tweets from the otherwise anonymous band swearing they're real, there's nothing indicating The Velvet Sundown actually exists. 

For starters, their images on Spotify have all the signs of AI generation, from nondescript features to a slightly cartoonish sheen. The bio, besides reading like it was written by a fourth grader using a thesaurus, checks out as 85 percent AI-generated (at least according to, well, AI). 

Neither of those things in and of itself means the entire band is fake, of course, but diving into the actual music reveals more. Producer and YouTuber Rick Beato made a video where he pulled some of the stems from the songs and identified the types of artifacts that you hear almost exclusively in AI-generated music. Watch the video yourself if you want a more in-depth explanation. Music Ally has also done some great analysis, most recently with a tool reporting that at least 10 of the band's 13 most recent tracks come back as AI-generated with 100 percent confidence, two more with 98 percent confidence, and the last with 76 percent confidence. 

And then there's the songs themselves. The Velvet Sundown's most-streamed song as of right now is called "Dust on the Wind," and if you're saying to yourself, "Hey that title sounds strikingly similar to the incredibly popular classic rock hit 'Dust in the Wind,'" well, just wait until you actually hear the song.

Actually, don't go listen to it. Take our word for it that it's a very clear knockoff of that tune, one most real artists would be embarrassed to claim. This brand new band is also astonishingly prolific: their third full album is slated for release on July 14th, which will make three full albums released since June 5th, 2025. 

Spotify's Take On AI-Generated Music

Here's the thing, though. Even if The Velvet Sundown is a fully AI-generated project, whoever created it (and is profiting off it) isn't really breaking Spotify's terms of service. In fact, Spotify is supposedly bloated with AI-generated music, and some say it's ruining the platform. 

Spotify technically prohibits AI music that imitates other artists, meaning AI versions of famous people "singing" songs they've never actually recorded. But what if an AI act just sounds very similar to another artist without claiming to be them? That's where it gets trickier. 

And so far, Spotify hasn't done much to address the concerns, especially when you consider how aggressively the platform has battled bot streams in the past few years. Spotify can claim that AI-generated music accounts for only a small share of streams overall, but there are still millions and millions of streams going to AI music. The Velvet Sundown isn't even the most popular (probably) completely fake artist. 

How AI-Generated Music Directly Hurts Artists

So why exactly does this matter? Because thanks to the way streaming services pay artists, every stream that goes to a fake song ultimately dilutes the revenue pool for real artists. 

Yep: all rights holders split money from a single pool of revenue, currently around 70 percent of what Spotify earns from subscriptions, advertisements, and so on. So, technically, the more total streams happen on Spotify, the less each individual stream is worth, and the less some artists may get paid, assuming the revenue pool is not also growing. 
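The pro-rata math is simple enough to sketch. Here's a minimal illustration, using made-up numbers (Spotify doesn't publish exact figures, and real payouts involve label and distributor splits this ignores):

```python
def pro_rata_payout(pool: float, streams_by_artist: dict[str, int]) -> dict[str, float]:
    """Split a fixed revenue pool across artists by their share of total streams."""
    total_streams = sum(streams_by_artist.values())
    return {artist: pool * n / total_streams for artist, n in streams_by_artist.items()}

# Hypothetical: a $1M pool (roughly the ~70% of subscription/ad revenue paid out).
pool = 1_000_000.0

before = pro_rata_payout(pool, {"real_artist": 900_000, "everyone_else": 8_100_000})
after = pro_rata_payout(pool, {"real_artist": 900_000, "everyone_else": 8_100_000,
                               "ai_act": 1_000_000})

# The real artist's payout shrinks even though their own stream count didn't change:
# their share of total streams fell from 10% to 9%.
print(before["real_artist"])  # 100000.0
print(after["real_artist"])   # 90000.0
```

The key point the sketch shows: because the pool is fixed, an AI act doesn't need to take listeners away from anyone to take money away from everyone.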

And that's how an AI artist, which almost certainly does not drive any additional revenue towards Spotify, ultimately dilutes the pool of money going to real artists. Every single stream of The Velvet Sundown takes a little money out of the pockets of every other artist. Sure, it's microscopic in this singular instance, but when you zoom out and see millions of streams going to fake artists, you start to see how it's a much bigger problem. And with no guardrails in place currently, there's nothing really stopping the problem from proliferating. 

How Music Could Be The Main Medium To Dismantle Generative AI

As we've discussed in the past, the law is incredibly muddy right now around generative AI. Two recent rulings also complicate things even further. Some lawsuits have been dismissed, others are still early in the journey. And in the meantime, Disney and Universal have filed a lawsuit against artificial intelligence company Midjourney, presenting perhaps the biggest legal challenge yet.

But while we have very little idea where this is going legally, one of the key issues raised so far is whether or not generative AI creates a competitive disadvantage for the copyright holders whose work it's trained on. 

While it may be hard for a fiction author to prove that ChatGPT puts them at a competitive disadvantage, it's much easier for artists to draw a direct line between AI-generated music and money out of their pockets. Not to mention the fact that AI-generated music also competes for ears through things like algorithmically generated playlists and radio stations. You could make a similar case for platforms like TikTok, which pay creators from a pool of revenue.

Judges have yet to rule broadly on whether the way companies trained their products — by effectively stealing content — was legal. But in at least one instance they've indicated that training AI models on copyrighted content is not inherently illegal. That means it will fall to music rights holders to step forward and show how artificial music puts them at a competitive disadvantage, and hopefully to start putting guardrails on people trying to make a quick buck with AI slop.

