MBW Views is a series of op/eds from eminent music industry people… with something to say. This MBW Views op/ed comes from Ed Newton-Rex (pictured), CEO of the ethical generative AI non-profit, Fairly Trained.
Ed’s opinion piece follows the commotion over the weekend surrounding an update spotted in SoundCloud’s T&Cs requiring users to “agree that [their] Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies”.
SoundCloud has issued a response (which you can read in full over on The Verge) to clarify its policies around AI, stating that the platform “has never used artist content to train AI models, nor do we develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes”.
Over to Ed…
When it emerged last week that SoundCloud had updated their terms of service to permit them to train AI models on users’ music, I was keen to give them the benefit of the doubt.
I’ve been a SoundCloud user for years, and I admire so much about what they’ve built. I wanted to believe this was an honest mistake, some mistranslation as a terms update went into effect.
Unfortunately, even for a platform like SoundCloud that calls itself artist-first, it looks like the allure of a trove of training data is just too great to turn down in the age of generative AI.
The terms update dates all the way back to February 2024, but it went largely unnoticed until now. The important part is this:
“You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services.”
When people started calling them out on this last week, their initial statement in response was couched in artist-friendly language:
“SoundCloud has never used artist content to train AI models”.
“Any future application of AI at SoundCloud will be designed to support human artists”.
This was enough for some. But I thought it was notable for what it didn’t say. It didn’t say they weren’t planning on training generative AI on their users’ music in future.
And in these circumstances, if you weren’t planning on training on your users’ music, you would sure as hell say so.
Thankfully, some pushed them on this. And their response to The Verge laid out what some of us had suspected but not wanted to believe (emphasis mine): “Importantly, no such use has taken place to date, and SoundCloud will introduce robust internal permissioning controls to govern any potential future use.”
In other words, they have not ruled out using their users’ music to train generative AI models in the future – and their terms of service explicitly allow it.
And, to make matters worse, they are already hinting that users will have to opt out of this arrangement, rather than SoundCloud asking for users’ permission: “Should we ever consider using user content to train generative AI models, we would introduce clear opt-out mechanisms in advance—at a minimum”.
This is again worded to try to sound artist-friendly, but it is nothing of the sort. Opt-out mechanisms for generative AI training are hugely unfair to musicians, for a heap of reasons I’ve talked about before.
This is why the vast majority of musicians reject them.
This is already enough to make me delete my SoundCloud, which I’ve done today. A streaming platform should not be exploiting its users’ music to train generative AI models without their explicit permission. This is non-negotiable.
But there are two further reasons I’m particularly worried about SoundCloud’s actions here, which reinforced my decision to take my music down and quit the service.
The first is that they don’t seem to have told users when they made this change to the terms. I’ve trawled my emails, and I can’t find anything. Nor can a bunch of other people I know.
If you’re going to start reserving the right to use people’s music for something totally orthogonal to what they signed up to the service for – particularly something as inflammatory as AI training – you have to tell them. I mean, I suspect legally you have to tell them, and I’m sure questions will be asked here. But irrespective of that, it’s just clearly wrong to do this without telling people.
And the second – and this is the one I think is really bad – is that they’re treating artists without a label worse than those who are signed.
“The TOS explicitly prohibits the use of licensed content, such as music from major labels, for training any AI models, including generative AI,” they say.
“For other types of content uploaded to SoundCloud, the TOS allows for the possibility of AI-related use.”

To SoundCloud, artists without a label are apparently second-class citizens, who don’t deserve the same protections as signed artists. Now, this could well be due to language in their deals with the major labels. And I don’t fault the major labels for getting that language in there, if so.
But why on earth would a platform that is supposed to pride itself on existing to serve all musicians, whose raison d’être surely is to let anyone’s music be heard – why would such a platform treat unsigned artists so poorly, and subject them, and only them, to involuntary AI training on their music?
Much of the music community seems incensed, and it’s easy to see why. This is an awful policy, unfair at its core, and disproportionately so to unsigned artists. I know I’m not the only musician who has deleted their music from SoundCloud in recent days.
There is still time for SoundCloud to rectify this. It’s possible that they don’t really have any plans for generative AI, that these updated terms and their recent statements are a hedge more than anything else, and that when they understand how their users feel, they’ll realise that turning people’s music into training data isn’t worth it.
For me to return, it would take a commitment not to train generative AI on users’ music, and an update to their terms that sets that in stone.
Given their response to the unfolding saga so far, I’m not optimistic. Generative AI seems to have the capacity to make companies forget their purpose and why their users love them.
But I’m also hopeful that there are enough employees at SoundCloud who are repelled by the idea of training on their users’ music without explicit permission, and that their voices will make a difference. You have to hold out hope.