AI’s exploitation of IP & blockchain verifiability

There are many debates currently raging around AI and the ways it has been used to violate IP laws. This was illustrated very clearly by the recent release of a track featuring the voices of Drake and The Weeknd, the top two artists at the largest record label on earth, Universal Music Group (UMG). The track, "Heart On My Sleeve," created by TikTok creator Ghostwriter977, went viral with 275,000 YouTube plays and 625,000 Spotify plays before being removed. How do we address the use of a voice from an IP perspective? Asking for artists everywhere.

Name Image Likeness plus Voice?

The song used AI-generated vocals emulating the styles of both artists, in direct violation of the NIL rights and IP owned by UMG. One observer, Audiomack’s Head of Revenue, Dave Edwards, suggested on Twitter: “UMG has the toughest copyright team around. You couldn’t pick two artists who are going to provoke a stronger response than [Drake and The Weeknd]. Suspect they’ll drop the hammer on whatever distributor put this on Spotify.”

In the US, NIL (Name, Image, Likeness) laws have been in force for many decades, although they vary from state to state; with the advent of AI, it is time to start looking at ways to add Voice to NIL and/or copyright laws. The question of protecting voices from unlawful use is a complex one. While it is true that voices cannot be copyrighted, there are still certain protections in place to prevent their misuse.

Specific state laws can provide relief for the average person, but these vary greatly across the US. For example, California has passed a law that prohibits the use of deepfakes (AI-generated videos that superimpose one person's face onto another's) during elections to mislead voters. Similarly, Virginia's revenge porn law bans the use of deepfakes in pornography; it received unanimous support in both the State Senate and House.

But what about celebrities? For them, trademarking their voice presents a greater opportunity for protection. Under 15 U.S.C.A. § 1051, a voice could be protected as a trademark across a variety of categories, although it would need to be registered in each category separately. Even then, registration does not wholly prevent someone from using another's voice.

Despite this, celebrities still receive the greatest amount of protection. Misuse of their likeness can be challenged via the tort of the right of publicity, which allows them to control how their likeness is used for commercial gain. While these protections may seem adequate, there are still instances where the unlawful use of someone's voice goes unpunished. This highlights the need for more comprehensive and effective measures to protect people's voices from being misused.

In response, UMG asked for the content to be taken off all DSPs (digital service providers) and issued a statement:

”UMG’s success has been, in part, due to embracing new technology and putting it to work for our artists–as we have been doing with our own innovation around AI for some time already.
With that said, however, the training of generative AI using our artists’ music (which represents both a breach of our agreements and a violation of copyright law) as well as the availability of infringing content created with generative AI on DSPs, begs the question as to which side of history all stakeholders in the music ecosystem want to be on: the side of artists, fans and human creative expression, or on the side of deep fakes, fraud and denying artists their due compensation.
These instances demonstrate why platforms have a fundamental legal and ethical responsibility to prevent the use of their services in ways that harm artists. We’re encouraged by the engagement of our platform partners on these issues–as they recognize they need to be part of the solution.”

Last month, UMG also emailed streaming services, including Spotify, to block AI services from accessing music catalogues for AI training purposes.

This week, at the request of UMG, Spotify removed tens of thousands of songs generated by AI music startup Boomy. Representatives from Boomy said the platform is against manipulation or artificial streaming of any kind. Artificial streaming, in which bots pose as listeners, is a long-running industry-wide issue that streaming services like Spotify are trying to police and stamp out. With the advent of AI bots, policing artificial streaming is only going to get harder, as AI will make setting up bots and accounts even easier and faster.

Universal Music Group CEO, Lucian Grainge, said in a comment to investors:

“The recent explosive development in generative AI will, if left unchecked, both increase the flood of unwanted content on platforms and create rights issues with respect to existing copyright law”.

While music industry giants are fighting AI for control of copyright and IP, other artists, like Grimes, are championing the technology. The musician has said she would allow creators to use her voice and be a “guinea pig” for AI music creation, as long as royalties were split 50-50, as is her standard agreement when collaborating with other artists.

As the AI hype cycle and narrative continue at pace, we at Faculty Entertainment believe that both artists and artist management teams are missing the intersection between web3 technology and its ability to ensure the safety and protection of IP creators in every industry. Blockchain and web3 technology is built as a tool for immutable verification. As AI scrapes the internet, re-purposes IP, and blurs the lines between generative and creative ‘art’, human creators, artists and authors risk being denied authorship, credit and monetisation, or worse, having their work misappropriated by GenAI itself or unwittingly used by LLMs without their consent.

Web3 can provide an immutable, verifiable on-chain record of ownership and IP for Name, Image, Likeness and Voice. It could also help prevent LLMs from using creators' data without permission. AI makes the task of verifying ownership of data, and monetising it, much more urgent. Up until now, Big Tech has scraped the internet without having to reimburse the humans, corporations, or governments whose data it has ‘used’ or ‘borrowed’ for its LLMs. This needs to change, especially when it comes to copyright infringement and the use of data without permission.
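To make the idea of on-chain registration concrete, here is a minimal Python sketch of the underlying mechanism: an append-only ledger where each record stores a fingerprint (hash) of the asset and links to the hash of the previous record, so any tampering breaks the chain. All names (`IPRegistry`, `register`, `verify_chain`) are illustrative assumptions, not any real web3 API; a production system would sit on an actual blockchain with cryptographic signatures.

```python
import hashlib
import json


def sha256(data: bytes) -> str:
    """Hex digest of a SHA-256 hash."""
    return hashlib.sha256(data).hexdigest()


class IPRegistry:
    """Toy append-only ledger (illustrative only, not a real blockchain).

    Each record links to the hash of the previous record, so modifying
    any earlier entry invalidates every hash after it — the core idea
    behind immutable on-chain IP registration.
    """

    GENESIS = "0" * 64  # placeholder "previous hash" for the first record

    def __init__(self):
        self.records = []

    def register(self, owner: str, asset_bytes: bytes, category: str) -> dict:
        """Record ownership of an asset fingerprint (e.g. a voiceprint)."""
        prev_hash = self.records[-1]["record_hash"] if self.records else self.GENESIS
        record = {
            "owner": owner,
            "category": category,  # e.g. "voice", "image", "likeness"
            "asset_fingerprint": sha256(asset_bytes),
            "prev_hash": prev_hash,
        }
        # Hash the record itself so later entries can chain to it.
        record["record_hash"] = sha256(json.dumps(record, sort_keys=True).encode())
        self.records.append(record)
        return record

    def is_registered(self, asset_bytes: bytes) -> bool:
        """Check whether an asset's fingerprint appears in the ledger."""
        fp = sha256(asset_bytes)
        return any(r["asset_fingerprint"] == fp for r in self.records)

    def verify_chain(self) -> bool:
        """Recompute every hash; returns False if any record was altered."""
        prev = self.GENESIS
        for r in self.records:
            body = {k: v for k, v in r.items() if k != "record_hash"}
            expected = sha256(json.dumps(body, sort_keys=True).encode())
            if r["prev_hash"] != prev or r["record_hash"] != expected:
                return False
            prev = r["record_hash"]
        return True
```

For example, once an artist's voiceprint is registered, anyone can later check `is_registered(...)` against the ledger, and `verify_chain()` fails the moment any historical record is edited — which is what makes such a record trustworthy as evidence of ownership.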

Licensing and copyright laws, and government regulations need to catch up with AI and there needs to be an open conversation about how we value creativity AND innovation. We also need to protect the rights of individuals and corporations to own and monetise their ‘product’ which could be an artist, their music catalogue and/or their voice!

Andy Anderson

Andy Anderson is Head of Faculty Entertainment, our digital asset studio. He helps brands and entertainers build new web3 revenue streams, bringing fans and artists together in a meaningful way.