Hollywood's AI Dilemma: Why Studios Are Still Hesitant to License Their Content

30 August, 2024 - 8:06AM

Hollywood’s major studios are confronted with a choice: whether they should license content from their storied catalogs of movies and TV shows to train AI video models — and, if so, exactly how that should be done.

Companies ranging from Paramount Global to Sony Pictures, not to mention other prominent film and TV distributors, have been notably absent from any confirmed dealmaking over the past year between publishers and AI companies (which VIP+ has gathered in an updated chart). However, smaller deals to license film and TV content for AI training from alternative sources of premium video are quietly gearing up, as VIP+ has also covered.

Yet the studios have been fielding overtures from major tech companies to discuss possible deals, a source with knowledge of the matter told VIP+. Bloomberg reported in May that OpenAI, Meta and Alphabet had each engaged the studios in talks, noting that Warner Bros. Discovery had licensed a few of its TV shows while Netflix and Disney declined.

But exactly why the studios have withheld their content from AI developers is a bigger question with a complex set of answers.

VIP+ spoke to several people who have had direct advisory interaction with studios and know how they perceive the opportunities and risks of licensing, including consultants, agents and lawyers, to provide an in-depth exploration of the studios' various concerns. Reps for the major studios either declined to comment or did not respond to inquiries.

Studios' stances toward generative AI are not monolithic. Broadly, strategies run along a spectrum from embrace to rejection, sources told VIP+. Three key options are being considered: licensing content to train AI models; suing AI developers, as other content owners have done (though the studios have so far been notably absent from those suits); or licensing AI models from the AI companies for the studios' own internal use, including fine-tuning models on owned IP.

However, licensing content to AI developers and licensing tech from AI developers aren't necessarily mutually exclusive options for studios.

Opinions on licensing content to AI companies likely vary within specific studios, but the following reasons consistently emerged among stakeholders charged with weighing their options.

No Deal Precedents

Defining terms for this type of content use is challenging because there is no existing deal precedent to provide norms or comparisons that would guide how an agreement should be structured.

Several things remain unclear for studios, starting with how much AI companies should pay, a question insiders say has stalled any licensing deals.


Not Having the Right Rights

Studios own the rights to license their movies and shows for distribution as completed works. But those contractual rights probably don't extend to AI training, which is not the same as distribution and is a use that was not remotely contemplated when contracts with talent and other third parties were negotiated.

Sources suggested this is a serious factor being pressed by vigilant studio lawyers and legal teams, one referred to by a source as "kind of the elephant in the room": that a licensing deal is simply not contractually allowed.

“It’s one thing if it’s happening internally, it’s quite another to grant those rights to a third party, where you [as the licensor] really have to have full confidence that you have the rights you’re purporting to grant,” said Simon Pulman, partner and co-chair of the Media + Entertainment and Film, TV + Podcast Groups at Pryor Cashman LLP.

In order to license content safely to a tech company for AI training, studios would need to get rights approvals and update preexisting contracts. That goes for third parties that had licensed any number of varied elements for use on a particular production, such as an original Picasso on the wall or product placements, a subject attorney Joy Butler covered for VIP+ in “Licensing Your Movie & TV Content for AI Training.”

More consequentially, that goes for actors whose likenesses appear in the content: their name, image and likeness (NIL) data would end up in the training data, and therefore potentially in the model output, in ways they never agreed to. Actors are generally work for hire on a given production, and their contracts give studios broad rights to use their NIL only in and in connection with that particular motion picture.

Even though the SAG-AFTRA agreement is silent (and therefore permissive) on AI training, it’s unlikely that talent contracts from past productions predating generative AI would give studios the right to allow AI training data to include an actor’s likeness.

“That idea of converting a completed work and having it contribute to derivative works and assets — there is absolutely no policy that really, truly describes a set of protections around what the actual leeway for licensing in that context is,” a source told VIP+.

Actor Likenesses in the Dataset

Studios understand that actors, and eventually the Screen Actors Guild, will want to establish consent and some form of compensation if AI data deals occur and the licensed dataset includes actors' likenesses from their performances in a studio's films and TV shows.

That scenario would likely mean each studio would need to seek potentially hundreds of direct opt-ins from actors whose likenesses appear in the content it plans to license, at which point actor reps or estates would immediately ask for their cut.

Even if a studio did get opt-in from an actor, their rep would further want to negotiate directly with the AI company on the protective parameters of use specifically for that individual, which would hugely complicate and prolong negotiations for any potential deal.

With actors suddenly operating as de facto rights holders in a studio's licensing deal, each actor would enumerate their own complex set of terms for studios or AI companies to follow. Some actors would also want the right to opt out entirely from having their NIL data used in training, obligating studios or AI companies to exclude or delete their likeness from the dataset, a source told VIP+.

“[An actor’s NIL] data is [then] being used in a way that goes well beyond the finished work as an encapsulation of their performance and now has derivative application. So, beyond remuneration financially, there [would have] to be protection in the license terms with the AI company for [an actor] to be able to say, ‘The following 50 things I’m not okay with,’ ” said a source.

But some studios may want to take the position that getting rights approvals from actors is moot. “I have heard there is a position by some that this is different because it’s not distribution of the completed work, it’s actually corporate intellectual property that we [studios] are choosing to license. So therefore, 100% of the revenue goes to the studio, and there is no sharing back with the profit participants of the film or TV project,” a source said.

However, if studios license content without actors' permission and don't devise any sort of payment structure, they could open themselves up to litigation. One source described a scenario in which an actor wants to personally benefit from licensing their NIL data to an AI company, only to find themselves in a weakened negotiating position because the company had already trained its models on the actor's likeness in movies and shows it previously licensed from a studio. At that point, the actor could conceivably claim damages against the studio for diminishing a potential revenue opportunity through its licensing.

Lawsuits might also arise if an AI model trained on an actor’s studio-licensed data is used to deepfake the actor’s likeness. If training enabled a user to prompt the model for an output they then distributed or commercialized, an actor could have a claim against not just the user, the AI company and/or the distribution platform but potentially the studio.

Sources firmly agreed that studios would not and should not license the right to create digital replicas resembling any actor, and should impose that restriction in any training deal with an AI company. Developers are willing and able to program restrictions into a model to prevent it from outputting famous-person likenesses in response to user prompts, and typically do.

But such technical guardrails aren't failsafe, and no standard requirements exist for AI companies to provide them, meaning studios would have to impose them in the deal terms. Actor likenesses would be in the model regardless, and could lead it to create new digital humans that still draw heavily on the likeness traits of one or more actors who were in the training data.
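As a purely illustrative sketch of the kind of guardrail described above, and not any AI company's actual implementation, a prompt-level restriction can be as simple as a blocklist check run before generation; the placeholder names, the functions and the stand-in generation call below are all hypothetical.

```python
# Illustrative sketch only: a naive prompt-level likeness guardrail.
# The protected-name list and the model call are hypothetical stand-ins.
import re

PROTECTED_NAMES = {"jane example", "john sample"}  # hypothetical placeholders, not real actors


def normalize(text: str) -> str:
    """Lowercase the prompt and strip punctuation so trivial spelling tricks don't slip through."""
    return re.sub(r"[^a-z0-9 ]", "", text.lower())


def violates_likeness_policy(prompt: str) -> bool:
    """Return True if the prompt appears to name a protected person."""
    cleaned = normalize(prompt)
    return any(name in cleaned for name in PROTECTED_NAMES)


def guarded_generate(prompt: str) -> str:
    """Refuse generation when the guardrail trips; otherwise hand off to the (stand-in) model."""
    if violates_likeness_policy(prompt):
        return "Request refused: prompts referencing this person are not permitted."
    return f"[video generated for prompt: {prompt!r}]"  # stand-in for a real video-model call


if __name__ == "__main__":
    print(guarded_generate("A quiet beach at sunset, handheld camera"))
    print(guarded_generate("Jane Example walking a red carpet"))
```

Even this toy example shows why sources consider such guardrails imperfect: a descriptive prompt that never names the person evades the check entirely, yet the likeness data remains in the model.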

No U.S. federal law yet gives individuals the legal right to protect their personal likeness from the distribution of unauthorized digital replicas. But multiple such laws have been proposed, and the first installment of the U.S. Copyright Office's AI study supports one.

Preferring to Keep It In-House

Some studios see fine-tuning an AI model exclusively for internal use as a preferable alternative to licensing their IP for general model training. At least two studios are negotiating with OpenAI to license Sora and then train it on the corpus of their intellectual property for internal use, said one source with knowledge of the matter.

Studios regard fine-tuning as more data-secure and as a competitive advantage, allowing them to retain their IP and any production-efficiency benefits without conferring improvements on Sora's general model capabilities.

Fear of Enabling a Competitor

A predominant fear among studios is that, by licensing, they would be helping to improve the very technology that could grow to replace them in the market. Some have analogized this to the early days of licensing high-value content to Netflix, and to other relationships with tech companies that studios came to regret in hindsight.

Multiple sources familiar with studio perspectives cited this as a key reason studios are still deliberating over how to proceed. Yet in this post-peak TV era, studio businesses are also in a market moment reminiscent of the one that drove many to license their best content to Netflix: they desperately need to demonstrate profitability.

The worry is that licensing studio library content would enable a video model to produce output of far more cinematic quality, teaching models the styles, shots and camera angles present in film and TV content that are not as well represented in videos scraped from YouTube, for example.

More consequentially, however, the likely product roadmap for video generation models like Sora is to become multimodal, meaning they might eventually output video with contextual sound.

A studio licensing its films and TV shows would be giving the AI company a dataset of the completed works, each containing rich creative components beyond video alone, such as the musical score or spoken script — all of which would end up training the model, with potentially unforeseen benefits toward multimodal capabilities.

“Just because you’re licensing it under v1 of the product doesn’t mean that it won’t evolve into something else, and presumably has data usage for those other things as well,” a source told VIP+.

Studios are paralyzed on licensing for generative AI in part by the sheer complexity of film and TV shows as collaborative works. Yet many are aware they cannot simply bury their heads: AI video developers' models will eventually compete as video entertainment products, whether the studios license their content or not.


Emily Brown

Business Analyst

Analyzing the financial world one report at a time.