Automating Entertainment: Writers Demand that Studios Not Use AI

When the Writers Guild of America (WGA) presented its list of demands in the strike that has already ground production on many shows to a halt, chief among them was that the studios agree not to use artificial intelligence to write scripts. Specifically, the Guild had two asks: first, that “literary material,” including screenplays and outlines, be generated by a person and not an AI; second, that “source material” not be AI-generated.

The Alliance of Motion Picture and Television Producers (AMPTP), which represents the studios, rejected this proposal, countering only that it would be open to holding annual meetings to discuss advancements in technology. For the WGA, alarm bells sounded: the response suggested an existential threat to writers’ survival, and one that Hollywood was already planning for.

Writers are often paid at a far lower rate to adapt “source material,” such as a comic book or a novel, into a screenplay than they are paid to generate original literary material. By using AI tools to generate an outline or first draft of an original story and then enlisting a human to “adapt” it into a screenplay, production studios potentially stand to save significantly.

Many industries have embraced a workflow in which an AI generates a “first draft” that a human then punches up. And the WGA has said that writers’ use of AI as a tool is acceptable: there would essentially be a robot in the writers’ room, with writers supplementing their craft with AI-generated copy, but without AI wholly usurping their jobs.

Everyone appears to agree that AI could never write the next season of The White Lotus or Succession, but lower-brow shows could more easily be aped by AI. Law & Order is an often-cited example, not just because it is formulaic but because AIs are trained on massive data sets of copyrighted content, and there are 20 seasons of Law & Order for an AI to ingest. And as the technology advances, who knows what it could do? ChatGPT was initially released last November, and as of this writing we are on GPT-4, a far more powerful version of a platform that is advancing exponentially.

The studios’ push for the expanded use of AI is not without its own risks. The Copyright Office has equivocated somewhat in its determination that AI-generated art is not protectable. In a recent Statement of Policy, the Office said that copyright will protect only those aspects of a work authored by a human, resulting in partial protection for works containing AI-generated material. So the better the AI gets, and the more it contributes to cutting out the human writer, the weaker the copyright protection for the studios and networks.

Whether AI works infringe the copyrights in the original works is an issue currently being litigated in a pair of lawsuits against Stability AI, the startup that created Stable Diffusion, an AI tool with the impressive ability to turn text into images in what some have dubbed the most massive art heist in history. Some have questioned whether the humans who wrote the original episodes would be compensated, and the answer is maybe not: in most cases the scripts were likely works made for hire, owned by the studios.

If the studios own the underlying scripts, what happens when they take that copyrighted material and put it through a machine that turns out uncopyrightable output? Can you send a DMCA takedown notice to, or sue, someone who copies the result? As of this writing, there are no clear answers to these questions.

Beyond the legal questions lie deeper philosophical questions about making art. As AI improves and humans become more cyborgian, does the art become indistinguishable? Prolific users of Twitter say they think their thoughts in 280 characters. Perhaps our readers can relate to thinking of their time in six-minute increments, or tenths of an hour. Perhaps, too, our readers can relate to seeing their industry threatened by automation: according to a recent report from Goldman Sachs, generative artificial intelligence puts 44% of legal jobs at risk.

© Copyright 2023 Squire Patton Boggs (US) LLP
