The looming threat of AI to Hollywood, and why it should matter to you
Artificial intelligence could be the most important part of a writers strike, for reasons bigger than show business.
A Vox article featuring screenwriter John August on the question (read: concern) about generative artificial intelligence creeping into the domain of screenwriters. Some excerpts:
Not since the advent of streaming has a technology stood to change the landscape of Hollywood so drastically. “About a year ago, I went to the Guild because I had questions about AI,” John August told me. Friends had shown him a rudimentary text generator that they said could help write a script. “Oh, that’s interesting,” he remembers thinking. “But also potentially really problematic, because it raises a host of questions — like, who really wrote this thing?” August knows the territory intimately. He’s a widely produced screenwriter (Go, Charlie’s Angels, Big Fish, and lots more), the co-host of the hugely popular Scriptnotes podcast, and a former board member of the WGA. His concerns are part of the reason AI is one of the issues the WGA is working to address in its negotiations with Hollywood’s studios.
Problematic technology has always been a sticking point in writers’ contracts. Back in 2007, the last time there was a strike, residuals from streaming services were a major area of discussion. A future in which most people would watch TV by streaming it from the internet, and in which half of all series writers would be working on projects that would never appear on broadcast TV at all, was unthinkable. That’s why one of the major disputes had to do with whether writers would get residuals, a sizable source of steady income, when their work streamed. The studios said no; the writers said yes.
“To me, this seems like a similar level of shift,” August says.
— —
Some of Hollywood’s power players are clearly far from ready to face the reality of AI and its cost-cutting (read: job-cutting) potential. The shift toward AI use has been evident for years. Consider the use of AI engines to make decisions about greenlighting projects, or the creation of a second Will Smith for the 2019 action movie Gemini Man, in which Smith co-starred opposite a fully computer-generated replica of his younger self — a feat AI tools are making ever easier. Or consider Avengers: Endgame co-directors Joe and Anthony Russo’s ventures into filmmaking AI, which they believe will be capable of generating scarily narcissistic-sounding entertainment — you get to star in a movie with Marilyn Monroe, with a couple of button clicks — inside of a few years. (On that point, they’re not wrong.)
The WGA, on the other hand, is aware of the issue, and included it in their pattern of demands ahead of the overwhelming strike authorization vote. At the moment, the WGA’s contract (called the MBA, or Minimum Basic Agreement) only defines a “writer” as a “person,” which August quipped is “still, in 2023, a human being.” But those definitions could change, and the tech is evolving fast.
“So we felt it’s important to get two things defined in the contract more clearly,” August told me. The WGA has two main stipulations. First, the guild wants to make sure that “literary material” — the MBA term for screenplays, teleplays, outlines, treatments, and other things that people write — can’t be generated by an AI. In other words, ChatGPT and its cousins can’t be credited with writing a screenplay. If a movie made by a studio that has an agreement with the WGA has a writing credit — and that’s over 350 of America’s major studios and production companies — then the writer needs to be a person.
“Based on what we’re aiming for in this contract, there couldn’t be a movie that was released by a company that we work with that had no writer,” says August.
Second, the WGA says it’s imperative that “source material” can’t be something generated by an AI, either. This is especially important because studios frequently hire writers to adapt source material (like a novel, an article, or other IP) into new work to be produced as TV or films. However, the payment terms, particularly residual payouts, are different for an adaptation than for “literary material.” It’s very easy to imagine a situation in which a studio uses AI to generate ideas or drafts, claims those ideas are “source material,” and hires a writer to polish it up for a lower rate. “We believe that is not source material, any more than a Wikipedia article is source material,” says August. “That’s the crux of what we’re negotiating.”
Despite how poorly ChatGPT fared when I instructed it to write a scene in the vein of an Aaron Sorkin-scripted moment from The West Wing, given the way technological advances seem to move at light speed, we could very well be at a point in five years or so where screenwriters are hanging on by their fingernails as studios and networks rely on AI to generate scripts.
Both scary and depressing.
For the rest of the Vox article, go here.