How AI Could Ruin Documentary Film by Polluting the Truth

Archival producers, who play a crucial role in documentary filmmaking, warn that using Sora and similar generative AI programs could muddy the historical record irrevocably. The Archival Producers Alliance released a set of “best practices” to keep generative AI from destroying the integrity and authenticity of documentaries on Sept. 13, 2024. (Photo Illustration by Costfoto/NurPhoto via Getty Images)

Most of us expect documentaries to deliver facts about the past or present, even if the truth is delivered with a point of view. Most of us expect documentary filmmakers to be transparent about their source material, like videos, photos and newspapers.

Before generative AI came on the scene, it was fairly obvious when a documentary delivered a re-creation of something that happened or might have happened. Post-generative AI, nothing is necessarily obvious, and as the technology improves, it’s near-certain that fake bits in documentaries will proliferate.

Archival producers, who play a crucial role in documentary filmmaking, warn there’s a clear and present danger that the fakery will muddy the historical record irrevocably. On Friday, the Archival Producers Alliance is releasing a set of “best practices” to keep generative AI from destroying the integrity and authenticity of documentaries.

The collective of more than 400 people from 25 countries specializes in archives. Many of them help filmmakers put together stories, and some are documentary filmmakers themselves.

Alliance co-director Rachel Antell of Berkeley said she hopes the new, voluntary guidelines will get Hollywood and Silicon Valley talking in a serious way about the integrity of history that belongs to all of us.

“The two things we always struggle with are not enough money and not enough time, and especially not enough money. AI comes in and seemingly offers solutions to both those existential issues, but really, it doesn’t. I think the question that we always need to be asking is ‘At what cost?’” said Antell, who has worked in documentary film for about 25 years, producing work that’s run on HBO, Netflix and Hulu.

Antell and others in the Alliance say they already see documentaries using AI-generated voices of dead people, often to read something they wrote but never recorded. Consider the 2021 documentary Roadrunner, featuring an AI-generated Anthony Bourdain reading his email without disclosing it to viewers. “We can have a documentary-ethics panel about it later,” the filmmaker Morgan Neville told the New Yorker, prompting Bourdain’s second wife, Ottavia Busia-Bourdain, to tweet at the time, “I certainly was NOT the one who said Tony would have been cool with that.”

Documentaries have long used re-creations or animations to cinematically evoke historical people, places or things. However, a number of people in the industry cringed when OpenAI introduced its video-generation model Sora with clips that included "historical footage" of a California Gold Rush-era town.

The clip resembles a blurry, color drone shot of a Hollywood Western set, inexplicably free of women, children or any non-white people, set beside a brook untouched by the garbage and gold-mining pollution common in the mid-19th century. Such a clip might add a vaguely evocative touch to a documentary, but this AI-generated vision veers quite far from what an actual California Gold Rush town looked like.

“Some things that are legal are not necessarily ethical,” said Alliance Co-Director Stephanie Jenkins, a documentary and archival producer in Hudson, New York, who’s primarily worked on historical documentaries for PBS, as well as the New York Times series Op-Docs and WNYC’s Radiolab. “Some things that are ethical aren’t necessarily legal. Part of our work is educating documentary filmmakers about where those lines are currently and how they may change.”

The guidelines are not concerned with minor alterations, like retouching, restoration, or playing with the resolution of individual items. Rather, the Alliance is concerned with alterations that could “mislead the audience” and introduce noise into the historical record.

The Alliance is also concerned with the potential for a massive data grab, similar to what’s happening to the field of journalism.

“Most of what archives have is not digitally accessible,” Antell said, noting that companies developing the large language models are “going to the archives and saying, ‘We’ll digitize your material for free in exchange for being able to scrape it.’ It’s an immediate gain for the archives, which are mainly small and struggling for their survival. But it’s very shortsighted. It’s a Faustian bargain. Their metadata, that’s much more valuable than the actual material, and so, their long-term ability to survive is really compromised.”

Others are less concerned about generative AI’s incursion into archival materials. “I’ve found that sometimes, the gatekeepers of archival collections protect their domain of archival material so tightly that they forget that the goal is the public see the footage or images, that it is re-surfaced as often as possible and discussed; that it is accessible,” Oakland producer Charlotte Buchen Khadra said. “Perhaps AI can help make various archives easier to search, for example. I’m hopeful.”

The Alliance is not calling for documentary filmmakers to stop using generative AI altogether. They say there are plenty of examples of beneficial uses, like in the HBO doc Welcome to Chechnya, which overlaid the faces of volunteers on top of people facing anti-LGBTQ persecution in Russia to protect them from identification.

Above all, the Alliance's guidelines call for transparency. That message has been welcomed by many veterans in the field.

“My students have said it is a wonderfully economical way to deal with the expense of animation, and assuming that the prompts are accurate, which is a big if, it can be very helpful,” wrote June Cross, at Columbia University’s Graduate School of Journalism. “What troubles me about it is that people can take orphan photos or even copyrighted photos and make animation out of them, thereby stealing the work of the original creator and also making people think that this is an accurate representation…I know that a lot of people don’t give a darn about accuracy these days, but some of us still care about whether the picture shows a ‘58 or ‘64 Cadillac.”

“Documentary filmmakers don’t follow a single code of ethics,” wrote Stacey Woelfel, director of the Documentary Journalism Center at the Missouri School of Journalism. “So there’s no set of commandments to be handed down to tell filmmakers what they can and cannot do.”

Given that, the Alliance plans to host a series of talks, panels and workshops in the coming months to spread the word about its findings and concerns. The group also plans to develop a tool kit to provide filmmakers with resources like AI crediting guidelines and a cue sheet, insider lingo for a detailed outline of everything in a program, including, in this instance, software versions, prompts used and the like. Otherwise, Antell and Jenkins warn, public trust in documentaries will be lost forever.
