A furious political leader shouting a message of hatred to an adoring audience. A child weeping over the massacre of his family. Emaciated men in prison uniforms, starved to the brink of death because of their identity. As you read each sentence, specific images probably appear in your mind, seared into your memory and our collective consciousness through documentaries and textbooks, news media and museum visits.
We understand the significance of historical images like these – images we must learn from in order to move forward – largely because they captured something true about the world when we were not there to see it with our own eyes.
As archival producers for documentary films and co-directors of the Archival Producers Alliance, we are deeply concerned about what could happen when we can no longer trust that such images reflect reality. And we are not alone: ahead of this year's Oscars, Variety reported that the Academy of Motion Picture Arts and Sciences is considering requiring nominees to disclose their use of generative AI.
While such disclosure may be important for feature films, it is clearly crucial for documentaries. In the spring of 2023, we began to see synthetic images and audio used in the historical documentaries we were working on. With no standards for transparency in place, we fear that this blending of the real and the unreal could compromise the nonfiction genre and the essential role it plays in our shared history.
In February 2024, OpenAI previewed its new text-to-video platform, Sora, with a clip called "Historical footage of California during the gold rush." The video was convincing: a flowing stream filled with the promise of riches. A blue sky and rolling hills. A thriving town. Men on horseback. It looked like a western where the good guy wins and rides off into the sunset. It looked authentic, but it was fake.
OpenAI presented "Historical footage of California during the gold rush" to demonstrate how Sora, officially released in December 2024, creates videos based on user prompts using AI that "understands and simulates reality." But that clip is not reality. It is a haphazard blend of imagery both real and imagined by Hollywood, along with the historical biases of the industry and the archives. Sora, like other generative AI programs such as Runway and Luma Dream Machine, scrapes content from the internet and other digital sources. As a result, these platforms simply recycle the limitations of online media, no doubt amplifying their biases. Yet watching it, we understand how an audience could be fooled. Film is powerful that way.
Some in the film world have greeted the arrival of generative AI tools with open arms. We and others see it as something deeply troubling on the horizon. If our faith in the veracity of visuals is shattered, powerful and important films could lose their claim to the truth, even if they don't use AI-generated material.
Transparency, something akin to the food labeling that informs consumers about what goes into the things they eat, could be one small step forward. But no AI disclosure regulation appears to be riding over the next hill to save us.
Generative AI companies promise a world in which anyone can create audiovisual material. That is deeply worrying when it comes to representations of history. The proliferation of synthetic images makes the work of documentarians and researchers – safeguarding the integrity of primary source material, digging through archives, presenting history accurately – even more urgent. It is human work that cannot be replicated or replaced. One need only look to this year's Oscar-nominated documentary "Sugarcane" to see the power of meticulous research, accurate archival imagery and well-reported personal narrative to expose hidden histories, in this case about the abuse of First Nations children in Canadian residential schools.
The speed with which new AI models are released and new content is produced makes the technology impossible to ignore. While it may be fun to use these tools to imagine and experiment, what results is not a true work of documentation – of humans bearing witness. It's just a remix.
In response, we need robust AI media literacy for our industry and the general public. At the Archival Producers Alliance, we have published a set of guidelines – endorsed by more than 50 industry organizations – for the responsible use of generative AI in documentary film, practices that our colleagues are beginning to integrate into their work. We have also put out a call for case studies of AI use in documentary film. Our goal is to help the film industry ensure that documentaries deserve that title and that the collective memory they inform is protected.
We don't live in a classic western; no one is coming to save us from the threat of unregulated generative AI. We must work individually and together to preserve the integrity and diverse perspectives of our real history. Accurate visual records not only document what happened in the past, they help us understand it, learn from its details and – perhaps most importantly in this historical moment – believe it.
When we can no longer accurately witness the highs and lows of what came before, the future we share may be little more than a haphazard remix.
Rachel Antell, Stephanie Jenkins and Jennifer Petrucelli are co-directors of the Archival Producers Alliance.