
Synthetic Authenticity – AI and the Archive

This isn’t exactly where I thought my last blog post would go. I was preparing to write up a final wrap-up of the project and my experience; however, earlier this week, I saw a video that shook me to my core and made me question reality. The video in question, from popular TikTok user Alex Pearlman, discusses the newly updated Sora app, a text-to-video model developed by OpenAI, the same company that created ChatGPT.

The model generates short video clips based on user prompts. However, in Pearlman’s video, he points out how, through this new update, Sora can now create content that replicates the likeness of anyone. Specifically, he points to examples of Michael Jackson shilling products on TikTok live, Tupac Shakur meeting Mr. Rogers, and even President Ronald Reagan opening Pokémon cards.

The latter two are what gave me such pause, as these clips seemingly replicated the aesthetics of the eras they were depicting, such as the blurriness and CRT lines present in 1980s broadcasts. Pearlman even comments on this, saying, “We used to be able to trust old footage… because it was old! Who would change that?”

Why would we want to change that? This isn’t the first time we’ve been able to alter photos and videos. Programs such as Photoshop have long allowed us to manipulate reality. While Photoshop edits could initially deceive the everyday viewer, they tend to leave traces of manipulation, such as visible noise and discoloration, and are now more easily spotted by the trained eye.

What feels different now is both the seamlessness and effortlessness of these videos. These new AI-powered models don’t just edit an image; they create video out of nothing while simultaneously simulating the time and texture of entire eras. They recreate film grain, magnetic distortion, and broadcast fuzz so precisely that the illusion becomes indistinguishable from the real thing.

When a machine can fabricate the past this convincingly, what happens to our sense of historical truth? If the aesthetic markers that once grounded authenticity can be artificially generated, then nostalgia itself becomes a tool of deception.

AI still has its limits. Currently, Sora has content restrictions that explicitly target the likeness of celebrities and other copyrighted properties. However, its detection filters are imprecise, as you can tell from the types of images and videos attached to this post. Additionally, this newest model allows individuals to “cameo” in other users’ videos and have their likeness used freely, leading to controversies such as the infamous internet personality Jake Paul cameoing in videos where he is depicted “live streaming the Hiroshima bombings.”

While these videos still have tells, such as artifacting and visual glitches, the overall result is very convincing at first glance. Soon, even those subtle tells may be gone. When that happens, we’ll no longer question whether something looks real, but rather whether reality itself still has a visual language we can recognize. As someone who has spent years working with archives and old media, this prospect feels deeply personal. The past, once grounded in tangible evidence, now risks becoming infinitely editable. Maybe that’s why these AI-generated “vintage” clips disturb me so much: they don’t just rewrite history; they mimic the emotional residue of remembering it.
