OpenAI’s new video app, Sora, is drawing alarm for enabling hyperrealistic AI videos of deceased public figures. Users have generated clips showing Karl Marx, Martin Luther King Jr., and Princess Diana in surreal and offensive situations. The app launched by invitation in the US and Canada in October and reached one million downloads in just five days.
Sora lets users type a prompt and receive a 10-second video within minutes, and unlike much of the low-quality AI footage circulating online, its output is strikingly realistic. Users can share videos on a TikTok-style feed or export them elsewhere. Depictions of real people require the consent of the person shown, but deceased celebrities, politicians, and other historical figures are exempt from that requirement.
The main feed is full of bizarre content: Adolf Hitler in a shampoo ad, Queen Elizabeth II falling off a pub table, MLK Jr. joking at a gas station. Families of those depicted have expressed distress. Ilyasah Shabazz, daughter of Malcolm X, called the videos crude and disrespectful. At the request of Martin Luther King Jr.’s estate, OpenAI paused clips depicting him while it works to strengthen safety controls.
Other families have raised similar concerns. Zelda Williams, daughter of Robin Williams, asked users to stop making AI videos of her father. She called them disrespectful and against his wishes. Kelly Carlin, daughter of George Carlin, said the videos were overwhelming and depressing. Clips also feature Stephen Hawking, Kobe Bryant, and Amy Winehouse, often in shocking or disturbing scenarios.
Experts warn that AI content like this can distort history and the legacy of public figures. Henry Ajder, a generative AI researcher, said Sora could change how people remember the dead. The app’s recommendation algorithm often rewards shock value, surfacing grotesque or offensive content.
Legal experts say AI depictions of the deceased remain a grey area. Living people are protected under US libel and publicity laws, but most states do not extend the same rights to the dead. A handful, including New York, California, and Tennessee, recognize postmortem publicity rights. Estates may find it difficult to hold AI companies accountable under current law, particularly given Section 230 protections.
Some suggest OpenAI may be testing legal limits. Allowing users to depict the dead lets the company explore what content is permissible while minimizing liability. Creating videos purely for entertainment, with watermarks and no commercial intent, may be legal. But monetized AI content could expose users and OpenAI to lawsuits from estates if the likenesses are exploited for profit.
In response to the backlash, OpenAI said families of recently deceased public figures can request that their likenesses be blocked, though the company has not defined “recently deceased” or explained how these requests will be handled. OpenAI also moved to an opt-in system for copyright holders after some content raised infringement concerns.
Experts expect ongoing legal battles as courts clarify AI liability. Generative AI researcher Bo Bergstedt described the situation as a “Whac-A-Mole” problem, with rules changing as issues arise. The Sora controversy raises questions about who controls one’s image and legacy in the AI era. Henry Ajder warned it is troubling if people accept that anyone could use their likeness in hyperrealistic AI content without permission.
As Sora grows, families, legal experts, and the public are watching closely. The app shows the tension between new technology, entertainment, and respect for personal and historical legacies in AI-generated media.