News at the Intersection of Synthetic Media & Moving Image Archives
Award-winning British artist Gillian Wearing created a deepfake video of herself as part of her exhibition at the Cincinnati Art Museum.
Experts fear that in the wrong hands, deepfakes could become the next frontier in fake news – and spark very real consequences.
To help detect a deepfake video, look at the eyes. Siwei Lyu discusses the battle against deepfakes. Lyu is an associate professor of computer science at the University at Albany, part of the State University of New York system. A transcript of the podcast is available.
Generative adversarial networks, or GANs, are fueling creativity—and controversy. Here’s how they work. By Karen Hao
This panel identifies guidelines tech companies can follow to limit the harmful uses of deepfakes and offers views on how governments should react, if at all. Speakers: Robert Chesney, James Baker Chair in Law, University of Texas at Austin; Aviv Ovadya, Chief Technologist, Center for Social Media Responsibility, University of Michigan; and Laura M. Rosenberger, Senior Fellow and Director, Alliance for Securing Democracy, German Marshall Fund of the United States.
Advances in digital imagery could deepen the fake-news crisis—or help us get out of it. Interview with Hany Farid
After a public outcry over privacy and their inability — or unwillingness — to address misleading content, Facebook, Twitter, and other social media platforms finally appear to be making a real effort to take on fake news. But manipulative posts from perpetrators in Russia or elsewhere may soon be the least of our problems. What looms ahead won’t just impact our elections. It will impact our ability to trust just about anything we see and hear.
“It’s something archives have dealt with for centuries,” Yvonne Ng, a senior archivist at WITNESS, a nonprofit that focuses on collecting video evidence of human rights abuses, told Gizmodo. “The deepfake is a new spin on this process, but archives have always had to deal with forgeries or fakes or plagiarism—and even unintended damage and deterioration—and then having to determine the authenticity of objects with all of those considerations in mind.”
“Archival methods are not primarily about tools and the tech,” Ng noted. “Archival methods have always been more about having controlled and consistent policies and rules.” She said that descriptions of information and its metadata are a major part of archival work: in other words, documenting the context of the content.
“Ultimately, the greatest protection archives offer against the distortion of history may be their careful documentation of previous errors. By supporting archiving projects, we not only ensure that the past is preserved accurately, but also create a guide for the future by chronicling the long relationship between media and deception.” – Melanie Ehrenkranz
Artificial intelligence is emerging as the next frontier in fake news — and it just might make you second-guess everything you see.
Once filmmakers have no need of human actors, expect more sequels, more lawsuits—and fewer opportunities for newcomers
Whenever a big debate comes up over an individual's likeness rights, one can expect Hollywood studios and the industry's performers to offer different visions about what's at stake — free expression or unfettered exploitation. Thus, it's no surprise that a bill currently before the New York legislature establishing a right of publicity for living and deceased individuals is drawing praise and fire from the usual suspects.
“What makes a deepfake in the first place?”
“The question raises a number of interesting issues: not only our difficulty in defining deepfakes, but the problems that could arise if the term is applied vaguely in the future. Could ‘deepfake’ become the next ‘fake news’: a phrase that once described a distinct phenomenon (people publishing fabricated news stories on social media for profit) but that has now been co-opted to discredit legitimate reporting?”
The digital manipulation of video may make the current era of “fake news” seem quaint.
SAG-AFTRA says it’s “fighting back” against the dangers posed by new face-swapping technologies that have been used to digitally superimpose the faces of its members onto the bodies of porn stars. In recent months, the technology – known as “deepfaking” – has hijacked the likenesses of several famous actresses and singers to make it appear that they were performing in pornographic films.
Thanks to AI-assisted software that lets you put one person's face on another's body, you could place yourself, or anyone else for that matter, in a pornographic video. That's where things get complicated.