TED 2018: Fake Obama video creator defends invention

Image caption: "None of these Obama videos is real," said Supasorn Suwajanakorn (Image copyright: Bret Hartman/TED)
A researcher who created a fake video of President Obama has defended his invention at the latest TED talks.

The clip shows a computer-generated version of the former US leader mapped to fit an audio recording.

Experts have warned the tech involved could spark a "political crisis".

Dr Supasorn Suwajanakorn acknowledged that there was a "potential for misuse". But, at the Vancouver event, he added that the tech could be a force for good.

The computer engineer is now employed by Google's Brain division.
He is also working on a tool to detect fake videos and photos on behalf of the AI Foundation.

Damage risk

Dr Suwajanakorn, along with colleagues Steven Seitz and Ira Kemelmacher-Shlizerman from the University of Washington, released a paper in July 2017 describing how they created the fake Obama.

Media caption: The tool can edit videos of people speaking and make them say something they have not.

They developed an algorithm that took audio and transposed it on to a 3D model of the president's face. The task was completed by a neural network, using 14 hours of Obama speeches and layering that data on top of a basic mouth shape.
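As a rough illustration of that idea (and not the team's actual code), the sketch below assumes a small PyTorch recurrent network that maps per-frame audio features to mouth-landmark positions, which a later stage would blend into frames of the target video. The class name, feature sizes, layer choices and training loop are all illustrative assumptions.

```python
# Minimal sketch of the audio-to-mouth-shape idea: a recurrent network maps
# per-frame audio features to 2D mouth landmarks, which would then be
# composited onto frames of a target video. All names and dimensions here
# are assumptions, not the published implementation.

import torch
import torch.nn as nn

class AudioToMouth(nn.Module):
    """Maps a sequence of audio feature vectors (e.g. MFCCs) to mouth landmarks."""
    def __init__(self, n_audio_features=28, n_landmarks=20, hidden=256):
        super().__init__()
        self.rnn = nn.LSTM(n_audio_features, hidden, batch_first=True)
        # Each landmark is an (x, y) point, so the head emits n_landmarks * 2 values per frame.
        self.head = nn.Linear(hidden, n_landmarks * 2)

    def forward(self, audio_feats):
        # audio_feats: (batch, time, n_audio_features)
        out, _ = self.rnn(audio_feats)
        return self.head(out)  # (batch, time, n_landmarks * 2)

if __name__ == "__main__":
    model = AudioToMouth()
    # Stand-ins for features extracted from many hours of speech, cut into clips.
    dummy_audio = torch.randn(4, 100, 28)      # 4 clips, 100 frames each
    dummy_landmarks = torch.randn(4, 100, 40)  # matching mouth shapes from real video

    optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
    for step in range(3):  # a few illustrative training steps
        pred = model(dummy_audio)
        loss = nn.functional.mse_loss(pred, dummy_landmarks)
        optimiser.zero_grad()
        loss.backward()
        optimiser.step()
    # In a full pipeline the predicted mouth shapes would be textured and
    # blended into frames of the target video; that stage is omitted here.
```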
damage" and needed an ethical framework."The reaction to our work was quite mixed
People, such as graphic designers, thought it was a great tool
But it was also very scary for other people," he told the BBC.Political crisisIt could offer history students the chance to meet and
interview Holocaust victims, he said
Another example would be to let people create avatars of dead relatives.Experts remain concerned that the technology could create new types
"Fake news tends to spread faster than real news as it is both novel and confirms existing biases," said Dr Bernie Hogan, a senior research fellow at the Oxford Internet Institute.

"Seeing someone make fake news with real voices and faces, as seen in the recent issue about deepfakes, will likely lead to a political crisis with associated calls to regulate the technology."

Deepfakes refers to the recent controversy over an easy-to-use software tool that scans photographs and then uses them to substitute one person's features with another. It has been used to create hundreds of pornographic video clips featuring celebrities' faces.
Dr Suwajanakorn said that while fake videos were a new phenomenon, it was relatively easy to detect forgeries.

"Fake videos are easier to verify than fake photos because it is hard to make all the frames in a video perfect," he told the BBC. "Teeth and tongues are hard to model and could take another decade," he added.
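The article does not describe how such checks work, but as a loose illustration of the frame-consistency intuition (not the AI Foundation's detector), the sketch below flags clips whose mouth region jumps around implausibly from frame to frame. The fixed crop box and the threshold are illustrative assumptions.

```python
# Sketch of a frame-consistency check: in a genuine clip the mouth region
# changes smoothly, while compositing errors tend to show up as abrupt
# frame-to-frame jumps. Crop box and threshold are assumptions for illustration.

import numpy as np

def mouth_region(frame, box=(60, 100, 40, 88)):
    """Crop a fixed mouth-region box (top, bottom, left, right) from a grayscale frame."""
    top, bottom, left, right = box
    return frame[top:bottom, left:right].astype(np.float64)

def jitter_score(frames):
    """Mean absolute frame-to-frame change in the mouth region (0-255 scale)."""
    diffs = [np.abs(mouth_region(b) - mouth_region(a)).mean()
             for a, b in zip(frames, frames[1:])]
    return float(np.mean(diffs))

def looks_suspicious(frames, threshold=12.0):
    """Flag a clip whose mouth region changes more sharply than the assumed threshold."""
    return jitter_score(frames) > threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-ins: a smoothly varying clip and one with abrupt changes.
    smooth = [np.full((120, 120), 128, dtype=np.uint8)
              + rng.integers(0, 3, (120, 120), dtype=np.uint8) for _ in range(30)]
    jumpy = [rng.integers(0, 255, (120, 120), dtype=np.uint8) for _ in range(30)]
    print("smooth clip flagged:", looks_suspicious(smooth))  # expected: False
    print("jumpy clip flagged:", looks_suspicious(jumpy))    # expected: True
```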
The researcher also questioned whether it made sense for fake news creators to make complex videos "when they can just write fake stories".