PITTSBURGH (KDKA) – You can’t believe everything you see anymore.
New technology is making it easier to manipulate video and sound and some of it’s being developed right here in Pittsburgh.
While the technology has some very legitimate uses, there’s also a real concern about it.
We already know there are “fake news” stories online, but “fake video” could add a whole new dimension.
Two years ago, moviemakers wowed us with a digitally created 19-year-old Carrie Fisher in “Rogue One: A Star Wars Story.”
But nowadays, you don’t need a million-dollar film budget to create pretty realistic fakes.
Someone who calls himself Derpfakes posted altered videos from “Saturday Night Live” on YouTube.
On the left of the screen is actress Kate McKinnon as Hillary Clinton, but on the right, Clinton’s face was digitally inserted.
People can now buy software and doctor video at home to create what’s called “deep fakes.”
Some of it is harmless, like putting Nicolas Cage into movies that he wasn’t really in.
But, a video produced by BuzzFeed and filmmaker Jordan Peele shows how it could take a dangerous step.
In it, it appears that former President Barack Obama has a message: “We’re entering an era where our enemies can make it look like anyone is saying anything at any point in time.”
Only later does the video reveal that it’s fake, that he never actually said those words.
Peele appears on screen at the end: “We need to be more vigilant with what we trust from the internet.”
In this case, it’s Peele’s voice impersonating Obama, but you don’t need an impersonator.
At Carnegie Mellon University, Professor Alan Black specializes in speech synthesis.
He showed KDKA-TV’s David Highfield how it works with an audio clock.
Highfield read lines stating various times of day to give Black a voice sample. Black then plugged that audio into a synthesizer, and the result is that he could make Highfield say almost anything he typed.
Highfield was astounded by the manipulation, but having done this for years, Black’s not.
Black isn’t just recording words and rearranging them. In fact, the synthesizer takes only the middles of words, discarding the transitions at their edges.
“These are the sounds like: ta, pa, ka,” Black said.
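The idea of stitching speech together from the stable middles of recorded units can be illustrated with a toy sketch. This is not Black’s actual system: the unit names and number lists below are invented stand-ins for real audio samples.

```python
# Toy sketch of concatenative speech synthesis.
# Real systems work with phone-level units and audio waveforms; here each
# "recording" is just a list of numbers standing in for audio samples.

def middle(samples, trim=1):
    """Keep only the stable middle of a recorded unit, dropping the
    transition samples at each edge (the 'ta, pa, ka' boundaries)."""
    return samples[trim:len(samples) - trim]

# Hypothetical unit inventory built from a speaker's voice sample.
inventory = {
    "hel": [1, 2, 3, 4, 5],
    "lo":  [6, 7, 8, 9],
}

def synthesize(units):
    """Stitch a new utterance together from the middles of recorded units."""
    out = []
    for unit in units:
        out.extend(middle(inventory[unit]))
    return out

# A sequence the speaker never actually recorded in this order:
print(synthesize(["hel", "lo"]))
```

Because only the middles are kept and the seams fall at sound boundaries, the stitched result can sound like continuous speech the person never said.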
He calls it “Photoshop for audio,” and it has a lot of uses, including helping someone who loses the ability to speak.
“You can take the whispered speech that comes out their mouth and convert it live into real speech,” Black said.
The late movie critic Roger Ebert had an early version of this. Using hours of his old TV show, experts were able to give him a voice that sounded like his own.
But, the same technology allows you to literally put words in someone’s mouth.
There’s another team at Carnegie Mellon working on altering video.
Using artificial intelligence, Aayush Bansal can transfer facial expressions from one face to another.
In one case, he made Stephen Colbert’s face mimic John Oliver’s; in another, Obama imitated Martin Luther King Jr.
So let’s say we wanted Highfield to appear to say something he never did – perhaps something Ken Rice actually said.
KDKA-TV gave him video of both Rice and Highfield delivering different news stories.
He was able to transform the video of Highfield, making his mouth say words he never said while his face took on Rice’s expressions.
The results looked a little rough, but the output gets smoother with more video to sample from.
And he can do it the other way around – digitally manipulating video of Rice, so he’s saying Highfield’s words and imitating his expressions.
This particular technology can also be used to make video of one flower bloom like another.
There are legitimate uses for it and not just in video production. For instance, it may be possible to teach a robot arm to pick up a cup by mimicking video of a human doing it.
But, there’s also a downside.
“It is possible it could be used in the wrong way,” Bansal said.
He says there’s a solution, though, to even sophisticated fake videos.
“We can create a machine-learning model which can distinguish between a real image, real video from a fake or generated video,” he said.
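A detector of the kind Bansal describes can be sketched with a toy classifier. This is purely illustrative, not his model: real detectors are deep networks trained on millions of frames, while the two-number “frame features” below are invented for the example.

```python
# Toy sketch of a learned real-vs-fake video classifier: a single
# perceptron trained on made-up per-frame feature vectors.

def train(samples, labels, epochs=20, lr=0.1):
    """Fit perceptron weights so real frames score above the boundary."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred  # nudge weights only when the guess is wrong
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def predict(w, b, x):
    return "real" if w[0] * x[0] + w[1] * x[1] + b > 0 else "fake"

# Invented features (e.g. edge sharpness, blink rate): generated video
# often carries subtle artifacts a model can learn to pick up.
frames = [[0.9, 0.8], [0.8, 0.9], [0.2, 0.1], [0.1, 0.3]]
labels = [1, 1, 0, 0]  # 1 = real footage, 0 = generated footage

w, b = train(frames, labels)
print(predict(w, b, [0.85, 0.9]))
```

A platform could run such a model on uploads and flag anything that scores on the “generated” side, which is the blocking idea described next.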
That way, you might get blocked before you’re even able to upload a fake video to a social media site, if you don’t disclose that it’s been altered.
In the meantime, be aware that even what appears to be real online may not be.
“Even if you’re seeing it, even if you think you’re hearing it, there’s no guarantee that’s what the person originally said,” Black said.
Tech companies, even the Defense Department, are looking for ways to detect fakes.
The best advice when you see a video is to look for glitches. Also, consider the source: Is this a trusted source of material?
Finally, if you’re unsure, use Google to research and see what people are saying about it.