An aspect of artificial intelligence that’s sometimes overlooked is just how good it is at creating fake audio and video that’s difficult to distinguish from reality. The advent of Photoshop got us doubting our eyes, but what happens when we can’t rely on our other senses? The latest example of AI’s audiovisual magic comes from the University of Washington, where researchers have created a new tool that takes audio files, converts them into realistic mouth movements, and then grafts those movements onto existing video. The end result is a video of someone saying something they didn’t say. (Not at the time, anyway.) It’s a confusing process to understand just by reading about it, so take a look at the video below:
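The paragraph above describes a three-stage pipeline: audio in, mouth movements out, then compositing onto existing footage. A minimal sketch of that structure might look like the following. Everything here is illustrative: the function names, data shapes, and the stand-in linear mapping are assumptions, not the researchers' actual code (which used a recurrent neural network trained on many hours of Obama footage).

```python
# Hypothetical sketch of the audio-to-video pipeline described above.
# The stand-in math here replaces the trained neural network; only the
# overall stage structure mirrors the article's description.

def audio_to_mouth_shapes(audio_features):
    """Stage 1: map per-frame audio features to mouth-shape parameters.
    A toy linear mapping stands in for the trained network."""
    return [round(0.5 + 0.5 * f, 3) for f in audio_features]

def synthesize_mouth_texture(shape):
    """Stage 2: render a mouth texture for one shape parameter (stubbed)."""
    return {"openness": shape}

def composite_onto_video(target_frames, mouth_textures):
    """Stage 3: graft each synthesized mouth region onto the matching
    frame of the existing target video."""
    return [dict(frame, mouth=tex)
            for frame, tex in zip(target_frames, mouth_textures)]

# Toy run: three frames of "audio" drive three frames of existing video.
audio_features = [0.1, 0.6, 0.9]   # e.g. per-frame loudness/phoneme features
shapes = audio_to_mouth_shapes(audio_features)
textures = [synthesize_mouth_texture(s) for s in shapes]
frames = composite_onto_video([{"id": i} for i in range(3)], textures)
print(frames[1])  # frame 1 now carries a synthesized mouth region
```

The point of the sketch is that the target video's identity never changes; only the mouth region is re-synthesized to match the new audio, which is why the result looks like the same person saying different words.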
Of course, there’s also the worry that tools like this can and will be used to generate misleading video footage — the sort of stuff that would give some real heft to the term “fake news.” Combine a tool like this with technology that can recreate anyone’s voice from just a few minutes of sample audio and you’d be forgiven for thinking there are scary times ahead. Similar research has been able to change someone’s facial expression in real time, create 3D models of faces from a few photographs, and more. The team from the University of Washington is understandably keen to distance itself from these sorts of uses, making it clear it only trained its neural nets on Obama’s voice and video. (“You can’t just take anyone’s voice and turn it into an Obama video,” said professor Steve Seitz in a press release. “We very consciously decided against going down the path of putting other people’s words into someone’s mouth.”) But in theory, this tech could be used to map anyone’s voice onto anyone’s face. Will everyone be so scrupulous if the technology becomes widespread?
originally posted by: lordcomac
This gets posted about once a month.
Scary stuff.