
"Deepfake" videos now easily accessible and look VERY real. Seeing is not believing.

posted on Oct, 27 2018 @ 02:22 PM
"Easily" accessible might be a bit of an exaggeration but the technology is definitely more readily available than it was just a few years ago.

Just as concerning is that these videos, which can be used for everything from false confessions to complete impersonations, are becoming more and more difficult to detect.

As this short documentary points out, Hollywood has had the ability to change "reality" for a very long time. And now that technology is available to the masses.

And let's be honest: we know full well that Hollywood has in all likelihood helped to create fake events, rewrite history, change facts, etc.

And now, so do government departments, agencies, corporations and private individuals...






posted on Oct, 27 2018 @ 02:38 PM
a reply to: gladtobehere

The unreal, begetting yet another flavour of unreal.



posted on Oct, 27 2018 @ 02:53 PM
a reply to: gladtobehere

I guess the next generation of politics will be dirtier than ever, and we are going to be played and conned in ways we will never even be able to detect.



posted on Oct, 27 2018 @ 02:56 PM
a reply to: gladtobehere

Yup, the future is scary.

Let’s just get used to it already...




posted on Oct, 27 2018 @ 03:03 PM
Freejack is coming.



posted on Oct, 27 2018 @ 03:47 PM
Yes but the thing is all of the deepfake videos are still obviously fake. They don't cross the uncanny valley yet.

Maybe they will one day. But that day is not yet today.



posted on Oct, 27 2018 @ 04:04 PM
a reply to: gladtobehere

I think this will ultimately be a good thing. It may take time and conflict to realize this, but we live and breathe MSM. If one day we realized that our eyes and ears can no longer trust what the media is presenting to us, then maybe we'll take it upon ourselves to seek a higher truth elsewhere.



posted on Oct, 27 2018 @ 06:42 PM
a reply to: gladtobehere

"deepfake"... pun intended?



posted on Oct, 27 2018 @ 08:21 PM
I'm going to need some proof of these claims. Specifically, proof of the Emma Watson and Gadot stuff; I don't believe it.



posted on Oct, 27 2018 @ 08:36 PM
It's almost to the point where it is undetectable to the human eye. That could get out of hand.

It's still pretty easy to tell the difference between real and fake from what they showed us in that video, though.
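For what it's worth, some of the "tells" people spot by eye have also been checked programmatically. A toy Python sketch of one early published idea (real people blink regularly; early deepfake models, trained mostly on open-eyed photos, rarely did) — the function names, thresholds, and clips below are made up for illustration, not taken from any real detector:

```python
def blink_rate(eye_openness, threshold=0.2):
    """Blinks per frame: count dips of the eye-openness signal below threshold."""
    blinks = 0
    below = False
    for o in eye_openness:
        if o < threshold and not below:
            blinks += 1          # a new dip started: count one blink
            below = True
        elif o >= threshold:
            below = False        # eyes open again: ready for the next blink
    return blinks / len(eye_openness)

def looks_fake(eye_openness, min_rate=0.01):
    """Flag a clip whose subject blinks suspiciously rarely."""
    return blink_rate(eye_openness) < min_rate

# Synthetic per-frame eye-openness traces for two 66-frame clips.
real_clip = [1.0]*20 + [0.1]*3 + [1.0]*20 + [0.1]*3 + [1.0]*20  # blinks twice
fake_clip = [1.0]*66                                            # never blinks

print(looks_fake(real_clip), looks_fake(fake_clip))  # → False True
```

Modern fakes have largely closed this particular gap, which is the general pattern: each published tell gets trained away in the next generation.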



posted on Oct, 27 2018 @ 11:24 PM
There has to be some monetary goal to make it widespread and keep it ahead of detection methods. I don't think celebrity p0rn can cover the development costs. Universities have to get funding too. It would appear to be a dead end....

But then there is the intelligence community. They would surely want to keep ahead of their foes with the latest tech in development and detection. Sadly, they will probably be funding more of these projects through universities.

That will raise the likelihood of more mainstream applications being discovered that would help make it a monetized industry.

It will likely be a readily available app in 10-15 years. Only God knows why...



posted on Oct, 28 2018 @ 07:07 AM

originally posted by: booyakasha
it's still pretty easy to tell the difference between real and fake from what they showed us in that video though.

From what I understand of it, it depends on how much information we give the software: if we give it 15 minutes of video of someone it will be able to create a fake video, but if we give it 15 hours it will be able to create a much better one, as it will have much more data to learn from.

It's mostly a matter of the time you are willing to commit to the work; if you want a convincing result you will need to spend more time on it.
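That data-quantity point can be seen in a toy model. The sketch below is a stand-in linear mapping, not the actual deepfake autoencoder: we pretend the source-face-to-target-face mapping is a fixed linear transform and fit it from examples, showing that too few examples leave the fit badly wrong while plenty of examples nail it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend the "true" source-face -> target-face mapping is a linear transform
# over 8 features (a cartoon of what the real network has to learn).
true_map = rng.normal(size=(8, 8))

def train_and_test(n_samples):
    """Fit the mapping from n_samples examples; return held-out squared error."""
    X = rng.normal(size=(n_samples, 8))            # source-face features
    Y = X @ true_map                               # matching target-face features
    learned, *_ = np.linalg.lstsq(X, Y, rcond=None)
    X_test = rng.normal(size=(100, 8))             # unseen test frames
    return float(np.mean((X_test @ learned - X_test @ true_map) ** 2))

err_little_data = train_and_test(6)    # fewer examples than unknowns: poor fit
err_more_data = train_and_test(200)    # ample examples: near-exact fit
print(err_little_data > err_more_data)  # more data -> lower error
```

The real systems are nonlinear and far larger, so the gap between "15 minutes" and "15 hours" of footage is even starker than this linear toy suggests.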



posted on Oct, 28 2018 @ 07:44 AM
How soon before AI starts to create them all on its own, i.e., no human involved in the "deepfake" video release? We haven't even mastered picking out the fakes, and in that scenario the fakes would already be the ones creating them.

leolady



posted on Oct, 28 2018 @ 08:22 AM

originally posted by: leolady

How soon before AI starts to create them all on its own, i.e., no human involved in the "deepfake" video release? We haven't even mastered picking out the fakes, and in that scenario the fakes would already be the ones creating them.

leolady


That is an interesting threat, but I'm more worried about someone, or something, asking AI what's wrong with the Earth. The obvious answer is "people", so the fix is equally obvious.



posted on Oct, 28 2018 @ 09:12 AM

originally posted by: leolady
How soon before AI starts to create them all on its own, i.e., no human involved in the "deepfake" video release?

I think it will take some time.

Most of the things that are presented today as AI are not really capable of intelligent action. In the case of deepfakes, what the system does is learn how the person in those images moves, so it can take another person's face and move it in the same way. That's why the more images we give it, the better the final result is.

For something like what you describe to happen, the system would need to want to do it, look for the data it needs, work with the data, and then publish the result. Of those four actions it can only perform one at the moment: working with the data. "Wanting to do it on its own" is something I don't see AI doing in the near future.



posted on Oct, 28 2018 @ 10:15 AM
a reply to: gladtobehere

Next step may be to build a virtual universe, populate it with these created facsimiles and see what they do next.



posted on Oct, 28 2018 @ 10:33 AM
His nuclear war scenario isn't realistic. NoKo doesn't launch just because they see a video on Twitter. If they don't detect anything coming, they don't launch.



posted on Oct, 29 2018 @ 05:19 AM
You still need some expensive computer hardware and a lot of time to do more than a 30-second clip, so making completely realistic, high-quality videos is still out of reach for most people.

Not to mention this is very old news and you're a bit late to the party, OP. The software has been out in the wild for years now.

Who needs deepfakes anyway? People, including the MSM, already have access to the basic video editing functions that have always existed, which let you trim down any video to change the context of what the speaker is saying, and most will only hear what they want to anyway.


