
Morality of Deep Fakes

posted on Feb, 9 2018 @ 02:23 AM
Recently some programmers used a deep learning technique to swap out the faces of people in full HD video. They even created an app so that anyone could do this themselves. Predictably, the first thing people did with it was make fake celeb porn videos. Now there's a large crackdown happening on almost all adult video websites, and they are banning these fake videos on the premise that they represent "involuntary pornography". This seems like a bit of a stretch imo, because they've never had a problem with fake pictures, but when those pictures begin moving it suddenly becomes a bit too much for people.

I find this to be a rather good example of why technology will eventually become so advanced that we can no longer trust any videos or pictures. There's really nothing that can be done to prevent it, and I find it quite silly that adult websites are trying to ban these fake videos. They are still going to be shared, and there will be websites dedicated to sharing them, especially if they aren't allowed on the popular adult sites. If they want to argue these videos are somehow against the law, then they also need to argue that fake pictures are against the law, and that seems a bit dumb to me.



posted on Feb, 9 2018 @ 02:57 AM
a reply to: ChaoticOrder

citation required



posted on Feb, 9 2018 @ 03:01 AM
a reply to: ChaoticOrder

I would guess that the adult sites want to avoid getting tied up in litigation with the celebrities' lawyers. Better to get out in front of it and be able to say they at least tried.



posted on Feb, 9 2018 @ 03:03 AM
a reply to: ChaoticOrder

I looked high and low for something that did anything remotely like what they showed at that TED talk (I think it was?) where they demoed an actor with Obama's face on it, but the company or university making it said it was only a proof of concept, not an actual piece of software available to people?

Have I missed something? An app can do it? I'd love to see it, I could retire my atrocious AE skillz (or lack thereof) lol


--ETA Oh dear, I did miss it... have not even heard of it. I've heard of the machine learning but had no idea it could now do this.



posted on Feb, 9 2018 @ 03:05 AM

originally posted by: ignorant_ape
a reply to: ChaoticOrder

citation required

Lol... well the Deep Fake community was mostly on Reddit, but it got banned yesterday.



posted on Feb, 9 2018 @ 03:09 AM
A video on the subject:


And the FakeApp is still available on Reddit: Fake App



posted on Feb, 9 2018 @ 03:12 AM
a reply to: watchitburn

Yeah I'd say that's probably the main reason, although from a legal perspective it seems kind of hard to argue how a fake video is much different from a fake picture. A video is really just a series of images; the way the algorithm works is to go through each frame in the video and swap out the faces.
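That frame-by-frame idea can be sketched in a few lines of Python. This is a toy illustration, not the actual FakeApp code: the hard paste into a bounding box stands in for the real model's aligned, colour-matched output, and the frames are just arrays rather than decoded video.

```python
import numpy as np

def swap_face(frame, face, box):
    # Paste the replacement face into the detected bounding box.
    # A real pipeline would first align and blend the face; this
    # hard paste is only a stand-in for the model's output.
    y, x, h, w = box
    out = frame.copy()
    out[y:y + h, x:x + w] = face
    return out

def swap_video(frames, face, boxes):
    # A "video" is nothing more than an ordered list of frames:
    # run the same single-image swap on each one, in order.
    return [swap_face(f, face, b) for f, b in zip(frames, boxes)]

# Three blank 64x64 "frames", a white 16x16 "face", one box per frame.
frames = [np.zeros((64, 64, 3), dtype=np.uint8) for _ in range(3)]
face = np.full((16, 16, 3), 255, dtype=np.uint8)
boxes = [(10, 10, 16, 16)] * 3
out = swap_video(frames, face, boxes)
```

So the legal question really is whether running a per-image edit many times in a row changes its nature.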



posted on Feb, 9 2018 @ 03:44 AM
Stephen King was way ahead of the curve with "The Running Man."



posted on Feb, 9 2018 @ 04:38 AM
a reply to: ChaoticOrder

Aside from all the porn, a lot of people are splicing Nicolas Cage into every movie they can get their hands on.

The internet is weird.....



posted on Feb, 9 2018 @ 05:02 AM
a reply to: ChaoticOrder

Video can no longer stand as proof of anything... is what this means...

If we cannot decipher which videos are real (the original source) and which are fake, then none of them can be trusted.

Can they program something on the "back end" of the video to tag it once it has been changed ?

This might ensure that folks will have a way to know when the video has been changed or manipulated to determine its authenticity.

If they can't or do not do this... then video can never again be used as evidence in claims of any sort.
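For what it's worth, the kind of "back end" tag leolady describes could work as a cryptographic signature computed when the footage is recorded: any later edit to the file breaks the check. A minimal sketch using Python's standard library (the key and byte strings here are made up for illustration):

```python
import hashlib
import hmac

def tag_video(video_bytes, key):
    # The camera (or original publisher) signs the footage once;
    # the tag travels alongside the file.
    return hmac.new(key, video_bytes, hashlib.sha256).hexdigest()

def is_authentic(video_bytes, key, tag):
    # Changing even one byte of the video changes the digest,
    # so the stored tag no longer verifies.
    return hmac.compare_digest(tag_video(video_bytes, key), tag)

key = b"camera-secret"         # hypothetical signing key
original = b"raw video bytes"  # stand-in for real footage
tag = tag_video(original, key)
```

The catch, of course, is that this only proves a file matches what the key holder signed; someone passing off a fake can simply publish it with no tag at all, so it establishes provenance rather than detecting fakes outright.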


leolady



posted on Feb, 9 2018 @ 05:07 AM
a reply to: ChaoticOrder



If they want to argue these videos are somehow against the law, then they also need to argue that fake pictures are against the law, and that seems a bit dumb to me.


If I remember right, a few years ago someone made a song that sounded like a famous singer - the singer claimed copyright infringement and won the case.

Apparently it has already been ruled that you have the right to be you and own your own image - if these fake pictures are being used for any profit motive whatsoever, the entertainer could sue - and if it was in porn they could sue for defamation of character.

That is probably why the adult sites barred the pictures.



posted on Feb, 9 2018 @ 05:25 AM
a reply to: leolady


If they can't or do not do this... then video can never again be used as evidence in claims of any sort.

This is the largest implication of this technology imo: anyone can now be framed for anything. Even with tags for detecting a modified video, hackers could simply strip those tags and make the video appear unaltered.

The good news is that these algorithms aren't yet perfect, and with some very simple analysis it's easy to tell whether or not a face has been swapped. Even when they become near-pixel-perfect, slight imperfections should remain that can be found by examining each frame one by one and applying statistical tests.

However, if someone really wanted to frame another person using this technology, they could use it as a starting point and then manually go through each frame and clean up any artifacts. It would still be extremely hard to create something that passes every test, but it's conceivably possible, especially with lower-resolution videos.
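As a toy example of the kind of statistical test meant above, you can compare the high-frequency noise inside the face region against the rest of the frame: a region pasted in from a different source (or produced by a generator) often carries a noticeably different noise level than the surrounding footage. The threshold and the synthetic data below are made up purely for illustration.

```python
import numpy as np

def noise_level(region):
    # Crude high-frequency noise estimate: mean absolute
    # difference between vertically adjacent pixels.
    return np.abs(np.diff(region.astype(float), axis=0)).mean()

def looks_swapped(frame, box, ratio=2.0):
    # Flag the frame if the face region's noise level differs
    # from the whole frame's by more than the given ratio.
    y, x, h, w = box
    inside = noise_level(frame[y:y + h, x:x + w])
    outside = noise_level(frame)
    hi, lo = max(inside, outside), min(inside, outside)
    return hi / max(lo, 1e-9) > ratio

# Synthetic demo: a noisy "camera" frame, and a copy with a
# suspiciously smooth patch pasted over the face region.
rng = np.random.default_rng(0)
clean = rng.normal(128, 10, (64, 64))
faked = clean.copy()
faked[10:26, 10:26] = 128.0
```

A careful faker defeats exactly this kind of test by matching the noise by hand, frame by frame, which is the manual clean-up described above.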



posted on Feb, 9 2018 @ 05:37 AM
a reply to: AlienView


If these fake pictures are being used for any profit motive whatsoever - the entertainer could sue - And if it was in porn they could sue for defamation of character.

Yes, profiting on such fakes is probably one of the core issues here. However, most of the large adult websites allow fake pictures of celebrities, and they must profit from them because they run ads. There are also many websites dedicated to sharing fake celeb pics, and I'm fairly sure they'd have lots of adverts, yet they don't seem to have legal issues.

However, I'm sure some of them have been sued over fake pictures in the past; it would be very interesting to find one of those cases and see what the outcome was. Has a celebrity ever managed to have fake pictures of themselves removed from a website? I recall reading something about this on the Deep Fake subreddit before it got banned.



posted on Feb, 9 2018 @ 06:30 AM
Has no one ever been to Sea World and had their picture faked riding on an Orca?
22 years ago ?



posted on Feb, 9 2018 @ 07:20 AM

originally posted by: ChaoticOrder
A video on the subject:


And the FakeApp is still available on Reddit: Fake App


Good video, though I'm one of those strange freaks who isn't into fake porn, or much real... it's all fake. But the Nicolas Cage things look neat.


Funny thing, I'd just started getting it... 1.8GB? A whole 3 minutes... lol. I don't think my little lady will be able to run it but I'll still give it a looksee.

For all the bad this can entail, I find it fascinating. I used software recently that could take drone footage looking down, just normal footage, and you could create a 3D object of a house, or whatever, just by how the software worked. Nothing like this, but it shows how far 3D has come, when you once needed intricate scanners or really terrible dodgy setups involving masking out a sliver of the object.

Off to get me 1 million Malcolm Turnbull images and footage. lol..




posted on Feb, 9 2018 @ 07:22 AM

originally posted by: Quantumgamer1776
a reply to: ChaoticOrder

Aside from all the porn, a lot of people are splicing Nicolas Cage into every movie they can get their hands on.

The internet is weird.....


The John Travolta thing had run its course, you see.




posted on Feb, 9 2018 @ 07:37 AM

originally posted by: ChaoticOrder
a reply to: leolady

The good news is that these algorithms aren't yet perfect, and with some very simple analysis it's easy to tell whether or not a face has been swapped.


That was what my initial reply was about too. I mean, we've come a long way from publicly being able to do this, with Google DeepMind,



To this,



Which was not available to the public last I checked... but well, on the internet no one can hear your watch tick... Literally. O.o



posted on Feb, 9 2018 @ 08:00 AM

originally posted by: Gothmog
No one ever been to Sea World and had their picture faked riding on an Orca ?
22 years ago ?


No one wants to see that porno


*shudders*


XL5

posted on Feb, 9 2018 @ 08:49 AM
This gives me great hope for Goonies 2 and Ghostbusters 3. You know it won't be long before they put Trump in every movie, like Kindergarten Cop.



