

New AI can guess whether you're gay or straight from a photograph


posted on Sep, 8 2017 @ 12:30 AM

originally posted by: imitator
I bet... they will add this A.I. software to Facebook.

Probably, after all they run the best:
CIA book.

posted on Sep, 8 2017 @ 12:37 AM
I wonder how it would cope with those big butch leather man guys with their moustaches? (Magnum style)


edit on 8-9-2017 by Lagomorphe because: Crap spelling

posted on Sep, 8 2017 @ 01:07 AM

originally posted by: carewemust

originally posted by: Tardacus
a reply to: starwarsisreal

91% accuracy is pretty impressive, so if it is 91% accurate at picking out criminals, that would help the police a great deal, because the police can't spot a criminal sitting on the other side of that jelly donut they're stuffing in their pie holes.

Notice the report said UP TO 91% accuracy. That could be anywhere from 1% to 91%. Maybe it was 91% for one guy.

You got it wrong. This is a published paper in a respected scientific journal. It's 91%, and "up to" just means the algorithm was more accurate when it had more information. It said:

The study from Stanford University – which found that a computer algorithm could correctly distinguish between gay and straight men 81% of the time, and 74% for women – has raised questions about the biological origins of sexual orientation, the ethics of facial-detection technology and the potential for this kind of software to violate people’s privacy or be abused for anti-LGBT purposes.

So it was 81% of the time for men at first. It then says:

Human judges performed much worse than the algorithm, accurately identifying orientation only 61% of the time for men and 54% for women. When the software reviewed five images per person, it was even more successful – 91% of the time with men and 83% with women. Broadly, that means “faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain”, the authors wrote.

So it could be accurate 81% of the time with 1 picture, but if it had 5 pictures of the person, its accuracy went up to 91%.

These are scientists, not idiots, and there's no way you could get a paper published with a random range of 1% to 91%.
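As a toy illustration of why accuracy climbs with more images: if each picture gave an independent 81%-accurate prediction (an assumption for the sketch — the paper does not say the model works by voting), a simple majority vote over 5 pictures would already beat 81%:

```python
from math import comb

def majority_vote_accuracy(p, n):
    """Probability that a majority of n independent predictions,
    each correct with probability p, gets the right answer."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Hypothetical numbers: 81% per-image accuracy, 5 images per person.
print(round(majority_vote_accuracy(0.81, 5), 3))  # ≈ 0.949
```

Real per-image predictions aren't fully independent (same face, same grooming), which is one reason the reported 5-image figure is 91% rather than the ~95% this idealized calculation gives.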

posted on Sep, 8 2017 @ 02:44 AM
FINALLY. A working gaydar! Been waiting for this for years. /s

posted on Sep, 8 2017 @ 02:54 AM
a reply to: pirhanna

You know that's been mentioned more than once in this thread as well as in the article. You're redundant.

It's interesting though, that it seems to reinforce the existence of a biological basis for homosexuality.

edit on 9/8/2017 by Phage because: (no reason given)

posted on Sep, 8 2017 @ 02:59 AM
People are treating this like a laughing matter, but what happens when countries that outlaw homosexuality start using this technology? Here's a list of 76 countries that still outlaw homosexuality.

76 countries where homosexuality is illegal

Take this technology and mix in false positives & prison sentences (and sometimes death penalties), and you're left with a potential human rights catastrophe on a global scale.

posted on Sep, 8 2017 @ 03:15 AM
a reply to: enlightenedservant

That's pretty much what's happening already though, isn't it?
I mean, it's not like they get real trials. Is it?

edit on 9/8/2017 by Phage because: (no reason given)

posted on Sep, 8 2017 @ 03:25 AM
a reply to: Phage

Depends on the country. Some have it on the books but don't go out of their way to look for it (as in, only prosecuting it when couples flaunt it in public). But if their facial recognition software at banks, airports, checkpoints, etc can now have this feature, it will only increase the persecution.

The fact that it only seems to need pictures makes it even worse. Any law enforcement agency or intelligence agency can just scan the pictures of opposition leaders, activists, business leaders, & politicians right now, and then use the results for extortion. Even if the results were a false positive, the resulting scandal from the public accusations with scientific "proof" could be enough to ruin a career, provoke "honor killings", get people disowned from family, etc.

posted on Sep, 8 2017 @ 03:28 AM
a reply to: enlightenedservant

Yeah, pretty sure that in most cases a public accusation, with or without evidence, would have that effect anyway.

A tool, perhaps. But one they don't really need.

posted on Sep, 8 2017 @ 03:37 AM
a reply to: Phage

Here's a wiki article about the Gulf Cooperation Council's "homosexuality test" (HERE). It claims that in 2012 more than 2 million expats were tested because they didn't want to allow any homosexuals in because it's illegal. Last year, Human Rights Watch accused 8 countries of using forced anal examinations as homosexuality tests (HERE).

So a tool like the software in the OP would actually make it both more "humane" and more invasive than the existing tests because they could simply scan pictures of any citizens or tourists that they choose. The point I'm making is that it gives them more scientific cover to do this to people.
edit on 8-9-2017 by enlightenedservant because: (no reason given)

posted on Sep, 8 2017 @ 04:18 AM
Can someone please, please, PLEASE take this AI to wherever the next WBC protest is going on?

With as much hatred as those idiots have towards the LGBT community, you'd think that at least a few members would be hiding deep in the closet. I would pay good money to watch them turn on each other.

posted on Sep, 8 2017 @ 07:29 AM
a reply to: enlightenedservant

"Forced anal examinations as homosexuality tests"

That's rape any day of the week!

Plus the hypocrisy is oozing out of such an ordeal. What's the bet it's a few big blokes with handlebar moustaches and repressed homosexual tendencies that perform such acts of depravity???

At the end of the day, people are people, and their sexual preference should be nobody's business but their own, certainly not some other man's or machine's.

posted on Sep, 8 2017 @ 07:50 AM
a reply to: neoholographic

I sure hope it's better than the facial recognition software that the police in London used just recently.

A “top-of-the-line” automated facial recognition (AFR) system trialled for the second year in a row at London’s Notting Hill Carnival couldn’t even tell the difference between a young woman and a balding man, according to a rights group worker invited to view it in action.

edit on 8-9-2017 by hounddoghowlie because: (no reason given)

posted on Sep, 8 2017 @ 08:02 AM
a reply to: hounddoghowlie

Maybe the poor lady was, unfortunately, a touch on the masculine side with a bout of alopecia.

Plus, if you are going to attempt to fool or spoof any facial recognition software, the variety and spice of life displayed at such a festival is really the perfect testing ground to tax its capabilities.

Alas on that occasion it seems to have fallen on its arse somewhat.
edit on 8-9-2017 by andy06shake because: (no reason given)

posted on Sep, 8 2017 @ 08:39 AM
I hope someday AI can predict my death.

posted on Sep, 8 2017 @ 09:35 AM
Next step:
Use said software during trial for homophobic hate speech. I bet 90% would be gay.

posted on Sep, 8 2017 @ 09:41 AM

originally posted by: Tardacus
a reply to: dreamingawake

well, most women are naturally bisexual anyway, right?

that might be because as infants their survival depended on sucking on another females breast.

If males were born gay, I don't think they would take pleasure in sucking from the breasts of a beautiful woman..aaaaaand...I will stop right there.....

posted on Sep, 8 2017 @ 09:51 AM
Read the article carefully - it's facial recognition and grooming style - so it's not all based on a set facial structure. I wonder how much is grooming style and how much is facial structure?

Also, it's based on a dating site where people are publicly out. It's not based on their work or family pictures, and it's not based on people who aren't public or are already in a relationship.

posted on Sep, 8 2017 @ 10:04 AM
Interesting project, but it's not AI. It's facial-recognition software, tweaked and fiddled with a bit. It might one day have applications in AI, but I wouldn't hold your breath if I were you.

Not sure of the implications of this. A person's sexuality is a private matter, unless they choose to make it public. On the one hand, do we want 'robo-outings' of closeted people wrecking their lives? On the other hand, if what other research shows about homophobes is based in truth, namely that they have positive sexual responses to gay porn, who wouldn't doubt that their hypocrisy needs to be exposed?

Some very philosophical questions lurking behind all this.

posted on Sep, 8 2017 @ 10:59 AM

originally posted by: DBCowboy
a reply to: neoholographic

hahahahaha. . . . .

They finally invented "Gaydar".


Somebody call Dwight it has arrived!!!!
