
Emotion reading technology claims to spot criminals before they act

posted on May, 10 2017 @ 06:56 PM
This is interesting technology that can do a lot of good, but it can also be badly abused.


Emotion reading technology could soon be used by police after a Russian firm created a tool that can identify people in a crowd and tell if they are angry, stressed or nervous.

The software, created by NTechLab, can monitor citizens for suspicious behaviour by tracking identity, age, gender and current emotional state. It could be used to pre-emptively stop criminals and potential terrorists.

"The recognition gives a new level of security in the street because in a couple of seconds you can identify terrorists or criminals or killers," said Alexander Kabakov, NTechLab chief executive.

The emotion recognition tool is a new part of NTechLab's facial recognition software, which made the headlines last year when it was used to power the FindFace app that can track down anyone on Russian social network VKontakte from a photo.

The identification app claims to have reconnected long-lost friends and family members, as well as helped police solve two cold cases and identify criminals.


www.telegraph.co.uk...

First off, the false positives will initially be through the roof as the system learns to separate people who are stressed, angry and emotional over losing their job from a terrorist or a criminal on the run.

The A.I. will eventually learn how to separate these things but it will take a lot of data.
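The false-positive problem above is really a base-rate problem: when genuine threats are extremely rare in a crowd, even an accurate detector produces mostly false alarms. A rough back-of-the-envelope sketch, using entirely hypothetical accuracy numbers:

```python
# Hypothetical numbers only -- an illustration of the base-rate effect,
# not the real performance of NTechLab's system.
def alarm_precision(base_rate, sensitivity, false_positive_rate):
    """P(actual threat | alarm), via Bayes' rule."""
    true_alarms = base_rate * sensitivity
    false_alarms = (1 - base_rate) * false_positive_rate
    return true_alarms / (true_alarms + false_alarms)

# Suppose 1 in 100,000 people scanned is a genuine threat, and the
# system is 99% sensitive with only a 1% false-positive rate.
p = alarm_precision(1e-5, 0.99, 0.01)
print(f"{p:.4%}")  # roughly 0.1% -- almost every alarm is a false one
```

Even with generous assumptions, the crowd of stressed commuters swamps the handful of real threats, which is why the system would need enormous amounts of data before its alerts meant much.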

The Find Face app has been used to help police but this can really be abused. You can snap a photo of someone and find out everything about them in minutes.


A facial recognition app that can work out the identities of strangers in a crowd by matching their faces with profiles on social media is taking Russia by storm.

In just two months FindFace has gathered 500,000 users who have run nearly 3 million searches, according to its founders.

Russian photographer Egor Tsevtkov wanted to know how much personal data he could find out about complete strangers on the underground. With this idea in mind he photographed a selection of strangers and used free facial recognition software, with terrifying results.

The software he used is a website called Find Face, which lets users look up people online using a photo, and accurately matches them up with a Facebook profile, based on their faces.

"I learnt a lot about a person's life without any contact," said Tsevtkov. "I felt slightly uncomfortable."

"I photographed people who were sitting in front of me in the subway, and then looked for them in social networks using open source software," said Tsevtkov, who called the resulting project "Your face is Big Data".


www.telegraph.co.uk...
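Apps like FindFace generally work by converting each face photo into a numeric "embedding" and returning the stored profile whose embedding is closest to the query. A minimal sketch of that nearest-neighbour idea, with made-up embeddings and profile names (real systems use deep networks producing hundreds of dimensions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_match(query, profiles):
    """Return the profile whose embedding is most similar to the query face."""
    return max(profiles, key=lambda name: cosine(query, profiles[name]))

# Toy database: each profile maps to a (hypothetical) face embedding.
profiles = {
    "profile_a": [0.9, 0.1, 0.3],
    "profile_b": [0.1, 0.8, 0.5],
}
print(best_match([0.85, 0.15, 0.25], profiles))  # prints "profile_a"
```

The unsettling part is how cheap this lookup is: once a platform's photos have been embedded, matching a stranger's face against half a billion profiles is just a similarity search.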

This is from the company's website.

NTechLab was founded in 2015 by Artem Kuharenko to create algorithms as intelligent as humans and as efficient as machines. Our team uses the most advanced techniques in the field of artificial neural networks and machine learning to develop smart and innovative software products.

ntechlab.com...

Imagine a technology that can track people down from a photo. So a woman running from a creep who hits her wouldn't be able to hide. You can find anybody just about anywhere with just a photo.

Criminals would have to go deep into the woods to try and stay hidden. Another post discussed an A.I. that was developed to use the computing power of nearby machines. It would be like that movie with Denzel Washington called Fallen, where the evil spirit or whatever it was could leap from person to person with just a touch.

This A.I. could follow you around using the computational power of other devices. You would have to throw away your phone and hide in a sparse or wooded area.



posted on May, 10 2017 @ 07:01 PM
#ING HELL



posted on May, 10 2017 @ 07:43 PM
I think this AI will have a heart attack if it sees me. It will turn all religious because of its fear.



posted on May, 10 2017 @ 07:58 PM
a reply to: neoholographic

Sounds very Orwellian to me, such technology will inevitably be abused by tptb.

edit on 10-5-2017 by andy06shake because: (no reason given)



posted on May, 10 2017 @ 08:34 PM
a reply to: neoholographic

Nothing like pre-crime to bring on the totalitarian state.

This is a horrible technology and idea.



posted on May, 10 2017 @ 08:44 PM

originally posted by: neoholographic
This is interesting technology that can do a lot of good but it can really be abused.


Though it can't stop dangerous people who know how to control their emotions. Show me a pattern-recognition AI that can spot the person with so much emotional control and will they could do anything, and I'll show you nine figures of VC. Or maybe just another missing person and missing IP.



posted on May, 10 2017 @ 08:49 PM
a reply to: neoholographic

They can't even tell which cons are going to get out and strike again. This is some segue into facial rec, and by God the good little boys and girls will get with the program.



posted on May, 10 2017 @ 09:06 PM
a reply to: neoholographic




posted on May, 10 2017 @ 10:11 PM
a reply to: neoholographic

Frankly, when I read here in the states that a criminal who has fled the area is found in some other part of the country, I wonder if that system is not already active in the US. 'Course, the growing use of police casually collecting license plate numbers with automated drive-by systems would be a big help in finding those people, but facial recognition software may already be employed.

Don't dismiss the possibility out of hand. When you drive under a toll road camera it not only takes a picture of your car and plate, but also of the driver.



posted on May, 10 2017 @ 10:38 PM
Wouldn't this qualify as the worst kind of profiling possible? Or the flip side: it sounds like the county sheriff who said he pulls people over for doing everything right because it's so suspicious, like they are trying to hide something...



posted on May, 11 2017 @ 12:31 AM
Outrage.



posted on May, 11 2017 @ 03:57 AM
I wrote a thread on this in 2012 called 'Thought Tools Tracking Mind Waves'. It got no flags, stars or replies. I'd written it as directed to the developers of this technology, warning them or any reader that justice must be the ultimate end towards which the application is put. That it can't, and in fact won't, be used for ill purposes, at least in the long run. Scum always floats, and the gold always sinks to the bottom of the pan. There's no avoiding the truths in life. It has been that way for a long old time too, and nobody, simply nobody, can controvert this.



posted on May, 11 2017 @ 07:59 AM
a reply to: neoholographic

We've got so much tech following us about now but nothing much has changed regarding crimes such as terror attacks, rapes, murders etc.

What makes you think this tech will work? All of the above still occur on a daily basis with all the CCTV around.

Also some people wish to remain anonymous... what right does someone have to take that anonymity away? I see lots of suing going on in the future.
edit on 11-5-2017 by TruthxIsxInxThexMist because: (no reason given)



posted on May, 11 2017 @ 10:05 AM
a reply to: neoholographic

This is going to suck for people who suffer from 'resting bitch face'.





posted on May, 11 2017 @ 11:24 AM

originally posted by: Namdru

originally posted by: neoholographic
This is interesting technology that can do a lot of good but it can really be abused.


Though it can't stop dangerous people who know how to control their emotions. Show me a pattern-recognition AI that can spot the person with so much emotional control and will they could do anything, and I'll show you nine figures of VC. Or maybe just another missing person and missing IP.


It's going to miss the really bad guys. The worst habitual criminals are sociopaths/psychopaths who have naturally low arousal. That's part of why they "thrill seek" with criminal activity. I'm not just talking about serial killers here, I mean everything from drugs and petty thievery on up.

This technology will pick up on a bunch of anxious people who have not, likely will not, and certainly aren't about to engage in criminal activity, for the most part IMO.

I don't think the purpose is to catch criminals per se at all.



posted on May, 11 2017 @ 12:11 PM
I'm reminded of the Twilight Zone episode in which a bank teller can suddenly read the thoughts of people. He begins to hear the thoughts of a coworker, who (in his mind) is considering stealing money from the bank and fleeing to Bermuda.

It turns out that the man has had this same fantasy every day for all the years he has worked in that bank: stealing some of the money that he sees move past him every day, but of course he would never act upon that fantasy.

Thoughts and emotions we know we should NOT act upon are just as real as the thoughts we do act upon. There is quite a bit of "reasonable doubt" that could be claimed by anyone who is caught and arrested by using this technology.


edit on 11/5/2017 by Soylent Green Is People because: (no reason given)



posted on May, 11 2017 @ 05:32 PM
I'm curious to know what kind of criminals this technology is supposed to catch. The ones who act out of passion? Sorry, but most violent criminals don't operate like that.

They're largely sociopathic; they do not experience emotion...although they're highly skilled at fooling other people into thinking they do. A machine won't catch them, because they don't believe what they're doing is wrong in the first place, therefore it requires no emotional investment. They detach from people. View them as little more than puppets...something to toy with. People who are thinking about committing a crime out of desperation or other intense emotional factor, yes. I could see that.

But at what cost? If we allow tech like this to be implemented, it's only a matter of time before someone uses it to either control or cause harm to other human beings. That is inevitable and unavoidable. The risk to personal freedom and privacy is huge here. The possible ways that this could be used for nefarious purposes are many.

Everyone is so worried about AI becoming truly sentient....but the bigger threat, in my opinion, is the entities who would be entrusted to use it without abusing it, and the consequences of choosing to trust the wrong people with that responsibility. No matter how much good might come from using tech like this, the potential for the exact opposite happening is just too high.



posted on May, 11 2017 @ 10:55 PM
I never thought I'd ever want to wear a Burqa, but after reading this, the idea has some appeal. I'm not even thinking about law enforcement being a problem. It's unlikely they would bother me with this technology at this point in my life. I don't like the idea of some of my relatives tracking me down with this. I've got some really annoying relatives. The last thing I need is for them to track me down and start bugging me again. Facebook used to be a fun way to keep in touch with a few of my college friends. Now it's gotten so intrusive and invasive and creepy, that a lot of us don't enjoy using it and have gone back to group emails and hand written letters and notes. I suppose that dork who created Facebook will find some way of infiltrating all of that with his tracking schemes, too.


