originally posted by: Shamrock6
a reply to: RalagaNarHallas
Casinos have been using the technology for several years. I remember a buddy of mine talking about it as early as the beginning of the 2000s. With a) more money to throw at it and b) nearly two decades of tweaking it, I don't find it surprising at all that they're miles ahead of the versions mentioned in the OP.
originally posted by: RalagaNarHallas
www.facefirst.com... seems to work for casinos, but they may be throwing more money at it and using more cameras?
gnvqol.org...
Biometric Facial Recognition

To begin with, there are two forms of biometric facial recognition today. The simplest systems are “dumb” – in the sense that all they do is take snapshots of everyone entering the premises, sitting at a table, or using a machine. “Smart” systems go one step further: they actively read the snapshots and compare them against a database that proactively alerts security. Even if you have been banned, on the dumber systems you can still sit and play – as long as you don’t make large bets or cause a big scene. If you do make a scene, a “dumb” system’s operators may send over an undercover security person to take a high-res picture of you and compare it to their database using a smarter system. If they have smart systems in place, you are screwed. Some state-of-the-art systems rest on patents gained from non-obvious designs. What do you think?
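The “dumb” versus “smart” split in the quote above comes down to one thing: whether the system merely archives snapshots, or compares each one against a watchlist and raises an alert. A minimal sketch of that comparison step, assuming embeddings are compared by cosine similarity (the names, vectors, and the 0.9 threshold are all invented for illustration):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

archive = []

def dumb_system(snapshot):
    """'Dumb' system: just archive every snapshot, nothing else."""
    archive.append(snapshot)

def smart_system(snapshot_embedding, watchlist, threshold=0.9):
    """'Smart' system: compare the snapshot against a banned-player
    watchlist and alert security on a close enough match."""
    for name, banned_embedding in watchlist.items():
        if cosine_similarity(snapshot_embedding, banned_embedding) >= threshold:
            return f"ALERT: possible match for {name}"
    return None

watchlist = {"banned_player_1": [0.1, 0.9, 0.4]}
print(smart_system([0.1, 0.9, 0.4], watchlist))
```

The point of the sketch: the camera hardware is identical in both cases; only the comparison-and-alert loop makes a system “smart.”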
and NORA seems to be one of the more successful versions
NORA – Non-Obvious Relationship Awareness

NORA is a system of both love and hate. It is the system the Department of Homeland Security started using to identify terrorists and terrorism links after 9/11. Yes, it can help find and defeat terrorists. But on the other hand, with little more than a relationship it gives the government an excuse to trample on your liberties and rights. If you combine facial recognition with NORA, it can show how two people sitting at a table together might be related – even if it is because they were frat brothers 20 years ago and five states away. It can also show that the dealer is a distant cousin of a player who started hitting it big. NORA most recently goes by the name IBM Relationship Resolution (we personally liked NORA better), but you get the idea. NORA is designed to recognize in seconds relationships that humans could never know at first glance. It can also pull criminal records and local arrest records.
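The kind of “non-obvious relationship” described above is, at bottom, a path search over linked records: shared addresses, schools, phone numbers, and so on. A toy sketch of that idea (every name and link here is invented for illustration, and real relationship-resolution systems are far more involved):

```python
from collections import deque

# Toy record-linkage graph: an edge means two identities share
# some attribute. All names and links are made up for the sketch.
links = {
    "dealer": ["aunt_marie"],
    "aunt_marie": ["dealer", "player"],
    "player": ["aunt_marie", "old_frat_house"],
    "old_frat_house": ["player"],
}

def find_relationship(graph, start, goal):
    """Breadth-first search for the shortest chain linking two people."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain of links connects the two

print(find_relationship(links, "dealer", "player"))
```

Run over millions of records, a search like this surfaces chains – dealer, shared relative, player – that no human floor manager would spot at a glance, which is exactly the capability (and the civil-liberties worry) the quote describes.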
A few more links. This one compares comparable software: www.kairos.com...
www-03.ibm.com... IBM is throwing in voice recognition as well
originally posted by: TonyS
a reply to: Krakatoa
Really? So that stuff on the TV shows where they identify the perp from a CCTV shot is all bunk? What the hell is the point of having the damn cameras in the first place? So there's all these cameras downtown, and they've got some morbidly obese contractor dude watching six screens as he finishes off another cola and bag of Cheetos, watching some guys mug a woman, run off with her purse, and get away without the cops being able to identify the perps?
Good grief... what a friggin' joke that is.
originally posted by: nerbot
a reply to: Krakatoa
Perhaps they fooled you all, methinks...
To use the system when others don't want them to, why not simply pretend it doesn't work by temporarily tweaking the algorithms? Put it on show, and the focus of attention shifts to something that presents more of a threat, leaving free, functional use of the system in the future.
Classic.
originally posted by: Krakatoa
A “top-of-the-line” automated facial recognition (AFR) system trialled for the second year in a row at London’s Notting Hill Carnival couldn’t even tell the difference between a young woman and a balding man, according to a rights group worker invited to view it in action.
Because yes, of course they did it again: London’s Met police used controversial, inaccurate, largely unregulated automated facial recognition (AFR) technology to spot troublemakers. And once again, it did more harm than good.
Last year, it proved useless. This year, it proved worse than useless: it blew up in their faces, with 35 false matches and one wrongful arrest of somebody erroneously tagged as being wanted on a warrant for a rioting offense.
London police’s use of facial recognition falls flat on its face
Wow, talk about a failed system. The police even declared it a "success"? REALLY?!?!?
In spite of its lack of success, the Met’s project leads viewed the weekend not as a failure, but as a “resounding success,” Carlos said, because it had come up with one, solitary successful match.
Even that was skewered by sloppy record-keeping that got an individual wrongfully arrested: the AFR was accurate, but the person had already been processed by the justice system and was erroneously included on the suspect database.
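Taken together, the figures quoted above make for a dismal hit rate: one genuine match against 35 false ones. A quick back-of-the-envelope check of the precision:

```python
# Precision of the Notting Hill Carnival AFR trial, using only the
# figures reported above: 35 false matches, 1 true match.
true_matches = 1
false_matches = 35
flagged = true_matches + false_matches  # 36 people flagged in total

precision = true_matches / flagged
print(f"precision: {precision:.1%}")  # roughly 2.8% of flags were correct
```

In other words, better than 97% of the people the system flagged were not who it said they were.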
So, surely the American version by the FBI (that bastion of legal integrity) has a higher success rate since they have a much larger budget I am sure! Not so much it seems.......
Studies bear out the claim that AFR is an inherently racist technology. One reason is that black faces are over-represented in face databases to begin with, at least in the US: according to a study from Georgetown University’s Center for Privacy and Technology, in certain states, black Americans are arrested up to three times their representation in the population. A demographic’s over-representation in the database means that whatever error rate accrues to a facial recognition technology will be multiplied for that demographic.
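The “error rate multiplied for that demographic” point can be put in numbers. A hypothetical sketch, assuming a uniform per-entry false-match rate (all figures below are invented for illustration, not taken from the Georgetown study):

```python
# Hypothetical illustration of the over-representation effect
# described above. All numbers are made up for the sketch.
database_size = 100_000
false_match_rate = 0.01          # false matches per enrolled face (assumed)

share_of_population = 0.13       # demographic's share of the population
over_representation = 3          # enrolled at 3x that share (per the quote)
share_of_database = share_of_population * over_representation

expected_false_matches = database_size * false_match_rate
hits_on_group = expected_false_matches * share_of_database
print(f"{hits_on_group:.0f} of {expected_false_matches:.0f} false matches "
      f"land on a group that is only {share_of_population:.0%} of the population")
```

Even with an algorithm that errs at the same rate on every face, the group enrolled at three times its population share absorbs three times its share of the mistakes.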
Beyond that over-representation, facial recognition algorithms themselves have been found to be less accurate at identifying black faces.
During a recent, scathing US House oversight committee hearing on the FBI’s use of the technology, it emerged that 80% of the people in the FBI database don’t have any sort of arrest record. Yet the system’s recognition algorithm inaccurately identifies them during criminal searches 15% of the time, with black women most often being misidentified.
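Those two figures compound badly. A rough sketch of what they imply together, assuming misidentifications fall uniformly across the database (an assumption for illustration, not a claim from the hearing):

```python
# Combining the two figures quoted above: criminal searches
# misidentify at 15%, and 80% of database entries belong to
# people with no arrest record at all.
searches = 1_000                 # assumed number of searches, for scale
misid_rate = 0.15                # false-identification rate, per the article
no_arrest_share = 0.80           # share of entries with no record, per the article

misidentifications = searches * misid_rate
# If errors fall uniformly across the database, most of them
# point at people who were never arrested:
errors_on_innocent = misidentifications * no_arrest_share
print(f"~{errors_on_innocent:.0f} of {misidentifications:.0f} misidentifications "
      f"per {searches} searches would hit people with no arrest record")
```

So under that uniform-error assumption, the bulk of the system's mistakes land on people who should arguably not be in the database in the first place.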
After these repeated failings, I would guess the programs would be deep-sixed. No, they are doubling down now... and plans to continue its use on a larger scale are likely in the works. After all, they need to protect their jobs after deciding to spend so much money on a failed technology. As I see it, they bought a bad bag of apples and are determined to eat them even if people get sick and die in the process.
originally posted by: starwarsisreal
a reply to: Krakatoa
I'm pretty sure the US military has a more advanced version; they just keep it to themselves.
Facebook, according to the company, is able to accurately identify a person 98 percent of the time.