Since its inception, the Department of Homeland Security has promoted modern technology as a way to save the nation from terrorism, and it’s done so in part by emulating the Pentagon’s preoccupation with science and experimentation. Some of the country’s most significant achievements, in fact, were conceived by pioneering researchers the government hired to help give warfighters an advantage over their enemies.
The Defense Advanced Research Projects Agency, or DARPA, performed many of those tasks, taking credit in part for the Stealth Fighter and pilotless drones, as well as other advancements that are slightly more civilian in nature, like the Internet. DARPA is also frequently cast in popular culture and science fiction as the government’s secret laboratory for building disturbing tools that can do things like read our minds.
That distinction now belongs to DARPA’s sister agency, the Homeland Security Advanced Research Projects Agency, and officials are suggesting they’re closer to making the unthinkable a reality. Here at Elevated Risk, we’re reluctant to lead the tin-foil hat and 9/11 Truth crowds into fanciful conspiracy theories, but there’s no doubt the Department of Homeland Security believes it’s possible for the right technology to “sense” human intentions.
Among the experimental achievements listed in the department’s budget request this year is the testing of a “real-time malintent detection capability,” or machinery that can measure things like heart rate, micro-facial expressions, breathing patterns and body heat as an individual walks through a security portal. Software algorithms would determine whether a combined set of behaviors and physiological qualities amounted to someone hiding plans to carry out a terrorist attack.
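The department has not published how its algorithms actually work, so the following is purely a hypothetical illustration of the general idea: several physiological readings are compared against baselines, weighted, and summed into a single score that trips a screening flag. Every signal name, baseline, weight, and threshold below is invented for the sake of the example.

```python
# Hypothetical sketch of a multi-signal screening score.
# This is NOT the DHS/FAST algorithm, which is unpublished; all
# values here are invented to illustrate the general approach of
# combining weighted deviations from baseline into one decision.

def malintent_score(readings, baselines, weights):
    """Sum weighted deviations above baseline across all signals.

    Readings at or below baseline contribute nothing, so ordinary
    variation doesn't accumulate into a false positive.
    """
    score = 0.0
    for signal, value in readings.items():
        deviation = max(0.0, value - baselines[signal])
        score += weights[signal] * deviation
    return score

def flag_for_screening(readings, baselines, weights, threshold=1.0):
    """Flag the traveler if the combined score crosses a threshold."""
    return malintent_score(readings, baselines, weights) >= threshold

# Invented example parameters for three physiological channels.
BASELINES = {"heart_rate": 72.0, "breathing_rate": 14.0, "skin_temp": 33.5}
WEIGHTS = {"heart_rate": 0.05, "breathing_rate": 0.1, "skin_temp": 0.5}

calm = {"heart_rate": 70.0, "breathing_rate": 13.0, "skin_temp": 33.0}
elevated = {"heart_rate": 95.0, "breathing_rate": 22.0, "skin_temp": 34.4}

print(flag_for_screening(calm, BASELINES, WEIGHTS))      # False
print(flag_for_screening(elevated, BASELINES, WEIGHTS))  # True
```

Even in this toy form, the design problem the department faces is visible: the thresholds and weights decide the trade-off between false alarms on nervous but innocent travelers and missed detections, and nothing in the score distinguishes fear of being caught from ordinary travel stress.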
“It actually seems to be working,” Robert Burns, program manager of the Future Attribute Screening Technology, as it’s known, told an industry publication last year. Experiments have been conducted on hundreds of people so far.
Writing for the United States Naval Institute in October, another official from the homeland security division that houses HSARPA said the program “aims to increase the accuracy and validity of detecting people planning destructive acts based simply on how they present themselves and behave in a given situation. The theoretical foundation for this program is the relatively new theory of malintent, which proposes that individuals with the intent to cause harm may experience and display distinct emotional, cognitive, and behavioral cues or signals.”
Behavioral detection is nothing new for airport security officers, and the Obama Administration wants to hire hundreds more people this year who specialize in looking for subtle bodily reactions that human beings have difficulty controlling when they fear misbehavior will be discovered. Directing a machine to know when someone should be singled out by authorities, however, is altogether different for a country that’s spent two centuries frowning upon excessive government intrusion.
National Defense magazine said late last year that the department had spent $20 million on research for the project so far, and scientists tested the technology at a September conference in Massachusetts using study participants: