
Be a Civilian Hack for DARPA


posted on Jul, 24 2012 @ 03:26 PM
The government has TOO MUCH data on you, and cannot figure out how to effectively leverage it.

Would *you* like to help?
Have you seen that commercial in the Starship Troopers movies? That is all this proposal is missing.


www.wired.com...


Darpa, partnered with George Mason University, announced Tuesday that it is now accepting proposals for the Innovation House Study, a challenge that aims to attract top civilian geeks to attack the problem of efficiently wading through and extracting useful information (such as people, places, things and activities) from massive piles of visual and geospatial data.
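For a rough sense of what "extracting useful information (such as people, places, things and activities)" amounts to, here's a toy sketch. All of the data, field names, and tag kinds are invented for illustration; this is not DARPA's or anyone's actual format:

```python
# Toy entity extraction over tagged records.
# Record structure and tag names are hypothetical, purely for illustration.
records = [
    {"text": "truck at depot", "tags": {"place": "depot", "thing": "truck"}},
    {"text": "J. Doe at depot", "tags": {"person": "J. Doe", "place": "depot"}},
]

def extract(records, kind):
    """Collect every distinct entity of a given kind across the records."""
    return sorted({r["tags"][kind] for r in records if kind in r["tags"]})

print(extract(records, "place"))   # ['depot']
print(extract(records, "person"))  # ['J. Doe']
```

The hard part of the real challenge is presumably getting from raw imagery to tagged records like these in the first place; collating the tags afterward is the easy half.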






posted on Jul, 24 2012 @ 03:29 PM
Yes please, let's help them put the chains on our wrists...

But I suppose there will always be people who place their own financial gain above the well-being of society.



posted on Jul, 24 2012 @ 03:42 PM
This seems to follow the general trend of outsourcing intelligence projects, but seriously? The article emphasizes the cost-effectiveness of the approach, yet money is no object for DARPA.

I also get that this lets them draw from a larger talent pool with a wider array of wild cards, but that still doesn't provide a solid enough reason for entrusting civilians with a project of this nature.

So, what is this really about? A P.R. stunt? Is the stated goal (data collation) a red herring, a problem long since solved?



posted on Jul, 24 2012 @ 03:45 PM

Originally posted by benrl
Yes please, let's help them put the chains on our wrists...

But I suppose there will always be people who place their own financial gain above the well-being of society.


If you know how it works, you also know how to defeat it.



posted on Jul, 24 2012 @ 03:50 PM
I want to know more.

Sounds just like our government today. First they want you to snitch:
see something, report it.

Then everybody is a terrorist.

Now it's "help us sort it out."




Now that I think about it, I should have said SERVICE GUARANTEES CITIZENSHIP. Do you want to know more?



posted on Jul, 24 2012 @ 04:00 PM
Competitions to crowdsource how to mine data about you faster and more efficiently, using fewer human eyes.

DARPA is using its own research on how to motivate humans to do the agency's job. Very cost-effective: leveraging existing projects to carry out new ones, while refining how well those project outcomes hold up in practice.

It must be a dream come true.



posted on Jul, 24 2012 @ 04:30 PM

Originally posted by benrl
Yes please, let's help them put the chains on our wrists...

But I suppose there will always be people who place their own financial gain above the well-being of society.


That's certainly a possibility. I suppose it depends on the quality of informed oversight their work gets.

If you are informed and have a big bat, then the chains might not be chains. How effectively you can police something like DARPA might be an interesting topic.

How well can you police professional mind-hackers?



posted on Jul, 24 2012 @ 04:33 PM

So, what is this really about? A P.R. stunt? Is the stated goal (data collation) a red herring, a problem long since solved?


I suspect so, as the CIA already has the software to abstract vast amounts of OSINT data and represent it in any human-readable form. Two examples are Recorded Future and Palantir.

It wouldn't be cost-effective to hire specialists from outside the agencies. The people with the OSINT and analysis skills for this kind of position are already being hired by private firms paying double the salary being offered by the government.



posted on Jul, 24 2012 @ 05:18 PM
Taking into account what SoE and XeroOne have put forth, is it safe to postulate that the primary aim of this project is to study crowd-sourcing dynamics? To nip the open-source menace in the bud?



posted on Jul, 24 2012 @ 05:25 PM
reply to post by Eidolon23
 


Unlikely, since they can already derive that from the vast amounts of data they've aggregated. In fact, the CIA is so good at studying crowd dynamics that it can predict global events and identify persons who will be of interest.

A more likely explanation is that they want us to believe their capabilities are more limited than they actually are. In other words, too many people are complaining about traffic being intercepted and archived, so the NSA counters by claiming it can't sift through the data anyway.



posted on Jul, 24 2012 @ 06:11 PM
reply to post by XeroOne
 


Oh yeah, huh? Shaaaahp as a tack, ^.

Feel like a bit of a dullard, but I guess I wasn't too far off in my initial call: it is totally a P.R. stunt.



posted on Jul, 24 2012 @ 08:33 PM
I'd be more likely to say that the participation of people, how they participate, and the structure of the project itself is the real experiment.



posted on Jul, 25 2012 @ 08:27 AM

Originally posted by XeroOne
reply to post by Eidolon23
 
A more likely explanation is that they want us to believe their capabilities are more limited than they actually are. In other words, too many people are complaining about traffic being intercepted and archived, so the NSA counters by claiming it can't sift through the data anyway.

Well, I watched a documentary two or three years ago which indicated that the NSA collected close to 5 petabytes of data per month. It also covered the collection points and the types of data collected. I can only imagine how much work it would take for even the aggregated data to be analyzed by multilingual experts, let alone processing the raw data into a more refined aggregated form. So at times they just might be telling the truth. Though it's a shame: for the amount of money they spend every year, they could easily afford to hire a few thousand additional heads to analyze the data that is collected on an ongoing basis.
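For a sense of scale, a back-of-the-envelope calculation. The 5 PB/month figure is from that documentary as reported above; the per-analyst throughput is my own made-up, generous guess, so the result is only a rough order of magnitude:

```python
# Back-of-the-envelope: how many analysts would 5 PB/month require?
# Assumes (guess, not an official figure) 1 GB of reviewed material
# per analyst per working day, ~22 working days per month.
PB = 10**15  # bytes
GB = 10**9   # bytes

monthly_intake = 5 * PB
per_analyst_per_month = 1 * GB * 22

analysts_needed = monthly_intake / per_analyst_per_month
print(f"{analysts_needed:,.0f} analysts")  # 227,273 analysts
```

Even with an absurdly generous review rate, that comes out to hundreds of thousands of analysts, which is why "a few thousand additional heads" would barely dent the raw intake; only aggregation and automated triage make the problem tractable at all.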



posted on Jul, 25 2012 @ 10:55 AM
So there are groups that have lots of data about you.

But having lots of data might not be as useful as people think. Much of that information ends up available to the regular person too. The internet and the library are full of data. What they are not full of is context, or the context is a lie or a partial truth.

This search for context is far more up DARPA's alley than the search for more raw data.

Essentially, a data miner might be able to deduce that I use more feminine hygiene products on a particular day, but it takes a mind and some context to put that information to some other purpose.

Turning data into informative narrative, or reducing informative narrative to deduce data.
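To make the data-versus-context point above concrete, a toy sketch. Every field, rule, and value here is invented for illustration; the point is only that the same raw record yields nothing or a narrative depending entirely on what context surrounds it:

```python
# Toy illustration: a bare datum vs. the same datum plus context.
# All fields and rules are invented; nothing here reflects a real system.
purchase = {"item": "hygiene products", "date": "2012-07-25"}

def infer(purchase, context):
    """Produce a narrative only when context supplies meaning."""
    if context.get("household_size") is None:
        # Data without context: all we can say is that something happened.
        return "a purchase happened"
    return (f"one of {context['household_size']} household members "
            f"bought {purchase['item']}")

print(infer(purchase, {}))                     # a purchase happened
print(infer(purchase, {"household_size": 3}))  # one of 3 household members ...
```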



posted on Jul, 25 2012 @ 02:41 PM

Originally posted by hp1229

Originally posted by XeroOne
reply to post by Eidolon23
 
A more likely explanation is that they want us to believe their capabilities are more limited than they actually are. In other words, too many people are complaining about traffic being intercepted and archived, so the NSA counters by claiming it can't sift through the data anyway.

Well, I watched a documentary two or three years ago which indicated that the NSA collected close to 5 petabytes of data per month. It also covered the collection points and the types of data collected. I can only imagine how much work it would take for even the aggregated data to be analyzed by multilingual experts, let alone processing the raw data into a more refined aggregated form. So at times they just might be telling the truth. Though it's a shame: for the amount of money they spend every year, they could easily afford to hire a few thousand additional heads to analyze the data that is collected on an ongoing basis.


And that problem was solved around 18 months ago by Recorded Future, Maltego, Palantir, and several other 'relational mapping' software vendors, whose OSINT specialists don't even have to sift through much data anymore. As long as the data is aggregated in the right format, the software can do all the interpretation and abstraction. The only problem I can see is the amount of computing resources required to query a massive database.
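For anyone curious what 'relational mapping' boils down to, here's a toy version: entities linked by shared events, queried as a graph. All names and links are invented, and the real vendors' tooling is vastly more elaborate; this is just the core idea of chaining relationships:

```python
from collections import defaultdict

# Toy relational map: invented entities linked by invented events.
links = [
    ("Alice", "attended", "Conference X"),
    ("Bob", "attended", "Conference X"),
    ("Bob", "emailed", "Carol"),
]

# Build an undirected adjacency map from the link triples.
graph = defaultdict(set)
for src, _rel, dst in links:
    graph[src].add(dst)
    graph[dst].add(src)

def connected(a, b):
    """Breadth-first search: is there any chain of links between a and b?"""
    seen, queue = {a}, [a]
    while queue:
        node = queue.pop(0)
        if node == b:
            return True
        for nxt in graph[node] - seen:
            seen.add(nxt)
            queue.append(nxt)
    return False

print(connected("Alice", "Carol"))  # True: Alice -> Conference X -> Bob -> Carol
```

The computing-resources concern mentioned above is real: this brute-force search is fine for a handful of links, but chaining relationships across billions of records is exactly the part that needs serious hardware.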




