
Google faces U.S., German probes on data collection

posted on May, 19 2010 @ 08:33 AM

Google faces U.S., German probes on data collection


www.cnn.com

The cars are supposed to only take photos of the street and collect basic Wi-Fi information, such as the SSIDs and MAC addresses of WiFi routers.

The Wi-Fi data was to be used in Google's location-based services, and Google argued last month that it only collected the same data that was publicly available to anyone walking down the street with a Wi-Fi device. Google insisted that it did not collect any kind of IP or packet data in the course of its Wi-Fi collections.

That turned out to be mostly untrue. The company announced last week that it discovered a "mistake" in the code being used to collect info and that it was, in fact, collecting some information on who was visiting what websites on which Wi-Fi networks.
(visit the link for the full news article)
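
For readers wondering what "basic Wi-Fi information" amounts to in practice: collecting SSIDs and MAC addresses is passive beacon sniffing. A minimal sketch in Python with scapy (the interface name is made up, and this is purely an illustration, not Google's code):

# Minimal sketch: log SSID + BSSID (the router's MAC) from 802.11 beacon frames.
# Assumes scapy is installed and "wlan0mon" is a card already in monitor mode.
from scapy.all import sniff, Dot11Beacon, Dot11Elt

seen = {}

def handle(pkt):
    if pkt.haslayer(Dot11Beacon):
        bssid = pkt.addr3                                     # access point MAC
        ssid = pkt[Dot11Elt].info.decode(errors="replace") or "<hidden>"
        if bssid not in seen:
            seen[bssid] = ssid
            print(bssid, ssid)

# Beacons are broadcast in the clear, which is the sense in which this data is
# "publicly available to anyone walking down the street with a Wi-Fi device".
sniff(iface="wlan0mon", prn=handle, store=False)

No payload ever needs to be touched to get that far.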

Edit: added a tad more of the snippet.

[edit on 2010/5/19 by TLomon]



posted on May, 19 2010 @ 08:33 AM
This blows my mind. There is no way this could have "accidentally" happened. This had to be an intentional act of programming. How many laws were violated? Are the people enforcing those laws actually going to believe this BS? Definitely a story whose further developments I plan to keep an eye on.




posted on May, 19 2010 @ 09:06 AM

Originally posted by TLomon
"Google insisted that it did not collect any kind of IP or packet data in the course of its Wi-Fi collections."


I can't believe a writer who uses two variants of the word "collect" in the same sentence gets paid by somebody. And what the **** kind of expression is "in the course of its collections"?

No wonder the MSM is a walking corpse. Then again, they probably have a team of sleep-deprived, Monsanto-fed minimum-wage slaves in some third-world sweatshop cranking this stuff out, with a few zit-faced "editors" on unpaid internships who know more about Xbox than Strunk and White's style guide.

Just a guess. I know next to nothing about the reporting/media world, but I do know a thing or two about corporate cost-cutting debacles.

Anyway, to the actual topic at hand: Google needs to be stomped HARD for this. I use goog as much as the next man, and I like a lot of things about it. But they have made some serious privacy gaffes in the last few years. Google Earth and Street View are abominations. They belong in the 8th circle of privacy-violation hell (the 9th circle is reserved for Facebook).



posted on May, 19 2010 @ 11:27 AM
At first I'd agree completely: there is no such thing as accidental payload collection, certainly not when you're just scanning for routers broadcasting their presence. Collecting packets (whether TCP/IP, ICMP or UDP) is something else entirely.
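
To make the distinction concrete: in a monitor-mode capture, the two behaviours are literally one frame-type check apart. A rough sketch (Python/scapy, interface name made up, not anything Google has published):

# Sketch of the distinction: management frames (type 0) are survey data,
# data frames (type 2) carry the actual payload. "wlan0mon" is a made-up name.
from scapy.all import sniff, Dot11

def classify(pkt):
    if not pkt.haslayer(Dot11):
        return
    dot11 = pkt[Dot11]
    if dot11.type == 0:        # beacons, probe requests/responses, etc.
        print("survey frame :", dot11.addr3)
    elif dot11.type == 2:      # the traffic itself -- IP/TCP/UDP/ICMP rides in here
        print("payload frame:", len(bytes(pkt)), "bytes")

sniff(iface="wlan0mon", prn=classify, store=False)

Forget (or never write) that type check and everything on the channel gets archived, which is essentially what Google says happened.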

To my disappointment, however, I find myself buying into Google's explanation at least a little. The CNN article links to Google's own blog post admitting the mistake, from which this excerpt:



So how did this happen? Quite simply, it was a mistake. In 2006 an engineer working on an experimental WiFi project wrote a piece of code that sampled all categories of publicly broadcast WiFi data. A year later, when our mobile team started a project to collect basic WiFi network data like SSID information and MAC addresses using Google’s Street View cars, they included that code in their software—although the project leaders did not want, and had no intention of using, payload data.


If there were any sensible explanation for how they 'happened' to collect the data, the above would be it. Pieces of code being inadvertently duplicated is not impossible, even at a company like Google; programmers are human, too. Still, they would have had to be very careless in developing the collection program: they failed to scrutinize imported or rewritten code, and then shipped it with functionality nobody had asked for.

For this to be possible, either the program was so small that no one cared enough to check its functionality and the 'implementation' happened through some incredibly sloppy copy-paste work; or the program (which apparently switched Wi-Fi channels five times a second) shipped its own drivers for the Wi-Fi cards used in Google's cars. If Google actually wrote a driver of sorts for their hardware so they had absolute control of its behaviour, then there is a lot more room for errors. It would be quite strange for a driver to save seemingly arbitrary pieces of payload, but if Google did write one, being a bit strange would be... normal.
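
For scale: the channel hopping itself is trivial, and the roughly 200 ms spent on each channel is the only thing limiting how much payload tags along. A rough sketch using the standard Linux iw tool (interface name made up; Google's actual setup isn't public):

# Rough sketch of hopping channels about five times a second on a monitor-mode card.
# Uses the stock Linux "iw" utility; "wlan0mon" is an assumed interface name.
import subprocess
import time

CHANNELS = [1, 6, 11, 2, 7, 3, 8, 4, 9, 5, 10]   # 2.4 GHz channels, non-adjacent order
DWELL = 0.2                                      # ~200 ms per channel = ~5 hops/second

while True:
    for ch in CHANNELS:
        subprocess.run(["iw", "dev", "wlan0mon", "set", "channel", str(ch)], check=False)
        time.sleep(DWELL)      # whatever the sniffer sees in this window is all it gets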

While collecting such small fragments of payload (with the constant channel switching and a moving car) won't reveal much of anything to Google, I wonder what that original piece of code written in 2006 was intended for. One could accidentally receive an HTTP request within a 200 ms window, but surely that wouldn't be the goal... or would it? *twilight zone theme*

Either way, it's very fishy... very fishy indeed. On a brighter note, at least it got one of the biggest corporate entities to grovel a bit for us, even if only in vain:


The engineering team at Google works hard to earn your trust—and we are acutely aware that we failed badly here. We are profoundly sorry for this error and are determined to learn all the lessons we can from our mistake.

from googlepolicyeurope.blogspot.com...

Thanks for the info!



posted on May, 19 2010 @ 11:32 AM
They can claim it was a mistake all they want, but for that to be true they would have had to do zero checking of the results - which means no real testing was done. Either they have an absolutely horrible programming department ("let's roll out software that hasn't been tested"), or this is a cover story for what was really going on.
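
Even a one-off sanity check over the output would have caught it. Something as simple as this (hypothetical; I'm assuming the captures were written out as plain .pcap files, which is a guess) flags any capture containing data frames where only beacon/management traffic was expected:

# Hypothetical sanity check: fail loudly if a capture contains data frames,
# i.e. anything beyond the management traffic a Wi-Fi survey should record.
# The .pcap file layout is my assumption, not anything Google has described.
import sys
from scapy.all import rdpcap, Dot11

def contains_payload(path):
    for pkt in rdpcap(path):
        if pkt.haslayer(Dot11) and pkt[Dot11].type == 2:   # type 2 = data frame
            return True
    return False

if __name__ == "__main__":
    bad = [p for p in sys.argv[1:] if contains_payload(p)]
    if bad:
        sys.exit("payload frames found in: " + ", ".join(bad))
    print("clean: management traffic only")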

I prefer to give people the benefit of the doubt, but this is an area I am quite familiar with, and the pieces just don't add up.



posted on May, 19 2010 @ 11:39 AM
reply to post by TLomon
 


The results they were interested in were MAC addresses and ESSIDs, and those were stored in a database. As far as checking results goes, they wouldn't necessarily care about anything else; as long as they got what they were looking for, other files could easily be overlooked. As someone familiar with programming, you know that as a project grows, old but functional parts are left as-is. While this is more typical of 'horrible' programming departments, it's simply a way to save time, and not impossible even at a supposed coding giant.
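
To put it another way, the "results" anyone reviewed were probably nothing more than a table shaped roughly like this (column names are my guess, obviously not Google's schema), while the raw capture files sat next to it unopened:

# Illustrative guess at the shape of the reviewed results: one row per access point.
# Nothing here forces anyone to ever open the raw capture files written alongside it.
import sqlite3

db = sqlite3.connect("wifi_survey.db")
db.execute("""
    CREATE TABLE IF NOT EXISTS access_points (
        bssid     TEXT PRIMARY KEY,   -- router MAC address
        essid     TEXT,               -- broadcast network name
        latitude  REAL,
        longitude REAL,
        seen_at   TEXT
    )
""")
db.commit()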

Just a nuance, though... you're right, the pieces don't seem to add up. To be honest, the only way for them to save face would be to disclose the code that was written in 2006 and only put to use just now.



