
The Secretive Company That Might End Privacy as We Know It


posted on Jan, 21 2020 @ 01:40 AM
a reply to: Riffrafter

Thanks for the example.
Some police agencies around the world even employ people called 'super recognizers' who, just by eye, can pick out individuals based on their faces and movements. They are essentially human 'facial recognition' machines who sit at CCTV monitors or review footage to identify people wanted by the authorities.

The Detectives Who Never Forget a Face

I think privacy is going to be completely dead in a few years, but of course many shall justify it with the old favourite excuse of:

"If you're doing nothing wrong then you have nothing to worry about."


But yes, my money is on people complaining about it and doing nothing else.



posted on Jan, 21 2020 @ 05:39 AM

originally posted by: hounddoghowlie
a reply to: DBCowboy

Stop and frisk always put me in mind of profiling, which as an investigative tool I'm not really against.


You are also a little bit racist, aren't you...



posted on Jan, 21 2020 @ 05:52 AM
a reply to: dogstar23

I completely agree with you. It just makes it a lot easier to prosecute unjustly and creates a whole lot more potential for malpractice, as if law enforcement were a god that deserves all the power and never makes mistakes...
Imagine how a technology like this could be used by third-world governments. There are only negative outcomes.
The natural way is karma... leave it to karma..



posted on Jan, 21 2020 @ 08:48 AM
a reply to: dokhas




You are also a little bit racist, aren't you...



You're also a little bit of an iggy moe, aren't you. Did you even read the whole post? You just thought this post was a chance to get some stars by calling someone you don't know a racist...

edit on 21-1-2020 by hounddoghowlie because: (no reason given)



posted on Jan, 21 2020 @ 09:01 AM
a reply to: hounddoghowlie




You're also a little bit of an iggy moe, aren't you.


Iggy Moe?

Ha!



posted on Jan, 21 2020 @ 01:25 PM
Sure it's been said already, but every time I hear about these things in detail, even the law enforcement programs, I hear that they tell the things you've bought in the past weeks, what kind of sex you search for online, what kind of pornography you watch, your sexual orientation, the different people you hate versus get along with, the groceries you buy, the food you eat, the weight that you fluctuate. I mean, it's crazy that they know such intimate things about you at the drop of a hat, and it's just sick, because who wants to be arrested by a bunch of cops standing outside the patrol car with your info page up, giggling about how this guy is queer and likes, I don't know, Teletubby porn? How humiliating to pile that on top of being arrested for an unpaid parking ticket.
edit on 1/21/2020 by AlexandrosTheGreat because: (no reason given)



posted on Jan, 21 2020 @ 02:13 PM
a reply to: AlexandrosTheGreat

Sadly it's all information we willingly put out there on the world wide web. Maybe people should do a few things if they fear such things.

Read those 'terms of service', 'End User License Agreements' and those "This site uses cookies" pop-ups BEFORE clicking 'sign up'.
Can anyone genuinely complain when they agree to terms that clearly state how a company may use the information they post and upload to its site?

Get educated about the tech you use and use it properly. There is more to computers and web browsers etc. than just pressing the on button and typing. If your TV is 'smart' and is capable of listening to you, do you know how to disable such features? Consider buying a 'dumb' TV, maybe.

Learn to leave a small to non-existent digital footprint online. VPNs, proxies, ad-blockers, public DNS etc. exist for a reason.

Don't upload or post anything to the net that you wouldn't do in person, or that you're not willing to own.
If you wouldn't say racist things in public, if you wouldn't wave your dick around in the street, don't do it online. The same applies to anything you wouldn't do with your own face, toe to toe with another human.

Avoid websites that require you to give them your phone number, driver's license or other real-life information.

Only use websites that have an SSL certificate and similar security; a quick way of checking is sketched at the end of this list. (That's Above Top Secret out then, hey?)

Take responsibility for yourself and your actions. Apply common sense to the things you do when using tech.
Don't blame everyone else for what YOU choose to do with what you have.

If you don't like a company and their practices don't use their services.

If something about the laws seems wrong, don't impotently bitch online about it. Go get the laws changed. Write a letter to your MP, Senator or your country's equivalent.

And the most important one of all. DON'T USE THE NET OR 'SMART' TECH TO BEGIN WITH.
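
For the SSL point above, here's a rough sketch in Python (standard library only; 'example.com' is just a placeholder host, not any specific site) of one way to check that a site actually presents a certificate your system trusts before you hand it any details:

import socket
import ssl

def has_valid_certificate(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    # Performs a TLS handshake and lets the default trust store verify the
    # certificate chain and hostname; returns False if anything fails.
    context = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()
                print(f"{host}: certificate valid until {cert['notAfter']}")
                return True
    except (ssl.SSLError, OSError) as err:
        print(f"{host}: no valid certificate ({err})")
        return False

has_valid_certificate("example.com")  # placeholder; point it at the sites you actually use

Your browser does the same check for you (that's the padlock icon), so this is more about knowing what that padlock actually means than replacing it.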


At the end of the day, most of what people can do to us we allow to happen, either willingly or through laziness.
We all have choices. Nobody makes us use the net, nobody makes us own computers, phones and so-called smart devices. Nobody holds a gun to our head and makes us post what we post, upload what we upload, or visit the sites we visit.

I joined ATS because I wanted to. Springer didn't threaten me into joining his site. I am aware of the massive security flaws in ATS' webpages yet despite knowing that here I am. Because I choose to be here.

I know what Facebook does with my data, yet I still use it, which is why I don't bitch about it.
If I had a problem with such things I would do something about it, because, you know, I'm an adult and only I control my life; if anyone else does, it's because I let them.

Then again, I took the time to understand the technology I use. I read the manuals and help files. I know how to protect myself online when I need to, and everything else is of my own doing.
I stand by everything I put on the net.

As is usual with many of my ATS posts (and I'll be happily surprised to get an answer for once from someone on here), I'll ask my favourite never-answered question.

If you don't like this facial recognition technology, if you don't like this Big Brother world we've created, what are YOU going to do about it?

One day I'll get a proper answer to that question. But I don't dare hold my breath waiting for it.

My answer is, and always will be, one involving one's backside and removing it from the seated position.
edit on 21-1-2020 by AtomicKangaroo because: typos

edit on 21-1-2020 by AtomicKangaroo because: (no reason given)



posted on Jan, 21 2020 @ 02:33 PM
Oh yeah. And remember, freedom of speech only applies to public things.
There's no such thing when it comes to private companies and the like, and it is also not a right in every country.

If only I had a dollar for every time I hear a fellow Aussie or non-American go on about their non-existent rights to freedom of speech as if they actually had them, I'd be a rich man.

Amazingly, a lot of people's rights around the world only exist in their heads.



posted on Jan, 21 2020 @ 03:26 PM
a reply to: ArMaP
Dude... this is old news. People have been talking about this for many months on end now. Keep up...



posted on Jan, 21 2020 @ 04:24 PM
a reply to: ArMaP

If anyone has put their # publicly on the internet, or given it to other parties that do the same, they can't complain.



posted on Jan, 21 2020 @ 06:20 PM

originally posted by: IanMoone2
The 'trick' to this AI working is that these images also contain self-identifying information (i.e. 'tag your friend (or yourself!)'). Pictures taken purely in public have a much higher degree of uncertainty in accurately identifying any given subject without that identifying information being present. Traffic cams are an interesting case. It can be argued you can be identified by the license plate... but that would have to be corroborated through DMV records (photo ID) or another means, as people often share/loan vehicles.

From what I think I understand of it, self-identifying information helps, but it's not needed.

The article says that a Clearview AI sales presentation states that in one case a child sexual abuser was caught because he appeared in the mirror in someone else's photo. When looking for someone for whom you have absolutely no clues, any information helps. If the case mentioned above is true then I'm not surprised, as knowing that some guy was at that gym when the photo was taken could give an indication of who the guy was (or wasn't; negative identification may sometimes be helpful).
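
For anyone curious, here's a rough sketch of how generic face matching by embedding similarity tends to work. To be clear, this is not a claim about Clearview's actual system; the names, the 0.6 threshold and the random 'embeddings' are made up for illustration. It shows why the tags matter: the gallery can only give you a name if someone, somewhere, labelled a photo of that face.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Standard cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
    # 'gallery' maps a self-identifying tag (a name) to a face embedding.
    # Returns the best-matching name, or None if nothing clears the threshold.
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name if best_score >= threshold else None), best_score

# Toy data: one tagged face in the gallery, and a 'probe' photo of the same
# person with a little noise added. Without that tagged entry there would be
# nothing to return but an anonymous cluster of similar-looking faces.
rng = np.random.default_rng(0)
gallery = {"tagged_user_42": rng.normal(size=128)}
probe = gallery["tagged_user_42"] + rng.normal(scale=0.1, size=128)
print(identify(probe, gallery))

An untagged photo (like the gym mirror) can still narrow the search, but it only becomes an identification once it is linked to a labelled image somewhere in the gallery.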



posted on Jan, 21 2020 @ 06:37 PM

originally posted by: Maroboduus
a reply to: ArMaP
Dude... this is old news. People have been talking about this for many months on end now. Keep up...

Could you point me to an example?

Thanks in advance.



posted on Jan, 21 2020 @ 07:23 PM
So, we have passed from the idea to the actual realisation. The walls of individual privacy have finally been made malleable and translucent, and your life is now laid bare like an open book. This is something that can't be dis-invented. In fact, it places all social media sites like Facebook into that iconic adjuration by the spider to the fly, and the people, as the flies, stepped right into the web of their own accord and volition.

Social media sites were clearly set up as precursor databases, waiting for the technology to come along that could use them in the way that was always intended right from the start.

Of course, the issue now is that Clearview's technology will be improved upon year after year, as other companies take up the technology and tweak it for their own use. It won't matter if the PR spin of government says it will legislate for who can use such technology; it will become ubiquitous and used by all groups to stealthily profile anyone of interest to them.

Whatever rights you think you have, and no matter what rights you can quote as protecting your individuality and privacy, they no longer have any weight or credibility. The very existence of this kind of technology, its very usage, makes all rights redundant, because what is stated in the public sphere will not be mirrored behind the doors of authoritarian control.

If you are considered a threat, even from the day you are born (due to your parents' affiliations), you will be watched and observed via a tiered system of security. Even if you grow up to be the most compliant and obedient citizen, affiliations past and present will profile your future and determine the kind of doors open to you to make a life for yourself. Freedom of choice has just been taken away from you. You will never know if your successes or failures were really down to you, or to those observing you insidiously and perniciously in the background.

This technology puts everyone on the list simply by affiliation, and it won't matter if you don't, or never have, used social media sites.

You know what, I'm glad I am reaching sixty years old, and that I am coming into the short twilight of my life, because I don't want to be around when this # becomes the norm. The direction we are heading in does not bear thinking about.
edit on 21/1/20 by elysiumfire because: (no reason given)



posted on Jan, 22 2020 @ 08:17 PM
a reply to: AtomicKangaroo


Well stated.

I would also add two other things.

One: the easier you make something to access, the harder it is to protect.

The other is this (and I know it will cause a quano storm)...

If you're not a WANTED CRIMINAL or haven't done something CLEARLY CRIMINAL... then you have ZERO TO FEAR.



Lastly, I say this:

Banning it because someone (be it police, government or business) will misuse it is silly.
Evil will use ANYTHING EFFECTIVE to do their dastardly deeds... that's a fact.

If you fear "police/government misuse" then you ask for stronger protections or ENFORCEMENT of the law.
You don't say "we have to ban it"... because it is EFFECTIVE in catching criminals.

scrounger



posted on Jan, 24 2020 @ 12:07 AM
a reply to: scrounger

This is why I have an issue with the "If you're doing nothing wrong, you have nothing to fear." way of thinking.

It's simply that such systems are open to abuse. People can, will and have used such things to make someone a criminal if it suits them and they're capable.

You have to be 110% sure a service is secure and protected from abuse. Like how Facebook originally was selling people's data, photos etc. to companies; then people were seeing their faces show up on 'meet singles in your area' ads and more, making FB billions in the process.

We need to protect the naive too. Sad fact is much of this kind of thing exists and happens because the majority of the people using these platforms are completely clueless about what they are actually signing up for.

They don't read the ToS, the EULAs etc. Technically it is on them to do so, but the fact is people are 'TL;DR' and don't want to wade through 30 pages of terms to find the single sentence buried in the document that says by clicking 'agree' they've just sold their first-born child.

The majority of folks want simple. Click a button or two and be done. Even reading a single-paragraph ToS is too much work for them.
The companies know this too, and this is why it happens.

We need to close the loopholes. Things should be 'opt in' by default, not 'opt out'. And government and proper regulation are the way to do it.

People complaining to government, actually making the effort, is why you can now easily get a copy of and download all the information companies like Facebook have on you. People might laugh at European nations and places like Australia and how we do things, but we tend to look after our citizens and their rights as 'consumers'.

I put 'consumers' in quotes, as many who use Facebook just don't realise they're not customers, they're not the user, they are the product being sold.
So the only solution is to educate people who don't want to be educated, or let the government take care of it like we pay them to do.

But the government will only do something about it if we tell them it's a problem.
People complain these days about how something sucks, and how the government sucks for not doing its job. You cannot make them see that it's because, instead of complaining officially to the government or the relevant authorities, they just get on Facebook, YouTube, Instagram, ATS and such to whine and bitch.

Government isn't your Facebook friend; they don't subscribe to Paul Joseph Watson's YouTube channel.

If you want them to change the laws or shut down companies that are not in the tax-paying public's best interest, if you want your privacy back, you have to get off your ass and meet them in person, and write emails and letters. Make your feelings very clear to them.
Get others to join you and help. Go to the media.

This is why you hear nothing but SJW, Antifa, Left/Right and 'politically correct' from an overly vocal minority of humans.
Because, as I've said before around here, the SQUEAKY WHEEL GETS THE GREASE.

Get off the net and go direct your anger, frustration or disappointment where it needs to be directed, people.

This tech is a double-edged sword. Misused, it's a potentially bad authoritarian tool and an abuse of people's right to privacy. Used correctly, it could help reduce REAL crime, as long as it is only used to investigate crimes and not to monitor people 24/7 because 'everyone's a potential criminal'.

I mean, there are plenty of stories out there of young NSA/CIA types going through everyday Americans' photos looking for nudes just for sh*ts and giggles. Hardly what one would call using the tech to fight terrorism.

But it happens because we allow it to. That simple. Billions of us, thousands of them.
If we all stood up together and said "Hey, we don't want this technology" and wouldn't take no for an answer, it would not be an issue.
But we won't. The same way we keep electing the same idiots over and over instead of telling the lot of them to piss off.

It just blows my mind that this species will make wonderful scientific tools, then find ways to make life worse instead of better with them.
Why aren't we better than this?

I'll never understand most humans' need to screw each other over instead of getting along and doing right by each other.

Something's gotta change. But it won't until we make it change.

edit on 24-1-2020 by AtomicKangaroo because: (no reason given)



posted on Jan, 24 2020 @ 02:18 AM
a reply to: AtomicKangaroo

I agree with a lot of what you're saying.

But the problem with your whole concept is stated in your line "110 percent sure".

With respect, NOTHING is that certain, and no system ever developed is perfect.

To ask for that, much less expect it, is a fool's errand.

And to demand it before (using this example) the tech is implemented is unrealistic and would prevent it EVER being used for good.

Because in reality (as I previously stated) evil men/women with evil intent will use ANY tech, people, etc. if it helps their goals.

To try to ban its use for everyone because of the misuse potential just prevents the good of its use and doesn't do a damn thing to stop the evil.

I do agree that if good people band together you get better laws, rules and overall outcomes.

But evil is gonna exist, and banning any particular tech (yes, there are some limited exceptions) that does good (as this HAS) because of the potential for abuse is not fair or realistic.

Along with that, the other glaring issue is that this tech/genie is "out of the bottle"...

You ain't putting it back and you ain't gonna stop its use.

The choice (as much as I may wish different) is use it for good, or don't use it and bad/evil people will.

It's as simple as that.

scrounger



posted on Jan, 24 2020 @ 02:21 AM
a reply to: AtomicKangaroo

If I may also add, and sadly agree:

The idea of "privacy" that we enjoyed decades back is gone.

Our demand for quick access and convenience, and dare I say some narcissistic need for attention in the social media world, has caused this downfall.

If any change is to occur, realise that going back to how it was isn't gonna happen...

Taking some of it back can happen, but it will require effort, self-control and, dare I say, the loss of some convenience you now enjoy.

scrounger



posted on Jan, 24 2020 @ 03:46 AM
I'm not concerned with privacy. I'm more concerned that flawed AI will start giving people life sentences for something they didn't do. It will be really plausible...just like AI cars that can cause a fatal "accident".



posted on Jan, 24 2020 @ 04:18 PM
It gets better...read the article.

www.bbc.co.uk...

As physical hardware technology has converged over the decades, software technology converges too. Here we have hardware and software technology that has been trialled and is to be deployed to facially recognise people in the street. The deployment is made out to be beneficial, unintrusive but necessary, and yet, regardless of who you are, criminal or non-criminal, with each of these facial-recognition cameras you will be profiled, and details about you built up and kept on file... forever. Data will never be deleted.

So, as you walk down the street, cameras will no longer just record and show your presence; they will now scan you and look you up on the global databases of social media sites, and from that scan a history of who you are will be built up with a certain amount of accuracy. All affiliations will be cross-referenced, all websites you have ever visited noted. All of this in a matter of minutes. Of course, 'if you've nothing to hide, you have nothing to fear'. Have you?

Well, actually you have everything to be afraid of. If the roll-out of this surveillance technology does not have you concerned, you don't understand the concept of individual privacy and how it relates to freedom and liberty. For a start, privacy is the last wall that resists unwarranted intrusion, yet it was actually the first wall to fall, and without your consent. Clearly, with the roll-out of this technology into the public sphere, it must be believed that the public have now habituated to the idea and accept that security is more important than freedom; that the role of the police has shifted from protector to surveillor, and that this is good.

The implications are staggering! There is nothing benign about this technology and how it will be used across the whole spectrum of society. Yesterday, the film 'Minority Report' was science fiction; today, aspects of its fiction are science fact. It won't be just the security agencies using this tech, but corporations, employment firms, schools, colleges and universities, shops, stores and marketing. The next step will be to mandate cameras into your home, because cameras on the streets will still not be enough to counter terrorism and criminality, or so it will be claimed. This is the road now taken.
We know now for certain where we are heading.

Governments elected as benign managers of countries are now in the process of turning tyrannical, and after that, all they can do to maintain control is turn totalitarian. Always bear in mind that a good country, the greatest, the most free and libertarian, doesn't and won't ever need such technology. So where does the roll-out of it place us?



posted on Jan, 26 2020 @ 12:51 AM

originally posted by: KiwiNite
I'm not concerned with privacy. I'm more concerned that flawed AI will start giving people life sentences for something they didn't do. It will be really plausible...just like AI cars that can cause a fatal "accident".


AI cars are fatal because, in most cases, the human safety feature failed: a.k.a. the 'driver'.

I've seen plenty of YouTube videos where someone has lost control of their 'smart' car and whined about it, despite the car clearly telling them that the 'human' needed to take over and them choosing to ignore it.

Driverless cars are great, but they're not designed for you to be able to ignore the traffic and environment and sit on facebook. Not at this stage at least. You still have to pay attention, at least to your AI 'co-pilot'.

Sure it can fail on its own, but the majority of failures, as is usually the case, have come about due to human error.

Same applies to AI algorithms monitoring humans. They can only do what humans have told them to do and allow them to do.

The technology is really not the problem, it's the humans using the technology and how they use (or misuse) it.
If an AI gets the power to judge and sentence humans, it'll only be because a human allowed it to.
Although I doubt we'll see an AI in such a position anytime soon.

At the moment we'll see AI being used as police, investigating crimes, scouring evidence and facts with a fine-tooth comb, and maybe being used to put together a case. But it won't be put into a judge/jury position for the foreseeable future.
(If only because judges and lawyers won't see anything cutting into their overpaid salaries. lol)

So relax, the AI sky is not falling yet. We have plenty of time to lobby those we pay to run our nations to enact laws and regulations preventing the abuse of such tech.

Of course, that's only if we can make the effort to get off our butts and do so...


