
Facebook's social algorithms discourage exposure to different viewpoints


posted on Jun, 14 2014 @ 03:51 AM
I am not sure if any of you knew this or not, but Facebook uses social algorithms to ensure that most of the content you see on their website is something you agree with or want to see. In essence, it keeps you isolated in your own bubble and gives you the illusion that the way you think is the way the vast majority of everyone thinks.

I got this information personally from Facebook developers and marketers I have talked to over the past few years. There are hints of it online if anyone wishes to search and check it out - I encourage that. I was able to find a Wikipedia source!

I think this is highly important. A recent study reported by the A.P. shows that in the past few years, the number of people agreeing completely with one political philosophy or the other has doubled, while the number of people maintaining a centrist or mixed viewpoint has sharply decreased.

In addition, people who are isolated in culture bubbles may be entirely unaware of different cultures even in their own country - especially in America, where there is a gigantic chasm growing between conservatives and liberals. Someone might hear a piece of news and mistakenly think that the person involved is eccentric, when in fact that person is 100% in line with their own culture.

I think it is imperative that steps are taken to increase everyone's exposure to all kinds of different viewpoints, in order to increase tolerance, acceptance, and especially negotiating skills. How is it possible to negotiate if both parties think the other party is insane?

At any rate, I am not sure if I am able to find any proof of this algorithm on the net; it is only known to me through my social network at the moment. However, I did find a reference to it in passing in a very interesting Wikipedia article linked below.



posted on Jun, 14 2014 @ 04:01 AM
a reply to: darkbake

Okay, here it is on Wikipedia - a filter bubble.


A filter bubble is a result state in which a website algorithm selectively guesses what information a user would like to see based on information about the user (such as location, past click behaviour and search history) and, as a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. Prime examples are Google's personalized search results and Facebook's personalized news stream.


Wikipedia: Filter Bubble
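
To make the mechanism in that definition concrete, here is a minimal sketch of a feed ranker that scores items purely by overlap with past click behaviour. It is a hypothetical illustration, not Facebook's or Google's actual code, and the topic labels, item structure, and scoring are invented for the example:

```python
from collections import Counter

def click_profile(past_clicks):
    """Build a crude interest profile: topic -> number of past clicks."""
    return Counter(topic for item in past_clicks for topic in item["topics"])

def rank_feed(candidates, profile, feed_size=10):
    """Keep only the candidates that best match what the user already clicks on.

    Items on topics the user has never clicked score zero and almost
    never surface - the 'bubble' effect in miniature.
    """
    def score(item):
        return sum(profile[topic] for topic in item["topics"])
    return sorted(candidates, key=score, reverse=True)[:feed_size]

# A user who has only ever clicked one political viewpoint:
past_clicks = [{"topics": ["politics/left"]}] * 20
candidates = [
    {"title": "Op-ed you already agree with", "topics": ["politics/left"]},
    {"title": "Op-ed from the other side", "topics": ["politics/right"]},
]
print(rank_feed(candidates, click_profile(past_clicks), feed_size=1))
# Only the agreeable item makes the feed.
```

Notice that nothing is explicitly blocked: the dissenting item simply scores zero and is crowded out, which is exactly the quiet isolation the definition describes.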

And here is why it can be dangerous.


The filter bubble concept is similar to a phenomenon in which people and organisations seek information which is initially perceived as relevant but which turns out in fact to be useless or only partially useful, and avoid information perceived as irrelevant but which turns out to be useful.


So it causes people's perceptions of reality to be wrong - which means, in essence, that they start making bad decisions and incorrect judgement calls about which facts are relevant and which are not.



posted on Jun, 14 2014 @ 04:06 AM
I have heard this before, Darkbake, so whether you find a link or not... the story is floating around out there somewhere.

It's not surprising, as from what I have seen, most adults create these little "agreement bubbles" around themselves IRL more often than not. It always amazes me how many people do not care to ever hear, see, or even discuss issues that they are unfamiliar with, do not like, or just plain don't like the sound of.

I grew up as "the devil's advocate," and still spend a lot of time in that role even now. In doing so, I have learned many things I would not have otherwise. There are only a few things in my life that I will not budge my stance on... there are many, many more that I have changed my mind on, and others that I continually change my mind on even now at 37.

To only surround yourself with "Yes Men" makes folks like Justin Bieber, IMO. Nobody ever challenges your views or your mind (mom always said if you don't use it, you'll lose it, and I tend to agree). You miss out on learning so many different things from so many different people. I don't understand why folks choose to only be around those like themselves.

But many do. I would probably say that a majority do. I can only assume that it is a comfort thing for most. I like to learn, and I like to see how others view things... even if I don't agree. I like to know why they do the things they do, act the way they do, think the way they do, etc.

When you stop learning, you might as well just start pushing up daisies. Folks need to get out of their comfort zones and live a little.





posted on Jun, 14 2014 @ 04:09 AM
a reply to: darkbake


How is it possible to negotiate if both parties think the other party is insane?

pmsl
it's not possible
(divide & conquer)

 

...ever seen the inner lining of Zuckerberg's jacket?



posted on Jun, 14 2014 @ 04:10 AM
Well, what do you expect? FB won't engage users or make money by providing them with content they don't want to see. And why would anyone really want FB to suggest things they don't like? No one really wants that, so FB won't offer it. I don't really use FB, so I can't give much of an opinion on this, but I do often use YouTube, and I know they do a very similar thing. YouTube often shows me videos related to my subscriptions or videos I have watched recently, and it works well some of the time - but quite frankly, most of the videos they suggest are just a load of crap. In fact, YouTube seems to do a decent job of showing me things I don't agree with or don't like.

For example, if I just watched a video about some conspiracy, they'll suggest a bunch of videos that purport to debunk the video I just watched. If I watch a video about some famous person I like, YouTube will suggest a bunch of videos where people rant about how much they don't like that famous person. I guess they don't have a solid way to determine which videos have a "negative" or "positive" spin on a particular topic; they just look for matching keywords. Honestly, sometimes it does feel like propaganda, like they just want to crap all over everything I like and everything I believe. But in some cases it has helped me to see alternative points of view, so I guess you have a semi-legitimate point.
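
That keyword-matching guess is easy to illustrate. Here is a minimal sketch - a hypothetical suggester with an invented catalog, not YouTube's actual ranking - showing why matching on shared title keywords surfaces debunkings and rants right alongside the videos you like:

```python
def suggest_related(watched_title, catalog, top_n=3):
    """Rank catalog titles by how many keywords they share with the watched video.

    Keyword overlap is blind to sentiment, so a 'debunked' title matches
    just as strongly as a supportive one - which is why opposing takes
    show up in the suggestions.
    """
    watched = set(watched_title.lower().split())
    def overlap(title):
        return len(watched & set(title.lower().split()))
    return sorted(catalog, key=overlap, reverse=True)[:top_n]

catalog = [
    "moon landing conspiracy explained",
    "moon landing conspiracy debunked",  # opposite spin, same keywords
    "cute cat compilation",
]
print(suggest_related("moon landing conspiracy", catalog, top_n=2))
# Both the supporting and the debunking video rank at the top.
```

A real system would be far more elaborate, but the blindness to spin is the same in kind: shared vocabulary, not shared viewpoint, drives the match.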



posted on Jun, 14 2014 @ 04:19 AM
I'm glad they do. The other side notoriously makes me sick.



posted on Jun, 14 2014 @ 04:23 AM
Could this be part of the whole personalized advertising trend that is going on - a bit of a blend and mixed use in the build-up of profiling data?

If it is the case that big brother is getting better at sifting and filtering the information that does reach us, that has its good and bad points. With so much information going around, skipping past a lot that does not interest me makes better use of the time I have. How "information that does not interest me" gets defined is the concern. Just because something may question my beliefs, challenge my understanding, or even make me upset or angry does not mean my life will be any better for its being censored.

For some people with closed minds and anger issues, is it better to just let sleeping dogs lie? The who, how, and why behind this growing technology sounds just as complex as the technology itself.



posted on Jun, 14 2014 @ 05:23 AM
a reply to: kwakakev


The who, how, and why behind this growing technology sounds just as complex as the technology itself.


That is kind of what I am getting at. As a psychology major, I think about how this kind of technology could potentially lead us all into lives where our own thoughts are magnified until each of us is in a very specialized comfort zone - it could lead to interesting situations.

It could also increase social anxiety when dealing with people of unknown social background, things like that -



posted on Jun, 14 2014 @ 05:26 AM
a reply to: ChaoticOrder

I did expect Facebook to use that kind of algorithm, actually - it makes good business sense. I think there could be interesting implications as things like this are expanded on.

Chaotic, I think that is really interesting about your experiences on YouTube. I usually find my thoughts are amplified when I am on there, like some kind of exploration of my subconscious - so I find it interesting that YouTube is trying to deprogram your views.


Thanks for stopping in, Chaotic.



posted on Jun, 14 2014 @ 05:35 AM
It's one more method of keeping the public under big brother's control... can't have people realising there's a world out there and that their beloved leaders are feeding them a load of bovine fecal matter about how the world really is. Keep the people ignorant and apathetic - easier to control them. Not surprised in the least that Farcebook is doing this... would be surprised if they didn't. After all, it's a tool designed to keep the population under control and monitored.



posted on Jun, 14 2014 @ 05:56 AM
a reply to: darkbake


How is it possible to negotiate if both parties think the other party is insane?


Reminds me of several topics on ATS.

Also, in many cases one party agrees on a topic while critical questions from the other party get completely ignored or downplayed, even though the first party is wrong.

That's the way of the internet, and of society as a whole - you get fooled into believing, unless you keep an objective view of both sides.



posted on Jun, 14 2014 @ 06:39 AM

originally posted by: darkbake
I am not sure if any of you knew this or not, but Facebook uses social algorithms to ensure that most of the content you see on their website is something you agree with or want to see. In essence, it keeps you isolated in your own bubble and gives you the illusion that the way you think is the way the vast majority of everyone thinks.


They are just helping users to see what they want to see. It's a commercial website.

If you don't like it, don't use Facebook.



posted on Jun, 14 2014 @ 08:09 AM
a reply to: darkbake

Information is a very powerful force; it can help put people on the moon or start revolutions. It is very much a cornerstone of our ability to communicate and work together - or against each other, as the case may be. In terms of state security, news editors long held a very important position in filtering what information made it into the general public arena, but since the rise of the internet that position has been waning.

Due to the power of information, there are strong desires to control and direct it - some types of information more so than others. How the official stance on 9/11 has held despite all the subsequent investigation and inquiry is a testament to the power of information distribution, and a reason to fear it. How these trends of information focusing and direction are building has a strong relationship with issues of privacy and security. It is something not just for psychologists to consider, but also for lawyers, programmers, educators, and media and corporate officials, as society at large contends with the premise: garbage in, garbage out.

Due to the resilience built into the foundations of the internet, all is not lost even as the mainstream providers hold the nice and safe official line. It is hard to say where things will end up in time. "Use it or lose it" is one approach nature uses to sort things out.



posted on Jun, 17 2014 @ 06:01 AM

originally posted by: Mianeye
a reply to: darkbake


How is it possible to negotiate if both parties think the other party is insane?


Reminds me of several topics on ATS.



It does to me as well - I don't suppose this phenomenon is limited to Facebook. What I am wondering in particular is whether internet usage can actually make people less aware of other perspectives -

And then whether this makes it harder for people to relate in real life, where everyone is likely to have a different perspective and to be in close proximity.


