
Wikipedia to Color Code Untrustworthy Text

posted on Sep, 1 2009 @ 11:13 PM
Wired Science News for Your Neurons Wikipedia to Color Code Untrustworthy Text
By Hadley Leggett August 30, 2009 | 8:00 pm | Categories: Tech

Starting this fall, you’ll have a new reason to trust the information you find on Wikipedia: An optional feature called “WikiTrust” will color code every word of the encyclopedia based on the reliability of its author and the length of time it has persisted on the page.

More than 60 million people visit the free, open-access encyclopedia each month, searching for knowledge on 12 million pages in 260 languages. But despite its popularity, Wikipedia has long suffered criticism from those who say it’s not reliable. Because anyone with an internet connection can contribute, the site is subject to vandalism, bias and misinformation. And edits are anonymous, so there’s no easy way to separate credible information from fake content created by vandals.

Now, researchers from the Wiki Lab at the University of California, Santa Cruz have created a system to help users know when to trust Wikipedia—and when to reach for that dusty Encyclopedia Britannica on the shelf. Called WikiTrust, the program assigns a color code to newly edited text using an algorithm that calculates author reputation from the lifespan of their past contributions. It’s based on a simple concept: The longer information persists on the page, the more accurate it’s likely to be.

Text from questionable sources starts out with a bright orange background, while text from trusted authors gets a lighter shade. As more people view and edit the new text, it gradually gains more “trust” and turns from orange to white.
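To make the idea concrete, here is a minimal sketch of the coloring scheme the article describes. This is an illustration only, not the actual WikiTrust algorithm: the trust scale, the update rule, and the specific orange RGB value are assumptions for demonstration.

```python
# Illustrative sketch only -- NOT the real WikiTrust implementation.
# Assumption: trust is a score in [0, 1], where 0 is brand-new text
# from a low-reputation author (bright orange background) and 1 is
# long-surviving text (white background).

def trust_to_color(trust):
    """Map a trust score in [0, 1] to an RGB background color.

    Bright orange (255, 165, 0) fades linearly toward white
    (255, 255, 255) as trust increases.
    """
    t = max(0.0, min(1.0, trust))
    orange = (255, 165, 0)
    white = (255, 255, 255)
    return tuple(round(o + (w - o) * t) for o, w in zip(orange, white))

def update_trust(trust, author_reputation, gain=0.2):
    """Nudge a text fragment's trust upward each time an edit by a
    reputable author leaves the fragment untouched (an implicit
    'vote of confidence', per the article's description).
    """
    return min(1.0, trust + gain * author_reputation)

# New text from an unknown author starts bright orange...
trust = 0.0
print(trust_to_color(trust))  # (255, 165, 0)

# ...and gradually whitens as successive edits by trusted
# authors leave it in place.
for _ in range(5):
    trust = update_trust(trust, author_reputation=0.9)
print(trust_to_color(trust))
```

The linear fade and the fixed `gain` parameter are deliberately simplistic; the point is only that persistence under review, weighted by editor reputation, is what moves text from orange toward white.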


www.wired.com...

Not sure exactly what forum board this topic belongs in. I think it is a rather interesting idea. One of the issues with Wikipedia is the public's ability to edit, and this may help readers distinguish untrustworthy information!



posted on Sep, 2 2009 @ 12:23 AM
Originally posted by Scooby Doo

Text from questionable sources starts out with a bright orange background, while text from trusted authors gets a lighter shade. As more people view and edit the new text, it gradually gains more “trust” and turns from orange to white.


This is interesting. I am sure there will be watchdogs making certain this algorithm is implemented the way they say it is. It would seem fishy if corporate and government agencies all got trustworthy colors from their initial contributions.

The thing is, Wikipedia is not exactly a great tool for propaganda by whomever. But if someone found a way to make propaganda look trustworthy in the reader's eye, propagandists just might give Wikipedia a second glance.



posted on Sep, 5 2009 @ 01:39 PM
Bump. What, is this being suppressed? Or does nobody care? I know timing is everything, but hello?

[edit on 5-9-2009 by wiredamerican]



posted on Sep, 5 2009 @ 02:09 PM
It's really more about the reliability of the text than the reliability of the author. You can see an example at the WikiTrust website:

Main Page (with wikitrust-color enabled)

The differences in color show you how often the text has changed.

Some people want to use such reliability information as reputation for or against contributors. It could be gamed pretty easily if such a direct approach were taken.

[edit on 5-9-2009 by dzonatas]




top topics
 
9

log in

join