MoveOn wouldn't be possible without the open Internet. But according to our own Eli Pariser, the Internet itself is changing. His book on the topic, The Filter Bubble, comes out this week (you can check it out here).
In March, the TED conference invited him to preview the argument. When I talked to Eli beforehand, he was really nervous—in the audience were top executives from Google, Facebook, Microsoft, and a number of other companies he critiques. But his call for an open, ethical Internet—he actually called out the Google founders and Bill Gates in the audience by name—got a standing ovation. And it's been burning up the TED website ever since.
We're sharing it today because we think it's a really important point. Increasingly, the Internet is hiding things from us, and we don't even know it. Take a moment to watch Eli's TED talk today:
"You spend half your life in Internet space, but trust me—you don't understand how it works. Eli Pariser's book is a masterpiece of both investigation and interpretation; he exposes the way we're sent down particular information tunnels, and he explains how we might once again find ourselves in a broad public square of ideas. This couldn't be a more interesting book; it casts an illuminating light on so many of our daily encounters."
If you're interested in the book, you can check it out by clicking on the image to the left or this link—all of Eli's profits from this email will go to MoveOn.org Civic Action.
It's not a given that the Internet will remain fertile ground for democracy. We need to make sure it does, and Eli's argument is an important part of that fight.
Source and rest of article and book link: front.moveon.org...
It has recently become widely known that Google is censoring search results displayed to their Chinese language users (Google.cn) to conceal the existence of sites that the Chinese government considers objectionable.
It is much less widely known that the major search engines also censor access by their English language users to many web sites. They all have internal organizations responsible for executing the censoring policies of their company. We can be confident that none of these organizations is called the “Censoring Division”. The people in these organizations may also sincerely believe that what they are doing is in the best interests of their users and that every single site that they block is “doing something wrong”.
The problem with Internet censoring is the same as any other form of censoring. As history has repeatedly demonstrated, once you start censoring it is very hard to stop. It is always possible to rationalize that people would be better off if they didn't have access to certain information.
Search engines (especially Yahoo Search) say they censor access to improve the “quality” of the “user’s experience”. Just like the Chinese government, search engines claim they are censoring to protect their users. In neither case, however, is the user given a choice. There is no message on the search results page that says: “We have excluded results from sites that we consider to be of low quality or otherwise objectionable; click here to repeat the search with the censored results included.”
Nobody ever banned or burned a book that they thought other people should be allowed to read.
All the majors (Google, Yahoo Search, MSN Search) admit on their "webmaster guidelines" pages to censoring access by their users to sites that employ “deceptive practices” to unfairly increase their search engine exposure. Because, like the Chinese government, search engines take the position that any site they have deleted “has done something wrong”, they prefer the terms “banning” or “penalization” to “censoring”. Functionally, however, there is no difference. A site that has been censored (or “banned”, “penalized”, “blackballed”, "blacklisted", “de-listed”, or “removed from our index”) cannot be found no matter how relevant its pages are to a search. Censored sites are manually removed from, and barred from, a search engine’s index on a site-by-site basis. (Search engines can also use site-unique bias to suppress access to individual web sites.) None of the majors admit to any censoring or banning on pages likely to be seen by their users (people doing searches).
Note carefully that there is a difference between outright censoring, in which all (or sometimes all but one) of a site’s pages are removed from the index, and a “rank” problem, in which a site merely becomes less likely to be found. It is easy to determine whether a site has been deleted. (See Is Your Site Banned?) It is much harder to detect even gross bias in a search engine’s ranking algorithm or depth-of-crawl algorithm.
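The usual quick check for deletion is a “site:” query, which asks an engine to list only the pages it has indexed for one domain; zero results for an established site suggests outright removal rather than a rank problem. Here is a minimal sketch in Python (the engine URL and domain below are placeholders, not a specific API endorsement):

```python
# Build a "site:" search URL to check how many pages of a domain an
# engine has indexed. The engine base URL and domain are examples only.
from urllib.parse import urlencode

def site_query_url(engine_base: str, domain: str) -> str:
    """Return a search URL restricted to a single domain via site: syntax."""
    return engine_base + "?" + urlencode({"q": "site:" + domain})

print(site_query_url("https://www.google.com/search", "example.com"))
```

A banned site typically returns zero (or one) results for such a query, while a site with a mere ranking problem still shows its pages when queried this way.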
Search engines do delete sites for using deceptive practices, but they also de-list sites for “inconvenient practices” and may remove sites for competitive or other editorial reasons. Generally speaking, small sites (fewer than 100 pages hosted on a domain name) are not banned except for deceptive practices. It is also possible for any site to fail to appear in a search engine because of technical site-configuration issues, and a very small, insignificant site could conceivably be missed by a major search engine, especially if it had very few incoming links from other sites.
People pick a search engine based on the perceived comprehensiveness of search results (ability to find relevant pages), freshness (how often the search engine visits pages and updates its index to reflect new information), and quality of results (probability that a given result page is useful as opposed to "spam"). A poll conducted by Search Engine Honesty indicates that the last factor is the most important for 60 percent of users. It is therefore no surprise that search engines are trying hard to improve the average quality of their results by suppressing spam sites. They can easily hide suppression of competition and editorial bias in their anti-spam program.
However, nearly 60 percent of poll respondents said that search engines should provide the option of seeing censored results and 90 percent said that search users should at least be advised that some sites had been intentionally deleted.
Deceptive practices involve features of a web site designed to “trick” search engines. Such practices are designed to take advantage of weaknesses in a search engine's system in order to get an unfair advantage in search engine exposure. The following is a list of common deceptive practices:
-Using invisible text (same color as the background) to feed different text to the search engine from that seen by a viewer; using tiny type at the bottom of a page for the same purpose; “stuffing” keywords in “ALT” or “Keywords” tags (usually not seen by viewers); many other similar techniques.
-Programming a web server to detect when it has been accessed by a search engine’s spider and feeding the robot different information than would be received by a viewer ("cloaking").
-Use of multiple “doorway pages” that are each designed to be optimum for a particular search engine.
-Use of “excessive” cross-linking.
-“Link farms” that are for the express purpose of gaming “link popularity”; links in locations or on pages that would seldom or never be seen by human visitors.
-Buying links.
-Duplication of data such as hosting the same site on multiple domain names. (See The Redundancy Explosion.)
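To make the cloaking item above concrete, here is a minimal sketch (a hypothetical handler with invented bot signatures, not any real site's code) of how a server can feed a search engine's spider different text than a human visitor sees:

```python
# Sketch of "cloaking": serve keyword-stuffed text to known crawler
# User-Agents while human visitors get the normal page. The signatures
# and page text are illustrative assumptions.
CRAWLER_SIGNATURES = ("Googlebot", "Slurp", "msnbot")

def serve_page(user_agent: str) -> str:
    """Return different content depending on who appears to be asking."""
    if any(sig in user_agent for sig in CRAWLER_SIGNATURES):
        # Version fed to the spider: stuffed with target keywords.
        return "cheap widgets best widgets buy widgets widgets on sale"
    # Version a human visitor actually sees.
    return "<h1>Welcome to our widget shop</h1>"

print(serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(serve_page("Mozilla/5.0 (Windows NT 10.0) Firefox/124.0"))
```

This is exactly the mismatch between spider view and visitor view that search engines say they ban sites for.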
Deceptive practices are typically aimed at increasing the search-results rank a site would have for particular keywords relative to a "legitimate" site on the same subject. The problem is made more difficult by the fact that search engines are reluctant to define "legitimate" in any detail; "deceptive" is therefore a gray area. A second goal may be to increase site traffic generally by using hidden keywords for popular but off-topic subjects. This can be useful if the site displays pay-per-impression advertising or advertises a subject of very general interest (e.g. Coca-Cola).
Search engines may be willing to describe the particular deceptive practice that triggered censoring if a webmaster requests it. In addition, Google is reported to be testing a system whereby webmasters of sites censored for a deceptive practice would be notified by email that their site has been censored and told the reason for the action. All the major engines have procedures whereby webmasters who notice that their site has been censored, determine the nature of the deceptive practice, and fix it can apply for reinstatement.
Our impression is that sanctions for deceptive practices are more or less fairly applied. A large-business, Fortune 500 website engaging in deceptive practices will likely be censored as well as a minor site. A major car manufacturer’s site was recently temporarily censored by Google, apparently for using doorway pages. Reinstatement (Google's term is "reinclusion") is likely to be much slower for a small-business site. Google is widely reported to have "punishment" or "timeout" periods preceding site reinstatement.
Inconvenient practices involve features of web sites which, while not deceptive, nonetheless represent a problem for a search engine. More specifically, the automated, software driven processing at the engine does not handle these features in a way that is satisfactory for the search engine’s management. It is easier to manually delete thousands of entire sites than fix the problems with the software. Notice that in this case the site isn’t “doing anything wrong”; the problem is actually at the search engine. If the search engine design were changed such that another feature became a problem, then sites having that feature would be censored. Sites that have been operating for five years or more have been suddenly banned by search engines. (See Case Studies.)
Censoring for inconvenient practices is applied much less fairly. Banning of large-business sites for inconvenient, competitive, or editorial reasons is rarely, if ever, done. If Google censored Amazon for convenience, competitive, or editorial reasons, there would be hell to pay: suits would be filed, Congressional investigations would be held, PR campaigns would be executed. A small-business owner doesn’t have these advantages.
If your site has been censored for an inconvenient practice, it may be very difficult to determine which aspect of the site is causing the problem. Search engines are understandably very reluctant to disclose, especially in writing, that they have suppressed access to an entire site for their own convenience. They are even more reluctant to disclose that a site has been suppressed for criteria that are conspicuously not being applied to other sites.
Here are some potentially inconvenient practices and features:
-Large number of pages – Sites with a large number of pages may be a problem for some search engines. If the engine indexes the entire site, a large amount of search engine resources (disk space, bandwidth) could be consumed by a site that might not be very important. Normally, we would expect the depth-of-crawl algorithm to handle this by indexing a relatively smaller number of pages in sites receiving relatively less traffic or otherwise having less merit. There is increasing evidence that the major search engines do indeed ban small-business sites merely for having a large number of pages. It is also true that all medium and large sites have (by definition) a “large number of pages."
-Links. Sites that have a large number of outgoing links, such as directories or sites with a large “links” page, may tend to upset the link-popularity scheme for some search engines. Sites that have forums, message boards, guestbooks, blogs, or other features that allow users to publish a link may also be seen as interfering with the link-popularity concept. Google’s PageRank link-popularity algorithm is less susceptible to these problems because a page’s vote is divided among its outgoing links (so many outgoing links dilute the weight passed to each target) while incoming links raise a page’s score. Some site owners claim that their sites have been censored merely for having a links page.
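As a rough illustration of the link-popularity idea, here is a toy PageRank sketch (a deliberate simplification, not Google's actual implementation): each page's score is split among its outgoing links, so a "hub" with many outgoing links passes only a fraction of its weight to each target, while a page that many others link to accumulates score.

```python
# Toy PageRank: iteratively redistribute each page's score over its
# outgoing links. The graph below is an invented example.
def pagerank(links, iterations=50, d=0.85):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - d) / len(pages) for p in pages}
        for p, outs in links.items():
            if outs:
                share = d * rank[p] / len(outs)  # vote divided by out-degree
                for q in outs:
                    new[q] += share
            else:
                # Dangling page (no outgoing links): spread its vote evenly.
                for q in pages:
                    new[q] += d * rank[p] / len(pages)
        rank = new
    return rank

# "hub" links out to three pages; "popular" is linked to by everyone else.
graph = {"hub": ["a", "b", "popular"], "a": ["popular"],
         "b": ["popular"], "popular": []}
scores = pagerank(graph)
```

Under this scheme, "popular" ends up with the highest score because every other page points at it, while "hub" loses nothing for linking out; its vote is merely diluted across its three targets.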
Google says in a form email sent to web sites that inquire why they have been banned: “Certain actions such as buying or selling links to increase a site’s PageRank value or cloaking - writing text in such a way that it can be seen by search engines but not by users - can result in penalization.” The largest single buyer of links is probably Amazon, which has one of the most successful affiliate programs. Needless to say, Google has not “penalized” Amazon: Google reports (3/06) indexing 144 million pages at Amazon.com. Yahoo sells links from its directory; Google reports indexing 233 million pages at Yahoo.com. Google runs a copy of the Open Directory on its own site (directory.google.com) and reports (3/06) indexing 12.4 million pages in its own directory.
Censoring for Competition Suppression and Editorial Reasons
Directories and other collections of links compete directly with search engines. Our case studies do in fact suggest that search engines censor small-business directory sites in order to suppress competition. Check the Case Studies and decide for yourself. Search engines also engage in many other business activities, such as selling goods and providing email, message-board, video, photo, and mapping services. As long as it is legal to do so, it is unreasonable to expect that they would not suppress competing small-business sites in search results. Suppressing larger small-business sites that compete with a search engine, or with a business partner of a search engine, is also likely. Censoring a single small-business competitor would certainly have no effect on the bottom line of a major search engine; censoring thousands of such sites obviously would.
There is currently no convincing evidence that any of the major search engines ban small sites (less than 100 pages) for editorial or competitive reasons. There are plenty of small "Google sucks" sites out there.
The National Security Argument
There is a National Security Argument that goes something like this: “We are punishing you, but we can’t say why, or give you an opportunity to defend yourself, because doing so might disclose information that could be used by the enemy.” Search engine people use a version of this argument to justify their refusal to disclose why a particular site has been censored. The idea is that a spammer might have found and exploited a previously undisclosed weakness in a search engine; if the engine disclosed the reason a site was banned, it might confirm that the deceptive technique works, and the spammer might spread the word or use the technique on another site.
This argument may have had some validity ten years ago, but today it is ridiculous. Search engines have been around for a long time (by Internet standards), search technology is well developed, and abuse methods are now well known; you just read about most of them. In the unlikely event that a spammer has found a new weakness, there are other ways to measure the effectiveness of the method.
A much more plausible explanation is that search engines want to conceal the fact that the site is being banned for a practice that others are being allowed to continue (buying links, duplication of data, directories, links pages, guestbooks, message boards, etc.) or that the site is being censored for competitive or arbitrary editorial reasons. Search engines are using the "national security" argument to conceal their own unfair practices.
Google has announced a plan to notify some webmasters of censored sites (presumably the ones that have been banned for deceptive practices) that their site has been censored and the reason for the action. The notification will be automatic and not at the request of the webmaster. Our understanding is that Google will generally continue to refuse to disclose the reason for censoring to webmasters that do request such notification. This allows Google to disclose that a site has been censored for a deceptive practice while continuing to deny that it is censoring other sites for reasons other than deceptive practices.