A blog owner can avoid liability for user-generated content that appears on his site unchecked and unmoderated, the High Court has ruled. But fixing the spelling or grammar in users' posts could cost him that protection, it said.
The E-Commerce Directive exempts information society service providers, such as ISPs, web hosts and search engines, from liability for the information they store or pass on to users, as long as they are not involved in its creation or editing and as long as they remove it quickly once notified that it breaks the law.
The case involved a blog post on Labourhome.org by John Gray, which claimed that local political activist Johanna Kaschke was arrested on suspicion of being a member of the Baader-Meinhof terrorist group.
The High Court assessed how far the exemptions for service providers extend. It said that the fact that one area of a site is moderated does not prevent other areas of the same site from qualifying for the exemption from liability.
"There is no reason in principle why the operation of a chat room should be incapable of falling within the definition of the provision of an information society service consisting of the storage of information," said Mr Justice Stadlen in his ruling.
"Thus in principle there is no reason why it should not be an activity intended to be protected by Article 14 of the E-Commerce Directive and eligible for the exclusion of liability conferred by [the law]," he said.
"It is not necessarily a bar to entitlement to the protection conferred by [the law] ... that the provider of an information society service consisting of storage of information is also engaged in an activity on the same website which is either not an information society service or if it is which does not consist of the storage of information."
Struan Robertson, a technology lawyer with Pinsent Masons, the law firm behind OUT-LAW.COM, said that the ruling serves as a reminder of the risks in moderating user-generated content.
"Many sites apply some form of moderation to all user contributions for reasons of quality control, whether that's before or after publication. This ruling just shows how dangerous that is and how narrow the safe harbour may be," he said.
"Even an attempt to filter for profanities or comment spam, if done manually, involves a risk for the publisher. If you want to be sure that you're not liable for what your users say, the judge is basically saying you need to ignore user contributions completely until you get a complaint.
"That's not a new principle," said Robertson, "but it's a warning to site owners about how to interpret it. Some owners may think they have less responsibility for user comments than they really do, and they may wrongly assume that a post-moderation policy is completely safe."