I'm interested in this as a layperson - I don't code HTML, but say I wanted to put up a website and keep its contents from the prying eyes of all search engines, like RATS does - how does RATS do this, and how would I or anyone else do it?
Malicious bots, email-harvesting bots, or any bot, for example, can ignore the directives above and crawl the pages anyway if they are programmed to do so.
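For reference, the kind of robots.txt directives being discussed look something like this - a minimal sketch, where the file sits at the site root and the values shown are illustrative:

```
# robots.txt placed at the site root, e.g. http://example.com/robots.txt
# Asks all crawlers that honor the convention to stay out of the whole site:
User-agent: *
Disallow: /
```

As the post says, this is only a request: well-behaved crawlers like Google's honor it, but nothing in the protocol forces a bot to comply.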
Hope that answered your question.
edit on 3-8-2014 by _BoneZ_ because: (no reason given)
To keep it from prying eyes you'll need to password-protect it, as robots.txt and the like are more of a gentleman's agreement than actual rules. You can try to block bots by various methods, but if they want access, they'll get it.
And if you really don't want something on the net, you shouldn't put it anywhere near the net in the first place.
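One common way to do the password protection mentioned above is HTTP basic auth in Apache - a sketch, assuming Apache with overrides allowed for the directory; the file paths and username are illustrative:

```
# .htaccess in the directory you want to protect
AuthType Basic
AuthName "Private area"
# Password file path is illustrative; create it first with:
#   htpasswd -c /etc/apache2/.htpasswd someuser
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

Unlike robots.txt, this actually refuses the request (401) unless the visitor supplies valid credentials, so crawlers can't read the pages at all.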
You can also set up Apache to block traffic with no user agent, which is common for bots. The techniques mentioned by _BoneZ_ are more common, but I'd say ATS has their own secret special way we don't know about.
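Blocking empty-user-agent traffic in Apache can be done with mod_rewrite - a sketch, assuming mod_rewrite is enabled; adapt it to your vhost or .htaccess:

```
# Return 403 Forbidden to any request that sends an empty User-Agent header.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule .* - [F,L]
```

Note this only catches lazy bots; anything can send a fake browser user agent and sail right through.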
ETA: redundant. Nevermind.
edit on 4-8-2014 by RifRAAF because: (no reason given)