
So if we can ban bots from the page histories, or block them by user agent, we might have a cure. Perhaps we just need to upgrade our MediaWiki software, or find out how other sites running it deal with the same problem of bots crawling page histories.
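For illustration, a minimal robots.txt sketch, assuming the common MediaWiki layout where rendered articles are served from /wiki/ and everything else (including action=history) goes through the script directory /w/ -- those paths are assumptions and would need to match our actual installation:

    # Let crawlers read rendered articles but keep them out of
    # index.php actions (history, diff, edit). Paths assume views
    # under /wiki/ and scripts under /w/; adjust to the real install.
    User-agent: *
    Allow: /wiki/
    Disallow: /w/

Note that Allow is an extension honoured by the major crawlers rather than part of the original robots.txt standard; if article views never go through /w/, the Disallow line alone is enough.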
What about adding the "nofollow" flag to the robots meta tag on the history pages?
Sounds like a good idea. If someone can do that, then great.
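For reference, the tag in question would look like this in the history page's head; noindex is included as well, since the goal is for bots to skip these pages entirely (whether our MediaWiki version already emits this on action=history is something to check):

    <!-- Robots meta tag for history pages: asks crawlers not to
         index the page or follow the links on it. -->
    <meta name="robots" content="noindex,nofollow" />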
Apparently other people have used URL rewriting to keep robots off subsets of the wiki pages:

http://www.gustavus.edu/gts/webservices/2006/08/14/robots-in-the-wiki/
http://codex.gallery2.org/Gallery2:How_to_keep_robots_off_CPU_intensive_pages

Is that an option for our installation?

Alistair
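As a sketch of the URL-rewriting approach those links describe, assuming Apache with mod_rewrite enabled; the user-agent names here are examples only and would need tuning for the bots actually hitting us:

    # Return 403 Forbidden to known crawlers requesting history or
    # diff views, matched on the query string
    # (index.php?title=...&action=history). Bot names are examples.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (Googlebot|Slurp|msnbot) [NC]
    RewriteCond %{QUERY_STRING} action=(history|diff) [NC]
    RewriteRule . - [F,L]

This blocks only the expensive actions while leaving ordinary page views crawlable, which is gentler than banning the bots outright.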