
22 May
2007
10:33 a.m.
So if we can ban bots from the page histories, or block the bot user agents, or something like that, then we might have a cure. Perhaps we just need to upgrade our MediaWiki software, or find out how other sites running this software deal with the same issue of bots reading page histories.
The wiki could be configured to use /haskellwiki/index.php?.. URLs for diffs (I believe this can be done by changing $wgScript). Then robots.txt could be changed to

  Disallow: /haskellwiki/index.php

which bans robots from everything except normal pages.
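For concreteness, the change might look something like this in LocalSettings.php. This is only a sketch: $wgScript and $wgArticlePath are the standard MediaWiki settings for the script URL and the article URL pattern, and the /haskellwiki/ prefix is assumed from the URLs above.

```php
<?php
// LocalSettings.php (excerpt) -- a sketch, assuming the wiki lives under /haskellwiki/

// All "action" URLs (diffs, histories, edits) go through index.php explicitly,
// e.g. /haskellwiki/index.php?title=Foo&action=history
$wgScript = "/haskellwiki/index.php";

// Normal page views keep their short article URLs, e.g. /haskellwiki/Foo,
// so they are unaffected by the robots.txt rule below.
$wgArticlePath = "/haskellwiki/$1";
```

With that in place, a robots.txt entry of

  User-agent: *
  Disallow: /haskellwiki/index.php

blocks well-behaved crawlers from every index.php URL (diffs, histories, old revisions) while leaving the short article URLs crawlable.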
Twan