All,

Referencing this thread (http://www.haskell.org/pipermail/haskell-cafe/2007-May/025769.html), I have prepared a small update to robots.txt. It doesn't do much, but it should exclude all dynamic pages from being spidered. Please vet it and let me know of any concerns. Note that I'm not really interested in reconfiguring the wiki to have nicer URLs - what is there is what I'm working with right now.

Here's the updated robots.txt (available right now at
http://www.haskell.org/robots2.txt):

 User-agent: *
 Disallow: /haskellwiki/Special:
 Disallow: /haskellwiki/Special%3A
 Disallow: /haskellwiki/?
 Disallow: /haskellwiki/%3f
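
In case it helps with vetting: Disallow lines in robots.txt are plain
prefix matches against the request path, so here's a quick sanity check
in Haskell of which URLs the rules above would and wouldn't block. Just
a sketch - the helper names are made up for illustration, not anything
on the site:

 -- Quick check of the rules above, assuming the usual robots.txt
 -- semantics: a path is blocked when it begins with any Disallow
 -- prefix. Names here are hypothetical, for illustration only.
 import Data.List (isPrefixOf)

 disallowed :: [String]
 disallowed =
   [ "/haskellwiki/Special:"
   , "/haskellwiki/Special%3A"
   , "/haskellwiki/?"
   , "/haskellwiki/%3f"
   ]

 blocked :: String -> Bool
 blocked path = any (`isPrefixOf` path) disallowed

 -- ghci> blocked "/haskellwiki/Special:Recentchanges"   -- True
 -- ghci> blocked "/haskellwiki/?title=Foo&action=edit"  -- True
 -- ghci> blocked "/haskellwiki/Haskell"                 -- False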

If you have any feedback, let me know. Thanks!

Justin