https://bugzilla.wikimedia.org/show_bug.cgi?id=61132

--- Comment #3 from Tim Landscheidt <t...@tim-landscheidt.de> ---
(In reply to comment #2)
> Why would the first be a WONTFIX?

Because there are tools that are linked from every wiki page, and any spider
accessing them brings the house down.  Since tools are created and updated
without any review by admins, and wiki edits are not monitored either,
blacklisting them after the meltdown doesn't work.

So unlimited spider access is not possible.
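
Concretely, that means the default has to be something like (a sketch only,
not necessarily what we serve):

  User-agent: *
  Disallow: /

which then raises the question of how to open up just the entry page again.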

> For the second see the docs,

Unfortunately, there is no specification for robots.txt; that's the core of the
problem.

> Allow: /$

> is supposed to work (at least with Google).

According to [[de:Robots Exclusion Standard]], it works with Googlebot, Yahoo!
Slurp and msnbot.  And what about the other spiders?  Will they read it in the
same way, or as "/"?  And how do we whitelist "/?Rules"?
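
For illustration, the whitelist we are talking about would look roughly like
this (a sketch; "$" is a non-standard extension, so a spider that doesn't
implement it may read the first Allow line as "Allow: /", treat it as the
literal path "/$", or ignore it):

  User-agent: *
  Allow: /$
  Allow: /?Rules
  Disallow: /

Whether the precedence between Allow and Disallow lines is even the same
across spiders is yet another open question.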
