On Tue, Jul 15, 2014 at 4:16 AM, Brian Anderson <[email protected]> wrote:
> Can somebody file an issue described exactly what we should do and cc me?

Nothing. Absolutely nothing.

robots.txt rules do not apply to historical data; if archive.org has
archived something, the introduction of a new Disallow rule will not
remove the contents of a previous scan.

Archive.org therefore has three months in which to make a scan of a
release before that release is marked obsolete with the introduction
of a Disallow directive.
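
To illustrate, the scheme would look something like the following
robots.txt, where /doc/0.10/ is a hypothetical path for an obsolete
release (the actual paths depend on how the site lays out its docs):

```
# Current release docs stay crawlable; nothing to do for them.
# Once a release is superseded, add a Disallow for its path.
User-agent: *
Disallow: /doc/0.10/
```

Crawlers that honor robots.txt simply stop fetching the obsolete tree;
copies already held by archive.org are unaffected.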

This is right and proper. Special-casing a specific user agent is not
the right thing to do. The contents won’t change after the release
anyway, so allowing archive.org to keep rescanning it would be a
complete waste of effort.
_______________________________________________
Rust-dev mailing list
[email protected]
https://mail.mozilla.org/listinfo/rust-dev