We've played around with robots.txt, but it's still useful for old docs
to be indexed (e.g., for removed features); we just need to figure out
how to mark them as outdated in search results. I wonder if <link
rel="canonical"> in the old docs would help.
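For reference, a minimal sketch of what that could look like in the
<head> of an archived page; the specific paths here are illustrative,
not the actual URL layout:

```html
<!-- In a hypothetical archived page such as /3.1/distutils/index.html,
     point search engines at the current version of the same page. -->
<link rel="canonical" href="https://docs.python.org/3/distutils/index.html">
```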

On Sat, Dec 19, 2015, at 11:02, A.M. Kuchling wrote:
> On Sat, Dec 19, 2015 at 08:55:26PM +1000, Nick Coghlan wrote:
> > Even once the new docs are in place, getting them to the top of search
> > of results ahead of archived material that may be years out of date is
> > likely to still be a challenge - for example, even considering just
> > the legacy distutils docs, the "3.1" and "2" docs appear ...
> 
> We probably need to update https://docs.python.org/robots.txt, which
> currently contains:
> 
> # Prevent development and old documentation from showing up in search results.
> User-agent: *
> # Disallow: /dev
> Disallow: /release
> 
> The intent was to allow the latest version of the docs to be crawled.
> Unfortunately, with the current hierarchy we'd have to disallow each
> version, e.g.
> 
> Disallow: /2.6/*
> Disallow: /3.0/*
> Disallow: /3.1/*
> 
> And we'd need to update it for each new major release.
> 
> --amk
> _______________________________________________
> Python-Dev mailing list
> Python-Dev@python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/benjamin%40python.org
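
Maintaining those per-version Disallow lines by hand for every release
does seem error-prone; one option would be to generate robots.txt from
a list of archived versions. A minimal sketch, assuming an illustrative
version list (not the actual set of archived releases):

```python
# Hypothetical helper: build the robots.txt body from a list of
# archived doc versions. The version list below is an assumption
# for illustration only.
OLD_VERSIONS = ["2.6", "2.7", "3.0", "3.1", "3.2"]

def robots_txt(old_versions):
    lines = [
        "# Prevent development and old documentation from showing up",
        "# in search results.",
        "User-agent: *",
        "Disallow: /dev",
    ]
    # One Disallow rule per archived version, matching the current
    # per-version hierarchy (e.g. /3.1/*).
    lines += ["Disallow: /%s/*" % v for v in old_versions]
    return "\n".join(lines) + "\n"

print(robots_txt(OLD_VERSIONS))
```

This would still need a hook to rerun on each new major release, but it
keeps the rule list in one place instead of editing robots.txt by hand.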