On Thu, May 03, 2007 at 10:25:45PM +0000, J. Meijer wrote:
> It seems that at least at some point in time google didn't like
> duplicate urls at all. Don't know whether that's still the case,
> though. Put something in robots.txt to avoid having them indexed
> twice?
Robots.txt doesn't allow wildcarded urls, so there's no easy way to do
it that way. But I've set things up so that /Main/WikiSandbox now
returns a permanent redirect to /wiki/Main/WikiSandbox . Google has no
problem with permanent redirects, since the page then officially
appears at only one url, rather than the same content appearing at two
different locations.

Thanks!

Pm
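For anyone curious what such a redirect might look like, here is a
minimal sketch in PHP. It assumes the old /Main/PageName urls are
routed through a PHP entry script, which may not be how pmwiki.org
actually handles it; an equivalent RedirectMatch rule in the Apache
configuration would accomplish the same thing.

    <?php
    // Illustrative sketch only: answer requests for the old
    // /Main/PageName form of a url with a permanent (301) redirect to
    // the canonical /wiki/Main/PageName form, so search engines index
    // the content at exactly one location.
    if (preg_match('!^/Main/([^?]*)!', $_SERVER['REQUEST_URI'], $m)) {
      header('Location: http://' . $_SERVER['HTTP_HOST']
             . '/wiki/Main/' . $m[1],
             true, 301);   // 301 = Moved Permanently
      exit;
    }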