Hi
On Thu, 14 Mar 2002 03:08:21 -0700, Sean M Burke (SMB) said:
SMB> I'm a bit perplexed over whether the current Perl library
SMB> WWW::RobotRules implements a certain part of the Robots Exclusion
SMB> Standard correctly. So forgive me if this seems a simple
SMB> question, but my reading of th
> So:
>
> I was looking at a robots.txt file and it had a series of disallow
> instructions for various user agents, and then at the bottom was a full
> disallow:
[...]
> Wouldn't this just disallow everyone from everything?
No, it would disallow everyone but a ... d (with the specified rules).
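Purely to illustrate the record-selection rule (the agent names below are
made up): a robot obeys the record whose User-agent line names it, and the
"User-agent: *" record is only a fall-back for robots not named in any
other record. A minimal sketch, assuming two records:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Illustration only -- the agent names are invented.  Each record's
    # Disallow list applies to the robots its User-agent line names;
    # "*" is the fall-back for any robot not named elsewhere.
    my %records = (
        'AgentA' => ['/private'],   # a specific record near the top
        '*'      => ['/'],          # the full disallow at the bottom
    );

    for my $agent ('AgentA', 'SomeOtherBot') {
        my $rules = exists $records{$agent} ? $records{$agent} : $records{'*'};
        print "$agent follows: @$rules\n";   # only SomeOtherBot gets "/"
    }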
> OK, then is there a way to create an internal wildcard?
>
> User-agent: *
> Disallow: /*/97
> Disallow: /*/98
No, not in the current specification (draft).
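A quick sketch of why not, under the draft's literal-prefix matching (the
paths below are made up): a "*" inside a Disallow value is an ordinary
character, so /*/97 only matches paths that literally begin with "/*/97".

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Sketch of the draft's rule: a Disallow value is compared against
    # the URL path as a literal prefix, so "*" carries no wildcard
    # meaning here.
    my @disallow = ('/*/97', '/*/98');

    sub excluded {
        my ($path) = @_;
        for my $rule (@disallow) {
            return 1 if index($path, $rule) == 0;   # literal prefix only
        }
        return 0;
    }

    for my $path ('/archive/97/index.html', '/*/97/index.html') {
        print "$path => ", excluded($path) ? "excluded\n" : "allowed\n";
    }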
Regards, Martin
--
Sent through GMX FreeMail - http://www.gmx.net
> Jonathan Knoll:
> >> User-agent: *
> >> Disallow: /cgi-bin
> >> Disallow: /site
>
> Klaus Johannes Rusch:
> > /cgi-bin/test.cgi
> > /siteindex.html
> > would be excluded.
>
> But what about these paths (in the same root dir):
>
> /foo/cgi-bin/test.cgi
> /bar/user1/cgi-bin/test.sgi
> /ba
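For what it's worth, here is the same literal-prefix test run against the
rules and the two deeper paths quoted in full above; under the draft's
matching as I read it, those deeper paths do not begin with /cgi-bin or
/site and so would not be excluded.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # The literal-prefix test from the draft, applied to the quoted
    # rules and the two fully quoted deeper paths.
    my @disallow = ('/cgi-bin', '/site');

    for my $path ('/cgi-bin/test.cgi', '/siteindex.html',
                  '/foo/cgi-bin/test.cgi', '/bar/user1/cgi-bin/test.sgi') {
        my $hit = grep { index($path, $_) == 0 } @disallow;
        printf "%-28s %s\n", $path, $hit ? 'excluded' : 'allowed';
    }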