> From: "Dan Muey" <[EMAIL PROTECTED]>
> > All I hear are crickets on the list anybody there today?
> 
> Hey, I ain't no cricket ;-)

I wasn't getting any messages from the list; I think it had to do with the blackout.
Thanks for the input!

>  
> > I'd like to have a simple spider that will look at a URL's directory
> > and simply give me a list of files in that directory.
> > 
> > IE
> > 
> >  my $files  = ????? http://www.monkey.com/bannana/
> > 
> > And have $files be an array reference or something so I could then:
> > 
> >  for(@{$files}) { print "-$_-\n"; }
> > 
> > Or something like that.
> 
> As Wiggins says, it may be impossible. These days you too often see 
> "Directory Listing Denied" :-(
> 
> If you can list the directory you can
>       1) download the http://www.monkey.com/bannana/ page with LWP
>       2) extract the links from it with HTML::LinkExtor (use the base
>          parameter when creating the object so that you get absolute URLs)
>       3) fetch all URLs that start with http://www.monkey.com/bannana/
>          It may be helpful to allow the user of the script (yourself) to
>          specify some filtering, e.g. that you want only .gif and .jpg.
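
In case it helps, here's a rough sketch of those three steps using LWP::UserAgent
and HTML::LinkExtor. The URL and the .gif/.jpg filter are just the examples from
this thread, so adjust to taste:

    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTML::LinkExtor;

    my $base   = 'http://www.monkey.com/bannana/';   # example URL from the thread
    my $filter = qr/\.(?:gif|jpe?g)$/i;              # optional: keep only images

    # 1) download the directory page
    my $ua  = LWP::UserAgent->new;
    my $res = $ua->get($base);
    die "Can't fetch $base: ", $res->status_line, "\n" unless $res->is_success;

    # 2) collect links; passing $base as the second argument to new()
    #    makes HTML::LinkExtor hand back absolute URLs
    my @links;
    my $extor = HTML::LinkExtor->new(
        sub { my ($tag, %attr) = @_; push @links, map { "$_" } values %attr },
        $base,
    );
    $extor->parse($res->decoded_content);

    # 3) keep only URLs under that directory (and matching the filter)
    my %seen;
    my $files = [ grep { index($_, $base) == 0 && /$filter/ && !$seen{$_}++ } @links ];

    for (@{$files}) { print "-$_-\n"; }
    # $ua->mirror($_, ($_ =~ m{([^/]+)$})[0]) for @{$files};  # to actually download them

Note this only sees files the page links to; anything unlinked (and not listed by
the server) stays invisible.
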
> 
> 
> I made a slightly different script for myself. I give it a URL like
>       http://www.monkey.com/bannana/pix001.jpg
> and it tries to download pix001.jpg, pix002.jpg, ...
> 
> Some sites insist on a proper HTTP_REFERER, so I can set that as well. I 
> don't think the script is worth posting here ;-)
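
Since the script itself wasn't posted, here's my own rough guess at what such a
thing might look like; the Referer handling and the file-name increment are just
how I'd do it, not necessarily how Jenda's version works:

    use strict;
    use warnings;
    use LWP::UserAgent;

    # start from e.g. http://www.monkey.com/bannana/pix001.jpg and keep
    # bumping the number until a request fails
    my $url     = shift or die "Usage: $0 <url-of-first-file> [referer]\n";
    my $referer = shift;    # some sites insist on a proper Referer

    my $ua = LWP::UserAgent->new;

    while (1) {
        my ($name) = $url =~ m{([^/]+)$};    # file name part of the URL
        my $res = $ua->get($url, $referer ? (Referer => $referer) : ());
        last unless $res->is_success;

        open my $fh, '>', $name or die "Can't write $name: $!\n";
        binmode $fh;
        print {$fh} $res->decoded_content(charset => 'none');  # raw bytes, no charset mangling
        close $fh;
        print "saved $name\n";

        # pix001.jpg -> pix002.jpg; stop when there is no number left to bump
        last unless $url =~ s/(\d+)(\.\w+)$/sprintf("%0*d", length($1), $1 + 1) . $2/e;
    }
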
> 
> Jenda
> ===== [EMAIL PROTECTED] === http://Jenda.Krynicky.cz =====
> When it comes to wine, women and song, wizards are allowed 
> to get drunk and croon as much as they like.
>       -- Terry Pratchett in Sourcery
> 
> 
