In article <[EMAIL PROTECTED]>,
 "David Waizer" <[EMAIL PROTECTED]> wrote:

> Hello,
> 
> I'm looking for a script (perl, python, sh...) or program (such as wget) 
> that will help me get a list of ALL the links on a website.
> 
> For example, ./magicscript.pl www.yahoo.com would write them out to a 
> file; it would work somewhat like spidering software.

David,
In addition to others' suggestions about Beautiful Soup, you might also 
want to look at the HTMLData module:

http://oregonstate.edu/~barnesc/htmldata/
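
Either one will pull the anchors out of a page for you. As a rough
sketch of the sort of thing you're describing (this assumes Python 3,
the current bs4 API for Beautiful Soup, and only covers a single page
rather than a whole-site crawl; the script and URL names are just
placeholders):

#!/usr/bin/env python
"""Print every link (as an absolute URL) found on a single page."""
import sys
from urllib.request import urlopen
from urllib.parse import urljoin

from bs4 import BeautifulSoup   # pip install beautifulsoup4


def list_links(url):
    """Return absolute URLs for every <a href=...> on the page."""
    html = urlopen(url).read()
    soup = BeautifulSoup(html, "html.parser")
    return [urljoin(url, a["href"]) for a in soup.find_all("a", href=True)]


if __name__ == "__main__":
    for link in list_links(sys.argv[1]):
        print(link)

Run it along the lines of

    python listlinks.py http://www.yahoo.com/ > links.txt

and you get one URL per line in links.txt. Turning that into a real
spider means feeding the collected links back into list_links() while
keeping a set of pages already visited, but the link extraction itself
is the part shown above.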

-- 
Philip
http://NikitaTheSpider.com/
Whole-site HTML validation, link checking and more