linklint can do what you are asking for. Over HTTP it can only see the pages that are linked together. To find orphaned files as well, you can run linklint locally against the document root.
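For example, something like this (a sketch based on the linklint documentation; www.xyz.com is the host from the question below, and /var/www/html stands in for wherever the site's files actually live):

```shell
# Remote scan over HTTP: starts at the site root and follows links,
# so it only finds pages that are reachable by links.
linklint -http -host www.xyz.com /@

# Local scan: reads the files under the document root directly,
# so it can also report orphaned files that nothing links to.
linklint -root /var/www/html -orphan /@
```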

Nathan
On Oct 27, 2003, at 9:30 PM, Dinh Nguyen wrote:

Hi all,

I know that this is probably not the right place to ask this question,
but I am sure there are some experts on this forum who know the
answer.  Here is the question:

I am about to write a program to list (and count) all of the pages
(URLs) in a directory on a website.
Say, given a URL like:
http://www.xyz.com/myfolder
or http://www.xyz.com
there are many directories, and each directory contains a number of
pages.  In this case, say there are five pages located in the myfolder
directory.  I'd like to find those five URLs (links), for
example:
http://www.xyz.com/myfolder/page1.html
http://www.xyz.com/myfolder/design.doc
http://www.xyz.com/myfolder/faqs.html
http://www.xyz.com/myfolder/links.htm
http://www.xyz.com/myfolder/mynews.html

How would I do this? Can you please guide me step by step (or give
me some ideas) to design this program?

Thanks for your help.
Dinh Nguyen
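[Editor's note: the link-following approach can be sketched in Python. This is a minimal illustration, not a full crawler: it assumes the usual case where the server does not expose a directory index, so the only way to discover pages remotely is to parse links out of pages you already have. A real program would fetch each newly discovered URL (e.g. with urllib.request) and repeat until no new links appear; www.xyz.com/myfolder is just the example folder from the question.]

```python
# Sketch: extract the links under a given directory from one page's HTML.
# A full crawler would fetch each discovered page and feed it back in,
# looping until the set of known URLs stops growing.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collects href targets that resolve under a given URL prefix."""
    def __init__(self, page_url, prefix):
        super().__init__()
        self.page_url = page_url  # used to resolve relative links
        self.prefix = prefix      # keep only URLs inside this directory
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    url = urljoin(self.page_url, value)
                    if url.startswith(self.prefix):
                        self.links.add(url)

def links_in_page(html, page_url, prefix):
    collector = LinkCollector(page_url, prefix)
    collector.feed(html)
    return sorted(collector.links)

# Example: two links inside myfolder, one outside link that is filtered out.
sample = '''<a href="page1.html">page 1</a>
<a href="faqs.html">FAQ</a>
<a href="http://other.example/x.html">elsewhere</a>'''
found = links_in_page(sample,
                      "http://www.xyz.com/myfolder/",
                      "http://www.xyz.com/myfolder/")
print(found)
```

Counting the pages is then just len() of the accumulated set once the crawl finishes.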





---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]


