Andy Rabagliati <[EMAIL PROTECTED]> writes:

> > Well, from what I remember of sitescooper, wouldn't it require yet
> > another interpreter?
> It needs perl. It leverages off plucker code to generate the DB -
> though historically plucker had a perl transcoder too.

Well, we all make mistakes!  ;-)  Seriously, it would mean non-Unix
users having to download both of them, wouldn't it?  I'm assuming
they're not commonly installed on those platforms yet.  In that case,
I think it would be a bad thing.

> -fetch is wget, sitescooper, any number of others.

Now, using wget is an interesting idea, although it might require
quite a bit of scratch disk space to get a site.
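To make that concrete, the fetch step could look something like this
(a rough sketch in Python driving wget; the flag choices and the
scratch-directory layout are my assumptions, not anything plucker or
sitescooper does today):

  import subprocess
  import tempfile

  def fetch_site(url, max_depth=2):
      # Scratch directory for the mirrored pages - this is where the
      # disk space goes.
      scratch = tempfile.mkdtemp(prefix="plucker-fetch-")
      subprocess.call([
          "wget",
          "--recursive",                  # follow links within the site
          "--level", str(max_depth),      # cap the recursion depth
          "--page-requisites",            # grab the images/CSS pages need
          "--convert-links",              # make the local copy browsable
          "--directory-prefix", scratch,  # keep everything under scratch
          url,
      ])
      return scratch

Capping --level (or adding --quota) would at least keep the scratch
space from growing without bound on a big site.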

> -convert we do need. The intermediate form should be a file (files ?)
> with links relativised.

This is interesting.  If it was just encoding all files it found, it
would open the possibility of including multiple sites in one plucker
db.  I'm not sure if such a thing is possible or desirable, though.
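Just to sketch what "links relativised" might mean for the intermediate
form (Python again; the function name, and the assumption that the
fetch step has already left the pages on disk, are mine - a real
converter would want a proper HTML parser rather than a regex):

  import os
  import re
  from urlparse import urlparse

  def relativise(html, page_url, site_root):
      # Rewrite href/src attributes that point inside the fetched site
      # so they become paths relative to this page; external links are
      # left untouched.
      def fix(match):
          attr, target = match.group(1), match.group(2)
          if not target.startswith(site_root):
              return match.group(0)
          page_dir = os.path.dirname(urlparse(page_url).path) or "/"
          rel = os.path.relpath(urlparse(target).path, page_dir)
          return '%s="%s"' % (attr, rel)
      return re.sub(r'(href|src)="([^"]+)"', fix, html)

With each tree self-contained like that, dropping several sites' trees
under one directory and handing the lot to the DB builder would at
least be mechanically possible - whether the result is usable on the
device is another question.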

-- 
MJR

Do you need advice about the Internet or particular net services?  Why
not talk to my employers?  See http://www.luminas.co.uk/ for details.
