Folks,

  Let me first say what a pleasure it is to find this program.

  We are trying to set up a system where an offline network, perhaps
  in a school, can use the wwwoffle proxy.

  However, the school is *always* offline, with connectivity supplied
  via UUCP. The UUCP transport connects on a daily basis, picks up
  Email (easy ..) and WWW requests. For the curious, you can see
  why we use this arrangement at http://www.wizzy.org.za/ .

  Currently I tar up the outgoing directory and pass it back to
  another (well-connected ..) server, also running wwwoffle. I untar
  the outgoing requests into that server's spool tree, and then run
  'wwwoffle -fetch' on that machine.

  I then run find over the /var/spool/wwwoffle/http tree looking for
  new files, tar those up, and pass them back to the school's LAN
  server, again via UUCP.
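  As a sketch, the round trip on the well-connected server is roughly
  the following (the paths, the tar invocations, and the stamp-file
  trick for defining "new" are my guesses at the mechanics, not
  wwwoffle features):

```shell
#!/bin/sh
# Sketch of the current single-school round trip.  $spool is the
# wwwoffle spool (e.g. /var/spool/wwwoffle); $reqtar arrives over
# UUCP, $outtar goes back the same way.
roundtrip() {
    spool=$1
    reqtar=$2
    outtar=$3

    tar -xf "$reqtar" -C "$spool"       # refill outgoing/ with requests
    touch "$spool/.stamp"               # mark the moment before fetching
    wwwoffle -fetch                     # drain the outgoing queue

    # Everything under http/ newer than the stamp is (roughly) what
    # this fetch added; tar it up for the UUCP leg back.
    ( cd "$spool" && find http -type f -newer .stamp -print \
        | tar -cf - -T - ) > "$outtar"
}
```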

  This works fine for a single school.

  However, we need it to run for multiple schools.

  Now the find is no longer reliable: a recently changed file may not
  be destined for that particular school, and a file newly requested
  by one school may already be sitting in the cache, and so look 'old'
  to find.

  Ideally, I would like to work out, from the outgoing/* requests,
  which files in the http/* tree those requests correspond to, and
  tar up just those for transport back.
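  Something along these lines, where url_to_cachefile is the piece I
  am missing — a hypothetical helper that maps a URL to its hashed
  cache filename the way wwwoffle does internally (that each outgoing
  request has a U* file holding the bare URL is also my assumption):

```shell
#!/bin/sh
# pack_school_replies SPOOL OUTTAR:
#   read each queued request under $SPOOL/outgoing, map its URL to
#   the cache file it produced, and tar just those files for the
#   UUCP leg back to that school.
pack_school_replies() {
    spool=$1
    outtar=$2
    list=$(mktemp)

    for u in "$spool"/outgoing/U*; do
        [ -e "$u" ] || continue                 # glob matched nothing
        # url_to_cachefile (hypothetical) prints the cache path,
        # relative to $spool, for the URL stored in the U* file.
        url_to_cachefile "$(cat "$u")" >> "$list"
    done

    # Archive only the cache files this school's requests touched.
    tar -cf "$outtar" -C "$spool" -T "$list"
    rm -f "$list"
}
```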

  A pointer in the right direction and I could put together a shell
  or perl script to do what I need.

  Suggestions?

Cheers,   Andy!
