At 12:55 04/03/2002 +0200, Andy Rabagliati wrote:
>> I think that serializing the requests is the best way of doing it
>> I don't agree with your objection to it and it solves another problem.
>
>> I don't see how it can be any slower to serialize.
>
>If two schools are 'surfing' one of the sites may be slow, thus not
>using the full bandwidth available to the fetching machine ? We are not
>in the USA. The 'max-servers' parameter could be upped, though.

Another solution could be to run more than one proxy server at the
same time. You can set up a pool server that serves the requests for all
the others.
For each separate client/school, set up a separate copy of wwwoffle (each
one with its own config file), either on another computer or on another
pair of ports (8080-8081). Configure it to use the pool server in the
"Proxy" section of that config copy:
 <http://*> proxy = URL-of-the-pool-proxy:8080

Feed these servers with their own outgoing requests, and once the pool
server is online, issue a "wwwoffle -c config-file-for-this-client -fetch"
for each of them.
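A per-client config might look like this (a minimal sketch; the pool
hostname, the ports, and the file name are illustrative, not taken from
the setup described above -- check the wwwoffle.conf documentation for
your version):

 # school1.conf -- per-client wwwoffle configuration
 StartUp
 {
  # the "pair of ports" this copy listens on
  http-port     = 8080
  wwwoffle-port = 8081
 }

 Proxy
 {
  # forward every HTTP request to the shared pool proxy
  <http://*> proxy = pool-proxy.example:8080
 }

Then, once the pool is online, each client is fetched with its own config:

 wwwoffle -c school1.conf -fetch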

>It would be nice not pass the client wwwoffle copies of pages it already
>has. This requires keeping a great deal of state on the fetching machine -
>your suggestion being a complete duplicate cache. The bandwidth
>overhead of sending duplicate copies is less onerous, as it is our own
>802.11b transport.

If you keep the cache on the "fetching" proxies, it should solve this
problem. But configure the "original" and "fetching" proxies with the same
purge options, or you could have problems with pages/images no longer
being stored.
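For example, the "Purge" section could be kept identical in both config
files (the option names are from memory of wwwoffle.conf and the values
are illustrative -- verify them against your version's documentation):

 Purge
 {
  # keep pages for the same number of days on both proxies,
  # so the pool never discards what a client still expects
  age = 28
  # cache size limit (MB); also keep this matched
  max-size = 100
 }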


-- 
Marc

