Cool!

I'll probably use that if I do a next version of my filter, and if I want to
manipulate the content based on the content type, the host that is requesting
it, etc.  Right now, I'm just using it as the proxy server on my LAN, so that
only those who know a password can surf.  If they can't, the handler prints out
a login form so that they can log in for 30 minutes or whatever.  Eventually,
I'll hook this into a database, so that websites can be inserted, categorized,
and rated.  I'm using IPC::Cache right now to store login data, and everything
is fast.  The user doesn't really notice any performance hit, but I'm worried
about what happens when I have 200 people surfing at once and every request
has to be validated against a database.
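
Roughly, the fast path looks something like this (just a sketch, not my real
handler; the cache namespace, keying by client IP, and the validate_against_db()
helper are placeholders):

    package My::AuthProxy;

    use strict;
    use Apache::Constants qw(OK FORBIDDEN);
    use IPC::Cache;

    # One cache object per child; entries expire after 30 minutes.
    my $cache = IPC::Cache->new( { namespace  => 'proxy-logins',
                                   expires_in => 1800 } );

    sub handler {
        my $r    = shift;
        my $host = $r->connection->remote_ip;

        # Fast path: this client logged in recently, skip the database.
        return OK if $cache->get($host);

        # Slow path: hit the database only on a cache miss.
        if ( validate_against_db($r) ) {
            $cache->set( $host, 1 );    # remember the login for 30 minutes
            return OK;
        }

        # Otherwise refuse and let the login form take over.
        return FORBIDDEN;
    }

    sub validate_against_db {
        my $r = shift;
        # Placeholder: real code would check the submitted credentials
        # against the user table.
        return 0;
    }

    1;

That way the database only gets touched once per login, not once per request,
which is what I'm hoping keeps 200 concurrent users bearable.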

On 20-Jul-2000 Alvar Freude wrote:
> Hi,
> 
>> If you find a way to do it with Apache::Proxy, let the list know.
> 
> I am sure it will work with the example given by Darren.
> 
> Once I've checked it, I think I'll create a small module and release it.
> 
> 
>> One of the major reasons I went this route over something like the examples
>> in the mod_perl book was speed.  Downloading big files using the book's
>> examples was slow, as Apache first gathers the content up into a variable
>> (where you can do your regular expressions or whatever manipulating), then
>> sends it to the browser.  You would need a lot of memory in this situation.
> 
> Yes, but if you use a subroutine which handles the incoming chunks, you
> can pass the file through immediately. See
> http://theoryx5.uwinnipeg.ca/CPAN/data/libwww-perl/lwpcook.html at the
> bottom :)
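
For reference, the chunk-callback recipe from lwpcook looks roughly like this
(a sketch; the URL and chunk size are placeholders):

    use strict;
    use LWP::UserAgent;
    use HTTP::Request;

    my $ua  = LWP::UserAgent->new;
    my $req = HTTP::Request->new( GET => 'http://www.example.com/big-file.iso' );

    # The callback gets each chunk as it comes off the wire; forwarding it
    # straight on keeps memory usage flat instead of buffering the whole file.
    my $res = $ua->request(
        $req,
        sub {
            my ( $chunk, $response, $protocol ) = @_;
            print $chunk;
        },
        4096    # read-size hint, roughly 4 KB per chunk
    );

    die $res->status_line unless $res->is_success;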
  

Regards,

Wim Kerkhoff, Software Engineer
NetMaster Networking Solutions
[EMAIL PROTECTED]
