Hi,
You need to specify a parent cache
See: http://squid.visolve.com/squid/squid24s1/neighbour.htm#cache_peer
Bart
[EMAIL PROTECTED] wrote:
Hi
I have this situation:
a browser connects to an antivirus (hbedv) proxy,
which connects to squid,
then squid connects to the internet.
But I'd like browsers to
Hello Jeff,
On the squid page you have a 'scripts' section, and on that page you
will find a tool to clean URLs from the Squid cache.
('third party software')
Bart
Jeff Donovan wrote:
greetings
I am having an issue where some browsers can access a site and others
can't. All browsers can access
Hi,
I have recently installed squid 2.5.STABLE6-20040913 with digest
authentication on a Solaris 9 box. Unfortunately, some browsers do not
support digest authentication but the fallback to basic authentication
does not seem to work. This is what I have in the configuration file:
auth_param dig
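The configuration is cut off above; for reference, a typical squid.conf that offers digest first with basic as a fallback might look like the sketch below. The helper paths and password file names here are assumptions for illustration, not the poster's actual setup:

```
auth_param digest program /usr/local/squid/libexec/digest_pw_auth /usr/local/squid/etc/digest_passwd
auth_param digest realm Squid proxy
auth_param digest children 5
auth_param basic program /usr/local/squid/libexec/ncsa_auth /usr/local/squid/etc/passwd
auth_param basic realm Squid proxy
auth_param basic children 5
acl authed proxy_auth REQUIRED
http_access allow authed
```

Squid advertises all configured schemes in its 407 reply; a browser without digest support is expected to pick basic from that list.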
On Thu, 30 Sep 2004, Hendrik Voigtländer wrote:
how about caching-only bind sitting on the same machines as the
squid-daemons? resolv.conf points to localhost in this setup.
Deploying a local caching nameserver is mentioned in several performance
howtos (after setting up the ipcache
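The resolver side of the setup described above is just one line (assuming the caching bind listens on the loopback address):

```
# /etc/resolv.conf on the squid host, pointing at the local caching nameserver
nameserver 127.0.0.1
```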
Henrik Nordstrom wrote:
On Thu, 30 Sep 2004, John wrote:
I have two squid proxies running on IBM X345 series servers under Red Hat
Enterprise Linux ES Version 3.0. My colleagues manage two QIP DNS servers
and since we introduced the squid proxies the DNS servers seem to be taking
a lot more DNS qu
On Thu, 30 Sep 2004, Lewars, Mitchell (EM, PTL) wrote:
./configure --prefix= --prefix=/opt/squid --enable-snmp --enable-smartfilter
--enable-useragent-log --enable-cache-digests
make
make install
cd /opt/squid/bin/
[EMAIL PROTECTED] bin]# ./RunCache
Don't use the RunCache script.
sbin/squid -k p
On Thu, 30 Sep 2004, Christian Ricardo dos Santos wrote:
Sorry but I am a beginner; I have never heard about this dstdomain.
dstdomain is a domain match to the host component of the requested URL.
url_regex is a regex pattern match to the complete requested URL.
Squid FAQ Chapter 10 Access Controls
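To illustrate the difference (hostnames and patterns here are made-up examples, not from the thread):

```
# dstdomain matches the host part of the URL (and, with a leading dot, its subdomains)
acl blockedhosts dstdomain .example.com
# url_regex matches anywhere in the complete URL, path and query string included
acl blockedwords url_regex -i badword
http_access deny blockedhosts
http_access deny blockedwords
```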
On Thu, 30 Sep 2004, Christian Ricardo dos Santos wrote:
Now if any user types one of the three strings below, access to this
blocked website is granted (anyway you can only read its text
through; all the other links are broken).
www.uol.com.br/telefutura.com.br
www.uol.com.br/?telefutu
On Thu, 30 Sep 2004, John wrote:
I have two squid proxies running on IBM X345 series servers under Red Hat
Enterprise Linux ES Version 3.0. My colleagues manage two QIP DNS servers
and since we introduced the squid proxies the DNS servers seem to be taking
a lot more DNS queries than before. When I
> Sorry but I am a beginner; I have never heard about this dstdomain.
>
> How about this "# sed 's/^/\^/' txtgeneral.txt"? Should I use it instead
> of "http_access allow txtlan general !download"?
No no :)
sed is basically a utility used to manipulate text.
Running this (in a shell):
sed 's/^/\^
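The command is cut off above; the idea is to prefix every line of the pattern file with a literal '^' so each url_regex pattern is anchored at the start of the URL. A short sketch (the file name txtgeneral.txt comes from the quoted message; its contents here are invented):

```shell
# create a sample pattern file (contents made up for illustration)
printf 'uol.com.br\ntelefutura.com.br\n' > txtgeneral.txt
# prepend a literal '^' to every line
sed 's/^/\^/' txtgeneral.txt
```

The output can be redirected to a new file and used as the ACL source, so a pattern like "uol.com.br" no longer matches when it appears later in another site's URL path.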
Sorry but I am a beginner; I have never heard about this dstdomain.
How about this "# sed 's/^/\^/' txtgeneral.txt"? Should I use it instead
of "http_access allow txtlan general !download"?
- Original Message -
From: "Andreas Pettersson" <[EMAIL PROTECTED]>
To: "Christian Ricardo dos Sant
Hi
I have two squid proxies running on IBM X345 series servers under Red Hat
Enterprise Linux ES Version 3.0. My colleagues manage two QIP DNS servers
and since we introduced the squid proxies the DNS servers seem to be taking
a lot more DNS queries than before. When I look at the squid logs, it se
> Thank you for the help...
>
> But this way I need to create one rule for each website permitted to each
> group of users instead of a single rule by group.
>
> Nowadays we are using rules like the one below (two text files, where one
> lists the IP addresses and the other the websites). Right no
Thank you for the help...
But this way I need to create one rule for each website permitted to each
group of users instead of a single rule by group.
Nowadays we are using rules like the one below (two text files, where one
lists the IP addresses and the other the websites). Right now we have 36 g
./configure --prefix= --prefix=/opt/squid --enable-snmp --enable-smartfilter
--enable-useragent-log --enable-cache-digests
make
make install
cd /opt/squid/bin/
[EMAIL PROTECTED] bin]# ./RunCache
Running: squid -sY >> /opt/squid/var/squid.out 2>&1
./RunCache: line 35: 14908 Aborted
Hello,
You could try this ... now this is my first attempt at a regular
expression.
acl badurl url_regex -i .*\.(com|net|biz|org|ca|us|br).*\/.*\.(com|net|biz|org|ca|us|br)
http_access deny badurl
This pattern should match any url string that contains
*.domainsuffix.*/*.domainsuffix
So the r
> Everybody can access the site www.telefutura.com.br, but nobody can
> access the website www.uol.com.br.
>
> Now if any user types one of the three strings below, access to this
> blocked website is granted (anyway you can only read its text
> through; all the other links are broke
On Thu, 30 Sep 2004 15:32:42 -0300
"Christian Ricardo dos Santos" <[EMAIL PROTECTED]> wrote:
> Hello,
>
> I REALLY need some help here.
>
> Nowadays we are using an ACL system to prevent users from accessing
> some websites.
>
> Those users can only access a limited list of sites (around 30); any
> place
We use squidguard and squid and that works well.
But unfortunately clever users found "workarounds".
Example: We successfully block www.xyz.com. But if someone
finds out the IP
of this website (nslookup) and enters this IP, the website is
displayed.
Is there a way to block www.xyz.com AND the as
Thanks Adam, Here is my access.log
1096568145.782 474 121.51.1.207 TCP_MISS/200 65602 GET
http://my.app.net/myapp-bin/myapp? - DIRECT/10.254.2.239 text/html
1096568147.141 94 121.51.1.207 TCP_MISS/404 1282 GET
http://my.app.net/favicon.ico - DIRECT/10.254.2.239 text/html
1096568147.2951
Hello,
I REALLY need some help here.
Nowadays we are using an ACL system to prevent users from accessing some websites.
Those users can only access a limited list of sites (around 30); any
place outside this list is blocked. I don't know how or when, but
somebody discovered a way to cheat those ACLs.
Her
greetings
I am having an issue where some browsers can access a site and others
can't. All browsers can access the site outside squid.
How can I force squid to refresh its cache? Or just refresh the
contents for this URL?
TIA
--j
---
jeff donovan
basd network op
>
> I like this idea :)
> However, blocking all jpegs wouldn't be the best thing to do..
> One way to solve this is to write a redirector that redirects
> all http requests with an url that ends with .jpg to a local
> cgi script. The script fetches the jpg-file, verifies that
> it is harmless
On Thu, 30 Sep 2004 08:36:29 -0700 (PDT)
John Davis <[EMAIL PROTECTED]> wrote:
> Our squid blocks access to everything but approved
> websites. Most of these sites have a domain, (ex:
> www.yahoo.com ) but some places insist on using IP
> addresses instead of domains for the URL, particularly
> ce
On Thu, 30 Sep 2004 14:26:29 +0200
Boniforti Flavio <[EMAIL PROTECTED]> wrote:
>
> Elsen Marc wrote:
>
> > That's a bit circular reasoning, in the sense that initially
> > you reported a possible problem concerning a no-caching
> > situation for all objects.
> > For 2.5.stable6 this was a poss
- Original Message -
From: "Eric Geater 9/01/04" <[EMAIL PROTECTED]>
> A discussion in another group handed a suggestion that Squid could be
> told to block MIME types in HTTP responses, which means that Squid could
> be called in to help with certain problems associated with the GD
On Thu, 30 Sep 2004, John Davis wrote:
How can I add individual IP addresses to the
whitelist like it was a domain?
By allowing it before where you deny.
Regards
Henrik
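In squid.conf terms, "allowing it before where you deny" could look like the sketch below; the addresses and file path are placeholders, not from the thread:

```
# specific IP-based sites first, then the normal whitelist, then deny
acl approved_ips dst 198.51.100.7 203.0.113.25
http_access allow approved_ips
acl approved_sites dstdomain "/usr/local/squid/etc/approved_sites"
http_access allow approved_sites
http_access deny all
```

Order matters here: http_access rules are evaluated top to bottom, so the IP allowances must appear before the final deny.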
On Thu, 30 Sep 2004, Victor Medina wrote:
I am reading about coss file system right now! =)
coss is highly experimental and not exactly production quality..
regards
Henrik
On Thu, 30 Sep 2004, Eric Geater 9/01/04 wrote:
A discussion in another group handed a suggestion that Squid could be
told to block MIME types in HTTP responses
yes, via the rep_mime_type acl in http_reply_access.
Regards
Henrik
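A minimal sketch of the rep_mime_type approach mentioned above, here blocking JPEG replies as was discussed for the GDI+ vulnerability (whether to block image/jpeg at all is a policy choice for the administrator):

```
# deny replies whose Content-Type is image/jpeg
acl jpegreply rep_mime_type -i ^image/jpeg$
http_reply_access deny jpegreply
http_reply_access allow all
```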
On Thu, 30 Sep 2004, Michael Puckett wrote:
then why not simply use the pread() function instead which seems to be a
direct replacement for the POSIX AIO and would halve the number of system
calls made when doing a read?
Mainly because pread does not exist on all systems (added in Unix98).
Regards
Eric Geater 9/01/04 wrote:
> A discussion in another group handed a suggestion that Squid could be
> told to block MIME types in HTTP responses
See the rep_mime_type acl in the default squid.conf.
> which means that Squid could be called in to help with certain problems
> associated with the GDI
Kvetch wrote:
> Hello, I am trying to get squid setup as a reverse proxy. I have
> squid on a separate server from my Apache server. I was able to get
> squid to deliver pages but if I view the access log while trying to
> hit pages, everything gets reported as a MISS and I don't see any
> HITs.
Boniforti Flavio wrote:
> Elsen Marc wrote:
>
>> For 2.5.stable6 this was a possible cause, so the idea is:
>> take the patch for that version and see if that problem persists.
>> Then test again.
>
> You're right, man.
> The fact is, I guess I can't apply that patch, because I'm working with
> a
Has anyone determined which Linux 2.6.x io scheduler (anticipatory,
deadline, cfq) works best with Squid?
I am reading about coss file system right now! =)
OK, I can try both options; I can recompile with coss and aufs/diskd. My
squid hasn't had a crash in 4 months or so, so rebuilding the cache
in case of a crash is not an issue.
One thing is, how do I know how many requests/sec I am serving at a
Our squid blocks access to everything but approved
websites. Most of these sites have a domain, (ex:
www.yahoo.com ) but some places insist on using IP
addresses instead of domains for the URL, particularly
certain government sites. Every time we get a request
for a new ip-addressed site to be adde
On Thu, 30 Sep 2004 [EMAIL PROTECTED] wrote:
Hi, I wish to know how I can block some URLs like private.com or playboy.com by
URL address and also by some keywords: teen, sex, bottom, etc.
domains is blocked by dstdomain acls.
domain patterns by dstdom_regex, but be careful not to block too much..
Also
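A sketch of those two acl types together, with a dst acl added for the raw-IP workaround discussed elsewhere in this thread; the domains, keywords, and address below are placeholders:

```
# block by hostname, by hostname pattern, and by resolved IP
acl blockedsites dstdomain .private.com .playboy.com
acl blockedwords dstdom_regex -i teen sex
acl blockedip dst 203.0.113.10
http_access deny blockedsites
http_access deny blockedwords
http_access deny blockedip
```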
On Thu, 30 Sep 2004, Victor Medina wrote:
I'll have to recompile, but I am willing to =) First, one thing: is
the COSS fs production ready? If not, what about stability? =)
What about aufs?? I am currently using ufs.
If you have more than 30 req/s then looking into aufs/diskd is
recommended. Bu
On Thu, 30 Sep 2004 [EMAIL PROTECTED] wrote:
We use squidguard and squid and that works well.
But unfortunately clever users found "workarounds".
Example: We successfully block www.xyz.com. But if someone finds out the IP
of this website (nslookup) and enters this IP, the website is displayed.
Is th
On Thu, 30 Sep 2004, Costas Zacharopoulos wrote:
Is it possible to have any password encrypted authentication scheme with
squid, without having a passwd file on disk?
ntlm
Can I mix digest authentication with an external helper program?
Please elaborate on what kind of external helper.. there is m
On Thu, 30 Sep 2004, Ampugnani, Fernando wrote:
acl porn url_regex "/usr/local/squid/etc/porn_domains"
acl aggresive url_regex "/usr/local/squid/etc/aggrsive_domains"
acl violence url_regex "/usr/local/squid/etc/violence_domains"
acl audiovideo url_regex "/usr/local/squid/etc/audiovideo_domains
A discussion in another group handed a suggestion that Squid could be
told to block MIME types in HTTP responses, which means that Squid could
be called in to help with certain problems associated with the GDIplus
vulnerability in Microsoft products. We use Squid as our proxy for our
entire networ
On Thu, 30 Sep 2004, Boniforti Flavio wrote:
A lot changed. There is now status code 200, not 304, and the reply size is
different (bigger).
What do these facts mean?
That it was no longer cached in your browser and a full object was
delivered. Before there was only a small "Not changed" indicatio
Henrik Nordstrom wrote:
> On Wed, 29 Sep 2004, Michael Puckett wrote:
>
> > I have been examining both the aufs and ufs versions of squid with truss
> > and have seen that the async i/o version has thousands more calls to
> > lseek than the non async version. On looking at the source of
> > squida
[EMAIL PROTECTED] wrote:
Hi, I wish to know how I can block some URLs like private.com or playboy.com by
URL address and also by some keywords: teen, sex, bottom, etc.
Also I wish to block .mp3, .mpeg, .mov, .avi etc.
Can anyone help me with this?
Check http://j-chkmail.ensmp.fr/surbl/squid-surbl-it
It'
Hello!
Further investigating the performance issues, I found this.
I took two access.logs from two different weeks and counted how many
tcp-hits and how many tcp-misses I had.
In group one, one week's worth of logs, I found this:
10855 TCP_HITS, that's almost 8%
108127 TCP_MISS, that's almost 88
Hello!
I'll have to recompile, but I am willing to =) First, one thing: is
the COSS fs production ready? If not, what about stability? =)
What about aufs?? I am currently using ufs.
Victor.
On Thu, 2004-09-30 at 09:59, Costas Zacharopoulos wrote:
> -BEGIN PGP SIGNED MESSAGE-
> Hash
Hi, I wish to know how I can block some URLs like private.com or playboy.com by
URL address and also by some keywords: teen, sex, bottom, etc.
Also I wish to block .mp3, .mpeg, .mov, .avi etc.
Can anyone help me with this?
Thanks in advance.
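For the file-extension part of the question, a hedged squid.conf sketch (the extension list is taken straight from the message):

```
# block common media extensions by URL path
acl mediafiles urlpath_regex -i \.(mp3|mpeg|mov|avi)$
http_access deny mediafiles
```

urlpath_regex is used rather than url_regex so the pattern only has to match the path component, not the whole URL.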
> We use squidguard and squid and that works well.
>
> But unfortunately clever users found "workarounds".
>
> Example: We successfully block www.xyz.com. But if someone
> finds out the IP
> of this website (nslookup) and enters this IP, the website is
> displayed.
>
> Is there a way to bloc
Hehe! Quite right! But basically I repeated a SuSE setup, which is
fairly similar to this one.
Any suggestion based on my config for a leaner squid? =)
On Thu, 2004-09-30 at 09:52, Henrik Nordstrom wrote:
> On Thu, 30 Sep 2004, Victor Medina wrote:
>
> > SQUID was compiled as:
> > Squid Cache: V
Henrik:
I'm sending what you requested.
acl porn url_regex "/usr/local/squid/etc/porn_domains"
acl aggresive url_regex "/usr/local/squid/etc/aggrsive_domains"
acl violence url_regex "/usr/local/squid/etc/violence_domains"
acl audiovideo url_regex "/usr/local/squid/etc/audiovideo_domains"
acl
On Thu, 30 Sep 2004, Victor Medina wrote:
SQUID was compiled as:
Squid Cache: Version 2.5.STABLE5
configure options: --prefix=/opt/EPAWebCachingSuite-1.0-i686/
--sysconfdir=/opt/EPAWebCachingSuite-1.0-i686/etc/squid --with-dl
--enable-snmp --enable-carp --enable-useragent-log '--enable-auth=basic
We use squidguard and squid and that works well.
But unfortunately clever users found "workarounds".
Example: We successfully block www.xyz.com. But if someone finds out the IP
of this website (nslookup) and enters this ip the website is displayed.
Is there a way to block www.xyz.com AND the ass
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1
Is it possible to have any password encrypted authentication scheme with
squid, without having a passwd file on disk?
Can I mix digest authentication with an external helper program?
How could I combine a helper program with digest?
-BEGIN PGP S
Henrik Nordstrom wrote:
Disagreement on time could be one reason.
mmhhh... May I check this one, too?
enable log_mime_hdrs and there will be more hints. Be warned that this
also logs the authentication credentials so don't post log information
while logging in with a sensitive account..
OK, I'll
On Thu, 30 Sep 2004, Ampugnani, Fernando wrote:
Is it possible that, when I define about 5 external acls to deny about
2 porn domains, the squid process takes all the CPU (90%) every time?
How have you defined the acls?
Regards
Henrik
Hello all!
I have a SQUID server running on a PIII 1.1 GHz machine with 1024 MB RAM and
SCSI disks. We have almost 100 users and a 700 Kbps ADSL connection. The
cache directory is on its own SCSI drive.
I am seeing some very slow response in the proxy server. The incoming
traffic in the internet conne
On Thu, 30 Sep 2004, Boniforti Flavio wrote:
What could prevent my proxy from caching objects fetched from my parent proxy?
Disagreement on time could be one reason.
enable log_mime_hdrs and there will be more hints. Be warned that this
also logs the authentication credentials so don't post log information
whi
Elsen Marc wrote:
Then I would advise going more 'native' and
fetching squid from squid-cache.org.
Configuring, making and installing Squid is not so difficult.
Oh, I know it's not difficult, but I am used to doing everything via APT :-)
--
---
Boniforti Flavio
Provin
>...
> You're right, man.
> The fact is, I guess I can't apply that patch, because I'm
> working with
> a pre-compiled Debian package. :(
Then I would advise going more 'native' and
fetching squid from squid-cache.org.
Configuring, making and installing Squid is not so difficult.
M.
Hi all,
Is it possible that, when I define about 5 external acls to deny about
2 porn domains, the squid process takes all the CPU (90%) every time?
Does this decrease squid performance?
Are there any suggestions for this?
Thanks in advance.
Fernando Ampugnani
EDS Argentina - Software, Storage
Elsen Marc wrote:
That's a bit circular reasoning, in the sense that initially
you reported a possible problem concerning a no-caching
situation for all objects.
For 2.5.stable6 this was a possible cause, so the idea is:
take the patch for that version and see if that problem persists.
Then test
Henrik Nordstrom wrote:
if you run squid-2.5.STABLE6 with the "ufs" cache_dir type then you need
the patch.
If any other version or another cache_dir type then not.
mmhhh.. I guess I can't apply that patch, because I installed squid from
www.backports.org Debian Archive...
Should have been cac
On Thu, 30 Sep 2004, Boniforti Flavio wrote:
http://www.squid-cache.org/Versions/v2/2.5/bugs/#squid-2.5.STABLE6-ufs_no_valid_dir
How may I check whether or not I'm in need of that patch?
if you run squid-2.5.STABLE6 with the "ufs" cache_dir type then you need
the patch.
If any other version or
On Wed, 29 Sep 2004, Michael Puckett wrote:
I have been examining both the aufs and ufs versions of squid with truss
and have seen that the async i/o version has thousands more calls to
lseek than the non async version. On looking at the source of
squidaio_do_read() it does indeed do a lseek() foll
>
> Elsen Marc wrote:
>
> > For stable6 make sure you are adequately patched if the
> > ufs store type is being used, as Henrik suggested:
> >
> >
> http://www.squid-cache.org/Versions/v2/2.5/bugs/#squid-2.5.STABLE6-ufs_no_valid_dir
>
> How may I check whether or not I'm in need of tha
On Thu, 30 Sep 2004, Costas Zacharopoulos wrote:
I have a linux box, and I want to use NTLM authentication to authenticate
users to a remote samba server in the LAN.
Can I use winbindd to do it?
Yes, if this Samba is a domain controller.
Squid FAQ 23.5 How do I use the Winbind authenticators?
http:
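For reference, the winbind route usually means pointing Squid's ntlm scheme at Samba's ntlm_auth helper. The path below is an assumption; Samba 3's winbindd must be running on the squid box and joined to the domain:

```
auth_param ntlm program /usr/local/samba/bin/ntlm_auth --helper-protocol=squid-2.5-ntlmssp
auth_param ntlm children 5
acl authed proxy_auth REQUIRED
http_access allow authed
```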
Rick Whitley wrote:
I have a user that is using GAIM to talk with yahoo messenger. We have
configured the proxy but when she tries to connect she gets the error:
'GAIM Error: Access denied proxy server forbids port 5050 tunnelling'.
If I look at the conf file I can't see where that is being blocke
Henrik Nordstrom wrote:
squid.conf is a good start. Looking in cache.log for errors is also
good. And to make sure you are not bitten by the 2.5.STABLE6 bug
mentioned before.
The default of Squid is to have caching enabled, but certain directives
can be used to disable caching such as the no_c
On Thu, 30 Sep 2004, Costas Zacharopoulos wrote:
Is it possible to have squid encrypted authentication using as authorization
service samba or ldap?
With Samba and MSIE browsers you can use the NTLM authentication scheme.
This never transmits the password over the wire (not even encrypted).
Or if
Elsen Marc wrote:
For stable6 make sure you are adequately patched if the
ufs store type is being used, as Henrik suggested:
http://www.squid-cache.org/Versions/v2/2.5/bugs/#squid-2.5.STABLE6-ufs_no_valid_dir
How may I check whether or not I'm in need of that patch?
Access the object from 2 di
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1
I have a linux box, and I want to use NTLM authentication to authenticate
users to a remote samba server in the LAN.
Can I use winbindd to do it?
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.2.3 (GNU/Linux)
iD8DBQFBW+6Km87SXUGUjPsRAjnRAKCHfkBSPLZ
>
> I had the same issue some time back; we were using
> a DSL connection, and when we tried to use Yahoo mail
> via http it gave us the same error, 999, but when
> used via https there was no error. I guess it is the
> browser -> proxy -> proxy chain which confuses
> Yahoo,
> whereas in many ca
>
>
> thx, the fact is the antivirus proxy scans http on one TCP port, FTP on
> another, and doesn't analyse ssl
>
> so i have to find the config which says:
>
> cachepeer http to 127.0.0.1:8090
> cachepeer ftp to 127.0.0.1:8091
> cachepeer ssl direct_to_internet
>
(btw these syntaxes are invali
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1
Is it possible to have squid encrypted authentication using as authorization
service samba or ldap?
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.2.3 (GNU/Linux)
iD8DBQFBW92wm87SXUGUjPsRAlWGAJ9hcengZeOF0hd1sFl+hKS6Bwi//ACgibUG
i27lqxVyMXFjO96NVHxbm
On Thu, 30 Sep 2004, Costas Zacharopoulos wrote:
Can I combine digest authentication with samba?
Yes, sort of.
What I mean is not to have usernames and passwords to the digest_passwd file,
but to make digest talk directly to a samba server.
No.
But you can use Samba to look up the group memberships
On Thu, 30 Sep 2004 [EMAIL PROTECTED] wrote:
but I'd like browsers to connect to squid,
then squid to connect to the AV proxy,
then the AV proxy to go to the internet.
How do I tell squid to redirect TCP 80 to the AV proxy?
Squid FAQ 4.9 How do I configure Squid forward all requests to another
proxy? http://www.
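The FAQ recipe referenced above boils down to a parent cache_peer plus never_direct; the AV proxy's address and port here are placeholders matching the earlier messages in the thread:

```
# forward all requests to the upstream AV proxy; never fetch directly
cache_peer 127.0.0.1 parent 8090 0 no-query default
never_direct allow all
```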
On 29.09 11:00, Michael Puckett wrote:
> I understand that Squid has been designed and optimized for handling
> large numbers of (relatively) small objects. What I desire to understand
> is if it is possible to configure squid in such a manner as to make it
> reasonably efficient in the handling of
On 29.09 23:06, Roberto wrote:
> My question again: is there any way, if possible through configuration,
> but if necessary through a source code hack, to determine at the
> httpd_accelerated server whether a request came in through ssl or not?
using header_access with the myip acl should help. You
On 29.09 14:23, Rick Whitley wrote:
> I have a user that is using GAIM to talk with yahoo messenger. We have
> configured the proxy but when she tries to connect she gets the error:
> 'GAIM Error: Access denied proxy server forbids port 5050 tunnelling'.
> If I look at the conf file I can't see whe
On 29.09 09:05, [EMAIL PROTECTED] wrote:
> Is there a way to tell squid to accept strange characters like this "_"? I
> have tried to get some pages with this character but squid does not accept
> them.
do you mean hostnames with underscores?
I'd rather advise fixing the hostnames...
--
Matus UHLA
> > On Wed, 29 Sep 2004, Jens Strohschnitter wrote:
> > > sometimes in my messages there will be the following message:
> > >
> > > squid[5230]: WARNING: Disk space over limit: 8 KB > 4 KB
> On Wed, 29 Sep 2004 11:32:47 +0200 (CEST) Henrik Nordstrom <[EMAIL PROTECTED]> wrote:
> > 4 KB?
> >
> > Ho
I had the same issue some time back; we were using
a DSL connection, and when we tried to use Yahoo mail
via http it gave us the same error, 999, but when
used via https there was no error. I guess it is the
browser -> proxy -> proxy chain which confuses
Yahoo, whereas in many cases https is not pro
thx, the fact is the antivirus proxy scans http on one TCP port, FTP on
another, and doesn't analyse ssl
so i have to find the config which says:
cachepeer http to 127.0.0.1:8090
cachepeer ftp to 127.0.0.1:8091
cachepeer ssl direct_to_internet
thx
>
>
>>
>> Hi
>> i have this situation
>> a browser
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1
Can I combine digest authentication with samba?
What I mean is not to have usernames and passwords to the digest_passwd file,
but to make digest talk directly to a samba server.
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.2.3 (GNU/Linux)
iD8DBQFB
>
> Hi
> I have this situation:
> a browser connects to an antivirus (hbedv) proxy,
> which connects to squid,
> then squid connects to the internet.
>
>
> but I'd like browsers to connect to squid,
> then squid to connect to the AV proxy,
> then the AV proxy to go to the internet.
>
> how to tell squi
Hi
I have this situation:
a browser connects to an antivirus (hbedv) proxy,
which connects to squid,
then squid connects to the internet.
But I'd like browsers to connect to squid,
then squid to connect to the AV proxy,
then the AV proxy to go to the internet.
How do I tell squid to redirect TCP 80 to the proxy