Re: [squid-users] authentication

2006-07-27 Thread Visolve Squid

Paul wrote:


Hi, I have configured my Squid with proxy_auth and all the computers
which use the internet go through this proxy (of course each user needs to
enter a login and password), but I have one machine on which it is not
possible to enter a password. Does anyone know how I can make an exception
for one user? Is it possible to bypass the Squid authentication for a
single user or IP address?


Hello Paul,

Yes. You can bypass the Squid authentication for one IP address by using 
the following ACL configuration in the squid.conf file.


auth_param basic program /usr/local/squid/libexec/ncsa_auth 
/usr/local/squid/etc/passwd

acl auth_users proxy_auth REQUIRED
acl restricted src /usr/local/squid/iplist
acl allow_user src 172.16.1.27
http_access allow allow_user
http_access allow auth_users restricted
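One thing worth noting about the snippet above: Squid evaluates http_access rules top to bottom and stops at the first matching rule, so the allow line for the unauthenticated host must appear before any proxy_auth rule. A minimal sketch of the ordering (the IP address is just an example):

```
# evaluated top to bottom; the first matching rule wins
acl noauth_host src 172.16.1.27
http_access allow noauth_host        # this host is never challenged
acl auth_users proxy_auth REQUIRED
http_access allow auth_users         # everyone else must authenticate
http_access deny all
```

If the proxy_auth rule came first, the exempt machine would still receive an authentication challenge.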

--
Thanks,
Visolve Squid Team,
http://squid.visolve.com


Re: [squid-users] Any Slackers running on current version with squid current version just a survey

2006-07-27 Thread Marco Berizzi
SSCR Internet Admin wrote:


 Hi,

Ciao

 I would like to know if there are any slackers running on the current
 version with the latest version of squid.  I haven't touched slackware

I'm running Slackware 10.2 and squid 2.6-STABLE1
Here is my squid build script:

CFLAGS=-O2 ./configure \
  --prefix=/usr \
  --sysconfdir=/etc/squid \
  --localstatedir=/var/spool/squid \
  --libexecdir=/usr/libexec/squid \
  --datadir=/usr/share/squid \
  --enable-removal-policies=heap \
  --enable-delay-pools \
  --enable-useragent-log \
  --disable-wccp \
  --disable-wccpv2 \
  --enable-ssl \
  --enable-default-err-language=Italian \
  --enable-err-languages="English Italian" \
  --enable-epoll \
  --enable-http-violations \
  --disable-ident-lookups \
  --enable-auth="basic ntlm" \
  --enable-basic-auth-helpers=MSNT \
  --enable-ntlm-auth-helpers=SMB \
  i486-slackware-linux

CFLAGS=-O2 make all



[squid-users] (111) connection refused ERROR FOR SITES REQUIRING LOGIN

2006-07-27 Thread vinayan K P

Hello,

Hope someone could help me.

I am using a squid proxy (squid-2.5.STABLE13-1.FC4) behind another
squid proxy and firewall.

My squid.conf is below.

##
cache_dir ufs /var/spool/squid 100 16 256

cache_log   /var/log/squid/cache.log
cache_access_log /var/log/squid/access.log
cache_store_log /var/log/squid/store.log
cache_swap_log  /var/log/squid/swap.log
logfile_rotate  10


cache_replacement_policy GDSF

acl all src 0.0.0.0/0.0.0.0
#http_access deny all

acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255

acl SSL_ports port 443 563
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

http_access allow manager localhost
http_access allow manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_reply_access allow all
icp_access allow all

acl cdelan src 192.168.0.1-192.168.0.254/255.255.255.0
http_access  allow cdelan

cache_peer proxy.duix.org parent 3128 0 no-query
prefer_direct off



Systems which use this particular system as a proxy can browse every
website except those that use port 443, e.g. www.google.com,
www.hotmail.com, www.yahoo.com.

Following is the error message I get when I try to access www.gmail.com

=
ERROR
The requested URL Could not be retrieved
While trying to retrieve the URL : www.google.com:443
The following error was encountered
 * Connection to 66.102.7.104 Failed
System returned
  (111) connection refused

The remote host or the network may be down.  Please try again
==
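(As it turns out later in this thread, the fix was to stop Squid from ever connecting out directly, since the firewall only permits traffic via the parent proxy. A sketch of the relevant squid.conf lines, reusing the cache_peer already shown above:)

```
cache_peer proxy.duix.org parent 3128 0 no-query
never_direct allow all    # route every request via the parent, never direct
```

Without never_direct, Squid may try to open the CONNECT tunnel itself, which the firewall refuses with (111) connection refused.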




Re: [squid-users] CentOS 4.3 and Squid Version

2006-07-27 Thread Peter Albrecht
Hi Domingos and Brad,

[...]

   So, it could be considered an enterprise linux distro (without 
 commercial support, that is), updating packages and fixing stuff only 
 when RedHat does the same on RHEL. This means that all policies 
 regarding this distro (packaging, updating, bug fixes and security bug 
 fixes) take into account that the product is targeted at an enterprise 
 client (that's the most important phrase in my message).
 
   For a long time, enterprise clients asked for an enterprise linux 
 (and there were none). RedHat was the first to fix this issue, building 
 an enterprise version of its Linux (called RHEL). By enterprise, they 

Well, that is simply not true. I know that many people think Linux = Red Hat 
(or Debian) and do not know much about other distributions. SUSE Linux 
Enterprise Server 7, released in October 2001, was the first enterprise 
Linux on the market. Red Hat Enterprise Linux 2.1 came out in March 2002. I 
just want to mention that, so no flame wars please. :-)

 meant an operating system which was stable, scalable, well supported 
 and with a distant EOL date. So, the paradigm for this kind of distro is 
 way different from the ones end users and small companies use. We can 
 sum up the basic changes in a few lines:
 
 - The life cycle of an enterprise distro is way longer than a normal 
 distro. To get the picture, just compare the EOL date of any Fedora Core 
 version with any version of Solaris, or even better, Windows. For that 
 matter, Windows 98 finally ended its cycle of life (EOL) this month. 
 RHEL usually has an EOL date of 5 years from the date it was launched 
 (and even more, if the market asks for it).
 
 - The enterprise distro is not meant to be bleeding edge. Alas, no 
 software is upgraded to the latest version. In truth, developers do the 
 most they can to keep using the same version of any package or lib for 
 the lifecycle of the product. We can break this statement in two parts:
 
   - package updates are made only when security bugs are found. 
 Also, the package is not updated to a newer version; instead, the 
 fix is backported to the current version. If you take a look at RHEL 
 4.x, you'll see that it contains thousands of packages. To stay feature 
 frozen, you must guarantee that all packages are 100% compatible between 
 updates (try to do an automatic update of squid 2.5-stable6 to squid 
 2.6-stable1, and see if it works at all).
 
   - developers take much more care about updates, and usually stay away 
 from functionality fixes (unless the bug makes the software useless). 
 So, the number of package updates is far greater on end user distros 
 compared to enterprise distros. Also, those updates are usually 
 delivered in batches (think of Microsoft service packs. RedHat does the 
 same with RHEL).

I can only second all your other statements. In the end, it's simply a 
question of manpower and money. The vendors have to decide what exactly they 
can support and maintain. So they (normally) concentrate on only one version 
of an application like Squid. So, if you rely on support for an application 
from the vendor of your enterprise Linux, you should stay with the version 
they provide. They won't support any other version of the software. If you 
want the latest (stable) technology, compile it yourself, create an RPM 
package, install it and get your support either from the community or an 
independent Linux company around the corner.

Regards,

Peter

-- 
Peter Albrecht, Novell Training Services, [EMAIL PROTECTED]


Re: [squid-users] (111) connection refused ERROR FOR SITES REQUIRING LOGIN

2006-07-27 Thread Henrik Nordstrom
tor 2006-07-27 klockan 13:45 +0530 skrev vinayan K P:
 Hello,
 
 Hope someone could help me.
 
 I am using a squid proxy (squid-2.5.STABLE13-1.FC4) behind another
 squid proxy and firewall.

Perhaps
http://wiki.squid-cache.org/SquidFaq/ConfiguringSquid#head-f7c4c667d4154ec5a9619044ef7d8ab94dfda39b

Regards
Henrik


signature.asc
Description: This is a digitally signed message part


RE: [squid-users] Squid stopped and restarted on its own: Why pls help. Version 2.6 Squid

2006-07-27 Thread Mehmet, Levent \(Accenture\)

Hi Henrik

Squid version 2.6 stable 1
Operating system Suse Linux 10.1
Description: Internet stopped working and within the logs these messages
were found:
15:41:05 FATAL: Received Segment Violation...dying.
  14:41:05 ctx: enter level 0: 

Attached is the coredump file


2006/07/25 15:00:06| sslReadServer: FD 106: read failure: (104)
Connection reset by peer
2006/07/25 15:06:16| sslReadServer: FD 175: read failure: (104)
Connection reset by peer
2006/07/25 15:12:24| httpReadReply: Excess data from GET
http://31442.r.msn.com/?type=1&cp=5050&sku=0
2006/07/25 15:15:28| httpReadReply: Excess data from GET
http://31442.r.msn.com/?type=1&cp=5050&sku=0
2006/07/25 15:19:58| sslReadServer: FD 110: read failure: (104)
Connection reset by peer
2006/07/25 15:20:56| sslReadServer: FD 205: read failure: (104)
Connection reset by peer
2006/07/25 15:22:35| sslReadServer: FD 275: read failure: (104)
Connection reset by peer
2006/07/25 15:23:17| urlParse: Illegal character in hostname
'www,pharmiweb.co.uk'
2006/07/25 15:41:05| clientProcessRequest2: ETag loop
FATAL: Received Segment Violation...dying.
2006/07/25 15:41:05| ctx: enter level  0:
'http://www.jules.lister.com/misc/images/car1.jpg'
2006/07/25 15:41:05| storeDirWriteCleanLogs: Starting...
2006/07/25 15:41:05| WARNING: Closing open FD7
2006/07/25 15:41:05| 65536 entries written so far.
2006/07/25 15:41:05|131072 entries written so far.
2006/07/25 15:41:05| WARNING: Closing open FD   10
2006/07/25 15:41:05|   Finished.  Wrote 172661 entries.
2006/07/25 15:41:05|   Took 0.2 seconds (926024.6 entries/sec).
CPU Usage: 22063.735 seconds = 7578.714 user + 14485.021 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 0
Memory usage for squid via mallinfo():
total space in arena:   83132 KB
Ordinary blocks:77585 KB   2039 blks
Small blocks:   0 KB  6 blks
Holding blocks:   208 KB  1 blks
Free Small blocks:  0 KB
Free Ordinary blocks:5546 KB
Total in use:   77793 KB 94%
Total free:  5547 KB 7%
2006/07/25 15:41:08| Starting Squid Cache version 2.6.STABLE1 for
i686-pc-linux-gnu...
2006/07/25 15:41:08| Process ID 29209
2006/07/25 15:41:08| With 1024 file descriptors available
2006/07/25 15:41:08| Performing DNS Tests...
2006/07/25 15:41:08| Successful DNS name lookup tests...
2006/07/25 15:41:08| DNS Socket created at 0.0.0.0, port 32807, FD 4
2006/07/25 15:41:08| Adding nameserver 192.168.201.10 from squid.conf
2006/07/25 15:41:08| Adding nameserver 62.244.177.177 from squid.conf
2006/07/25 15:41:08| Adding nameserver 62.244.176.176 from squid.conf
2006/07/25 15:41:08| Unlinkd pipe opened on FD 9
2006/07/25 15:41:08| Swap maxSize 2048000 KB, estimated 157538 objects
2006/07/25 15:41:08| Target number of buckets: 3150
2006/07/25 15:41:08| Using 8192 Store buckets
2006/07/25 15:41:08| Max Mem  size: 32768 KB
2006/07/25 15:41:08| Max Swap size: 2048000 KB
2006/07/25 15:41:08| Local cache digest enabled; rebuild/rewrite every
3600/3600 sec
2006/07/25 15:41:08| Store logging disabled
2006/07/25 15:41:08| Rebuilding storage in /home/squid/cache (CLEAN)
2006/07/25 15:41:08| Using Least Load store dir selection
2006/07/25 15:41:08| chdir: /var/spool/squid: (2) No such file or
directory
2006/07/25 15:41:08| Current Directory is /opt/squid/sbin
2006/07/25 15:41:08| Loaded Icons.
2006/07/25 15:41:09| Accepting proxy HTTP connections at 192.168.201.10,
port 3128, FD 10.
2006/07/25 15:41:09| Accepting ICP messages at 0.0.0.0, port 3130, FD
11.
2006/07/25 15:41:09| WCCP Disabled.
2006/07/25 15:41:09| Configuring Parent 51.63.241.220/80/80
2006/07/25 15:41:09| Configuring Parent 51.63.249.220/80/80
2006/07/25 15:41:09| Configuring Parent 192.168.201.8/3128/3130
2006/07/25 15:41:09| Ready to serve requests.
2006/07/25 15:41:09| Store rebuilding is  2.4% complete
2006/07/25 15:41:12| Done reading /home/squid/cache swaplog (172661
entries)
2006/07/25 15:41:12| Finished rebuilding storage from disk.
2006/07/25 15:41:12|172661 Entries scanned
2006/07/25 15:41:12| 0 Invalid entries.
2006/07/25 15:41:12| 0 With invalid flags.
2006/07/25 15:41:12|172520 Objects loaded.
2006/07/25 15:41:12| 0 Objects expired.
2006/07/25 15:41:12| 0 Objects cancelled.
2006/07/25 15:41:12|60 Duplicate URLs purged.
2006/07/25 15:41:12|81 Swapfile clashes avoided.
2006/07/25 15:41:12|   Took 4.0 seconds (43291.4 objects/sec).
2006/07/25 15:41:12| Beginning Validation Procedure
2006/07/25 15:41:12|   Completed Validation Procedure
2006/07/25 15:41:12|   Validated 172520 Entries
2006/07/25 15:41:12|   store_swap_size = 1842668k
2006/07/25 15:41:13| storeLateRelease: released 2 objects
2006/07/25 15:41:13| clientProcessRequest2: ETag loop
FATAL: Received Segment Violation...dying.
2006/07/25 15:41:13| ctx: enter level  0:
'http://www.jules.lister.com/ramblings.htm'
2006/07/25 15:41:13| 

RE: [squid-users] Squid stopped and restarted on its own: Why pls help. Version 2.6 Squid

2006-07-27 Thread Henrik Nordstrom
tor 2006-07-27 klockan 11:30 +0100 skrev Mehmet, Levent (Accenture):
 Hi Henrik
 
 Squid version 2.6 stable 1
 Operating system Suse Linux 10.1
 Description: Internet stopped working and within the logs these messages
 were found:
  15:41:05 FATAL: Received Segment Violation...dying.
 14:41:05 ctx: enter level 0: 

Please file a bug report following the instructions in the FAQ
http://wiki.squid-cache.org/SquidFaq/TroubleShooting#head-7067fc0034ce967e67911becaabb8c95a34d576d

 Attached is the coredump file

The coredump is only useful on your system. What you need to include in
the bug report is the extracted stack trace per the instruction above.

Regards
Henrik
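(For reference, extracting a stack trace from a core dump is typically done by loading the core into gdb against the squid binary; the paths below are hypothetical:)

```
# load the core file into gdb and print the backtrace
gdb /usr/sbin/squid /var/cache/squid/core
(gdb) backtrace
```

The resulting backtrace text is what belongs in the bug report, not the core file itself.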




Re: [squid-users] Blocking Searches with squidguard

2006-07-27 Thread Brian Gregory

Rikunj wrote:

Yes, it is.

url_regex: URL regular expression pattern matching
http://www.squid-cache.org/Doc/FAQ/FAQ-10.html#ss10.4

Rikunj



I don't think I'm going to get this just by looking at the 
documentation. I only understand technical things when I begin to see 
what was going on in the mind of the person who designed it. Here all 
the documentation seems to assume you already understand most of the 
concepts.


I'm using squid 2.5.STABLE10 and squidguard 1.2.0 on SuSE 10.0.

At present our squidguard.conf looks like this:

--BEGIN squidguard.conf-
logdir /var/log/squidGuard
dbhome /var/lib/squidGuard/db/blacklists

dest prospect-goodstuff {
domainlist  prospect-goodstuff/domains
urllist prospect-goodstuff/urls
expressionlist  prospect-goodstuff/expressions
}

dest prospect-badstuff {
domainlist  prospect-badstuff/domains
urllist prospect-badstuff/urls
expressionlist  prospect-badstuff/expressions
}

dest adult {
domainlist  adult/domains
urllist adult/urls
expressionlist  adult/expressions
expressionlist  adult/very_restrictive_expression
}
dest agressif {
domainlist  agressif/domains
urllist agressif/urls
expressionlist  agressif/expressions
}
dest audio-video {
domainlist  audio-video/domains
urllist audio-video/urls
expressionlist  audio-video/expressions
}
dest dangerous_material {
domainlist  dangerous_material/domains
urllist dangerous_material/urls
expressionlist  dangerous_material/expressions
}
...
dest warez {
domainlist  warez/domains
urllist warez/urls
expressionlist  warez/expressions
}
acl {
default {
		pass prospect-goodstuff !prospect-badstuff !adult !agressif 
!audio-video !dangerous_material ... !warez all
		redirect 
http://localhost/cgi-bin/squidGuard?clientaddr=%aclientname=%nclientuser=%iclientgroup=%surl=%u

}
}

---END squidguard.conf--

All the lists except the first two (names beginning prospect) are 
re-downloaded regularly by a cron job running as root which then does 
the following:


BEGIN

cd ~

chown -R squid:nogroup /var/lib/squidGuard/db
echo Compiling...
/usr/sbin/squidGuard -C all
chown -R squid:nogroup /var/lib/squidGuard/db
echo Reconfiguring...
/usr/sbin/squid -k reconfigure
chown -R squid:nogroup /var/lib/squidGuard/db
echo Done.

-END-

I have three problems at the moment.

1. Expressions I've added in .../prospect-badstuff/expressions appear to
be totally ignored.


- BEGIN .../prospect-badstuff/expressions 
(^|[-\.\?+=/_])(rude1|rude2|rude3|rude4|rude5|...)([-\.\?+=/_]|$)
(^|[-\.\?+=/_])(hot|hardcore|big|cyber|hard|huge|mega|small|soft|super|tiny)?(bad1|bad2|bad3|bad4|bad5|...|xxx+)s?([-\.\?+=/_]|$)
-- END .../prospect-badstuff/expressions -

For example URLs with rude1 in them are not blocked.


2. I'm unsure what is supposed to go in the domains files such as 
.../prospect-goodstuff/domains. By trial and error I've found adding 
both domain.com and www.domain.com works well enough but I really want 
to match domain.com and anything.domain.com. I found some documentation 
that suggested I put .domain.com in the domains file but that doesn't 
appear to match anything at all.


3. When blocked, squid attempts to redirect https: URLs to http:443. Can I 
make an error message show instead?



Please help.
TIA

--

Brian Gregory.
[EMAIL PROTECTED]

Computer Room Volunteer.
Therapy Centre.
Prospect Park Hospital.


[squid-users] HTML Caching

2006-07-27 Thread Andrew Yoward

Hi Folks,

Am I just blind, or can someone point me to the Wiki page that tells me 
how to turn off caching of HTML?


Thanks,

Andrew


Re: [squid-users] HTML Caching

2006-07-27 Thread Henrik Nordstrom
tor 2006-07-27 klockan 16:15 +0100 skrev Andrew Yoward:
 Hi Folks,
 
 Am I just blind, or can someone point me to the Wiki page that tells me 
 how to turn off caching of HTML?

See the cache directive.

acl html rep_mime_type -i text/html
cache deny html

Regards
Henrik




[squid-users] caching geoserver

2006-07-27 Thread Peppo Herney
Hello,

I hope someone can help me out on this:
I have set up a geoserver http://docs.codehaus.org/display/GEOS/Home
and I do not want it to generate the same map over and over again, because it 
consumes a lot of processor time. Therefore I installed squid and everything 
seems to work fine, except it does not cache anything. It is used in the httpd 
accelerator mode.
Here is some output:
access.log: 
1154016642.734   1328 129.26.149.240 TCP_MISS/200 199354 GET 
http://129.26.151.234:8080/geoserver/wms? - DIRECT/129.26.151.234 image/png
1154016617.453   1282 129.26.149.240 TCP_MISS/200 199354 GET 
http://129.26.151.234:8080/geoserver/wms? - DIRECT/129.26.151.234 image/png
store.log: 
1154016642.734 RELEASE -1  5A93FB21763780D7C3786C5BEF849EAD  200 
1154016642-1-1 image/png -1/199104 GET 
http://129.26.151.234:8080/geoserver/wms
1154016617.453 RELEASE -1  F3EB0D7BED4F626F1BB4877507AED140  200 
1154016617-1-1 image/png -1/199104 GET 
http://129.26.151.234:8080/geoserver/wms?
The next line has a different initial number.
Which header information makes squid cache a request, and how does it 
recognize a later request as being the same?
Is there any configuration I could change?
Thank you for helping me out.

greetings

Peppo 


Re: [squid-users] CentOS 4.3 and Squid Version

2006-07-27 Thread Domingos Parra Novo

Hiyas,

Peter Albrecht escreveu:
	For a long time, enterprise clients asked for an enterprise linux 
(and there were none). RedHat was the first to fix this issue, building 
an enterprise version of its Linux (called RHEL). By enterprise, they 


Well, that is simply not true. I know that many people think Linux = Red Hat 
(or Debian) and do not know much about other distributions. SUSE Linux 
Enterprise Server 7, released in October 2001, was the first enterprise 
Linux on the market. Red Hat Enterprise Linux 2.1 came out in March 2002. I 
just want to mention that, so no flame wars please. :-)


	Heh, my bad. I wasn't aware of the release dates, and at that time, had 
the PR people (from RedHat) saying they were the first. Gotta learn to 
never trust statements from any PR guy around. ;-)


	By the way, I suffered from the same problem (linux == redhat linux). 
Conectiva always had the karma of being a RedHat clone, even without 
gleaning from RH a single spec or package for years. Yes, we even 
reinvented the wheel sometimes. :)



Regards,

Peter


Regards,

Domingos.


[squid-users] Having to login multiple times??

2006-07-27 Thread Ryan McCain
Hello, we are just starting with squid. When a user opens a web browser
it will ask them to login if they go to yahoo.com, or whatever. If they
open a new browser session and go to another site, it will ask them
again for their user credentials.

Is it possible to have the login credentials to the squid server
somehow cached so the users won't have to login every time they open a
new instance of IE or Firefox?

We are running squid v2.5x on SLES 9. Any help or pointers to
documentation would be helpful.





Re: [squid-users] XP IE 6.x Machines Ignoring Proxy - Squid 2.5.14

2006-07-27 Thread eric . watters
It works fine if we manually configure the location of the PAC file in the 
browser.  However, our end users (about 2000 of them) won't go for having 
to uncheck that option if we push it out by GPO.  I hadn't had any luck 
googling the symptoms so I was hoping somebody on the list may have 
experienced this issue.  We never noticed it before because we hadn't been 
locking down the firewalls very tight.   So when we started forcing people 
through the proxy we didn't realize the issue because they would just go 
out direct to the web if they didn't go to the proxy.  With the Default 
Deny, if they initate any traffic to any Non-RFC 1918 space the traffic is 
dropped.  So that is when we noticed the issue.  Thoughts ?

Regards,

Eric Watters
Network Engineer
PRG Schultz
Desk: 770.779.3318
Cell:   404.247.0646






[EMAIL PROTECTED] 
07/25/2006 02:03 PM

To
squid-users@squid-cache.org
cc

Subject
[squid-users] XP IE 6.x  Machines Ignoring Proxy - Squid 2.5.14






Hello All, I have been rolling out a Default Deny policy on all my 
remotely connected VPN Sites.  This policy drops all non RFC-1918 IP 
space at the remote location's firewall.  We are auto-detecting via Group 
Policy.  The end users have no problem resolving wpad EVER.  They can ping 
wpad all the time.  However, half the time these users are going directly 
to the web instead of the proxy.  Consequently, unless I allow outbound 
http and https access on the remote firewall's access-list applied to the 
internal interface (remote LAN facing), I get inconsistent web access: 
Page Cannot Be Displayed a LOT of the time.  This happens EVERYWHERE, 
meaning all the remote locations where I have made the firewall change.  I will 
locate a user experiencing this issue and will debug on the remote 
firewall for their web traffic.  What I see is that for a few seconds the 
end user makes calls to the Virtual Address on the Load Balancers at our 
corporate office (as designed) and then suddenly see a flurry of traffic 
trying to access the Public IPs of, say, hotmail.com or yahoo.com.  I 
am stumped and have no idea why this is happening.

Regards,

Eric Watters
Network Engineer
PRG Schultz
Desk: 770.779.3318
Cell:   404.247.0646
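(A minimal PAC file of the kind served via wpad might look like the sketch below; the proxy host and internal range are hypothetical. IE falls back to going DIRECT when it cannot fetch or parse the auto-configuration file, which can produce exactly this kind of intermittent bypass:)

```
// wpad.dat / proxy.pac sketch: keep internal traffic direct,
// send everything else to the proxy
function FindProxyForURL(url, host) {
    if (isInNet(host, "10.0.0.0", "255.0.0.0"))
        return "DIRECT";
    return "PROXY proxy.example.com:3128";
}
```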





Re: [squid-users] XP IE 6.x Machines Ignoring Proxy - Squid 2.5.14

2006-07-27 Thread Covington, Chris
On Thu, Jul 27, 2006 at 12:45:33PM -0400, [EMAIL PROTECTED] wrote:
 It works fine if we manually configure the location of the PAC file in the 
 browser.  However, our end users (about 2000 of them) won't go for having 
 to uncheck that option if we push it out by GPO.  I hadn't had any luck 

I don't understand what you mean by this.  If you're doing it via GPO,
your users won't have to do anything.

The best way to solve the problem is to force the automatic
configuration script in the \User Configuration\Windows Settings\Internet 
Explorer Maintenance\Connection\Automatic Browser Configuration GPO, 
then disable the users' ability to get to that page in IE:

\User Configuration\Administrative Templates\Windows Components\Internet
Explorer\Disable changing proxy settings (Enabled) 

OR

If you set the "Disable the Connections page" policy (located in \User
Configuration\Administrative Templates\Windows Components\Internet
Explorer\Internet Control Panel), you do not need to set this policy,
because the "Disable the Connections page" policy removes the
Connections tab from the interface.

---
Chris Covington
IT
Plus One Health Management
75 Maiden Lane Suite 801
NY, NY 10038
646-312-6269
http://www.plusoneactive.com


[squid-users] block file download

2006-07-27 Thread amit ash


Hi everyone,

I was just searching on how to block file downloads through squid 
and got my hands on the code below. I implemented it on my linux 
server [suse 10 - squid 2.5] and it worked. I have read many 
queries in this forum regarding this same issue, so I am posting it 
for everyone.


Amit Ash


acl extndeny url_regex -i /etc/squid/extndeny
acl download method GET


http_access deny extndeny download
http_access deny extndeny
-- save and close --

Now let's create the extndeny file. This is the list of file 
extensions which we are blocking in Squid. Make a file and add 
these file extensions: vi /etc/squid/extndeny

\.ez$
\.hqx$
\.cpt$
\.dot$
\.wrd$
\.bin$
\.dms$
\.lha$
\.lzh$
\.ace$
\.r00$
\.r01$
\.exe$
\.wp5$
\.wk$
\.wz$
\.vcd$
\.bz2$
\.deb$
\.dvi$
\.tar$
\.gtar$
\.tgz$
\.gz$
\.bat$
\.rpm$
\.spm$
\.zip$
\.mid$
\.midi$
\.kar$
\.mpga$
\.mp2$
\.mp3$
\.ra$
\.dl$
\.fli$
\.gl$
\.mpe$
\.mpeg$
\.mpg$
\.qt$
\.mov$
\.avi$
\.movie$
\.wav$
\.au$
\.asf$
\.af$
\.asx$
\.afx$
\.divx$
\.m3u$
\.ram$
\.rm$
\.viv$
\.vivo$
\.vob$
\.vqf$
\.wma$
\.wmv$
\.vbs$
\.shs$
\.pif$
\.wpm$
\.wvx$

Now restart squid
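Before reloading Squid, you can sanity-check that the pattern file behaves as intended by matching sample URLs against it with grep (the file path and URLs here are just for illustration):

```shell
# each line of extndeny is an extended regex; grep -f applies them all
printf '%s\n' '\.exe$' '\.zip$' > /tmp/extndeny

# a URL ending in .exe matches a pattern and would be denied
echo 'http://example.com/setup.exe' | grep -qE -f /tmp/extndeny && echo blocked
# prints "blocked"

# an ordinary page matches nothing and would be allowed
echo 'http://example.com/page.html' | grep -qE -f /tmp/extndeny || echo allowed
# prints "allowed"
```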



Re: [squid-users] authentication

2006-07-27 Thread Paul
I have a doubt: what is the format of the iplist file? Can I use a range 
of IP addresses, and how?

thanks

On 27/07/2006, at 01:14 AM, Visolve Squid wrote:


Paul wrote:


Hi, I have configured my Squid with proxy_auth and all the computers
which use the internet go through this proxy (of course each user needs to
enter a login and password), but I have one machine on which it is not
possible to enter a password. Does anyone know how I can make an exception
for one user? Is it possible to bypass the Squid authentication for a
single user or IP address?


Hello Paul,

Yes. You can bypass the Squid authentication for one IP address by 
using the following ACL configuration in the squid.conf file.


auth_param basic program /usr/local/squid/libexec/ncsa_auth 
/usr/local/squid/etc/passwd

acl auth_users proxy_auth REQUIRED
acl restricted src /usr/local/squid/iplist
acl allow_user src 172.16.1.27
http_access allow allow_user
http_access allow auth_users restricted

--
Thanks,
Visolve Squid Team,
http://squid.visolve.com




[squid-users] Authenticate MAC Address from Mysql

2006-07-27 Thread Sebastian

Hi !
Can Squid authenticate by MAC Address reading from a Mysql table?

I can make it with User and Password (with squid2mysql tool) but not with 
MAC address or IP.


Is this possible?
thanks in advance!! 
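(Not from MySQL, but Squid itself can match MAC addresses directly via the arp ACL type when it was built with --enable-arp-acl; the addresses below are hypothetical:)

```
# matches the requesting host's Ethernet MAC address
# (only works for hosts on the same subnet as the proxy)
acl labmacs arp 00:11:22:33:44:55 11:22:33:44:55:66
http_access allow labmacs
```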





Re: [squid-users] caching geoserver

2006-07-27 Thread Henrik Nordstrom
tor 2006-07-27 klockan 18:14 +0200 skrev Peppo Herney:
 Hello,
 
 I hope someone can help me out on this:
 I have set up a geoserver http://docs.codehaus.org/display/GEOS/Home
 and I do not want it to generate the same map over and over again, because it 
 consumes a lot of processor time. Therefore I installed squid and everything 
 seems to work fine, except it does not cache anything. It is used in the 
 httpd accelerator mode.
 Here is some output:
 access.log: 
 1154016642.734   1328 129.26.149.240 TCP_MISS/200 199354 GET 
 http://129.26.151.234:8080/geoserver/wms? - DIRECT/129.26.151.234 image/png

1. Enable log_query_terms to have the full URL logged and make sure it's
identical.

2. Make sure you have not blocked caching of query URLs. The suggested
default config does. See the cache directive.

3. If still no luck, check with the cacheability check engine to make
sure the response isn't blocked from caching by the application.

4. If still in doubt, enable log_mime_hdrs and post a couple of entries
(two, max three) and we will try to help you find the correct knobs
to make Squid cache the object.

Regards
Henrik
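(Point 2 above refers to the stock squid.conf rule that stops query URLs, such as the /geoserver/wms? requests here, from being cached; a sketch of the lines to look for, based on the usual 2.5-era defaults:)

```
# the default config marks anything with "cgi-bin" or "?" as uncacheable:
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
# commenting out the "no_cache deny QUERY" line (or narrowing the regex)
# lets Squid consider these responses for caching again
```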




Re: [squid-users] authentication

2006-07-27 Thread Henrik Nordstrom
tor 2006-07-27 klockan 13:48 -0500 skrev Paul:
 I have a doubt, what is the format of iplist file, can I a range of IP 
 address? and how?

The format of included acl files is the same as in squid.conf, but with
a single entry per line. Meaning that any of the following forms can be
used:

 172.0.2.1  specific IP (*)
 172.0.2.1/32   specific IP
 172.0.2.1/255.255.255.255  specific IP
 172.0.2.0/24   Whole 172.0.2.X network
 172.0.2.0/255.255.255.0Whole 172.0.2.X network
 172.0.2.1-172.0.2.42   IP range (*)
 192.168.1.0-192.168.5.0/24 Network range
 192.168.1.0-192.168.6.0/255.255.255.0  Network range

Note: The forms marked with * are a bit magic in older versions of Squid:
if the address ends in .0, Squid then assumes it's a network.
Current versions always read it as an explicit IP if there is no mask
specified.

Regards
Henrik
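(A concrete iplist file combining these forms could look like the following, using the same illustrative addresses as the table above:)

```
# one entry per line, same syntax as src ACL values in squid.conf
172.0.2.1
172.0.2.0/24
172.0.2.1-172.0.2.42
```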




[squid-users] inject object into cache

2006-07-27 Thread Pranav Desai

Hello,

Is it possible to inject a specific object into the cache store and
associate it with a particular URL ?

E.g. a gif on the disk needs to be included in the cache store as say
http://www.google.com/logo.gif.
So, when someone accesses http://www.google.com/logo.gif, they will
get the gif that was on the disk.

Or maybe it can be included in a special user request (like PURGE),
where you would give the entire object in the user request.

Any suggestions ?

Thanks

-- Pranav

--
http://pd.dnsalias.org


[squid-users] Transparent SQUID and VLAN

2006-07-27 Thread Charles Regan

I want to use a transparent squid proxy (only web traffic).
This is easy I know; the problem is that I am using VLANs on my network.
My router has multiple subinterfaces configured, one for each VLAN.
The transparent bridge has subinterfaces configured for each VLAN.
Will it be possible to use a transparent squid proxy with this kind of setup?


Users on different VLANs --- SWITCH --- transparent bridge --- ROUTER with subinterfaces --- INTERNET

Let me know,
C.


Re: [squid-users] (111) connection refused ERROR FOR SITES REQUIRING LOGIN

2006-07-27 Thread vinayan K P

Dear Mr. Henrik,

Thanks a lot for your mail; I finally managed it. The
never_direct allow all
did it.

Thanks once again.

Vinayan

On 7/27/06, Henrik Nordstrom [EMAIL PROTECTED] wrote:

tor 2006-07-27 klockan 13:45 +0530 skrev vinayan K P:
 Hello,

 Hope someone could help me.

 I am using a squid proxy (squid-2.5.STABLE13-1.FC4) behind another
 squid proxy and firewall.

Perhaps
http://wiki.squid-cache.org/SquidFaq/ConfiguringSquid#head-f7c4c667d4154ec5a9619044ef7d8ab94dfda39b

Regards
Henrik







Re: [squid-users] Blocking Searches with squidguard

2006-07-27 Thread SM

At 04:55 27-07-2006, Brian Gregory wrote:
I don't think I'm going to get this just by looking at the 
documentation. I only understand technical things when I begin to 
see what was going on in the mind of the person who designed it. 
Here all the documentation seems to assume you already understand 
most of the concepts.


http://www.squidguard.org/config/

[snip]


I have two problems at the moment.

1. Expressions I've added in .../prospect-badstuff/expressions 
appear to be totally ignored.


- BEGIN .../prospect-badstuff/expressions 
(^|[-\.\?+=/_])(rude1|rude2|rude3|rude4|rude5|...)([-\.\?+=/_]|$)


The above should work.  See your SquidGuard log for any errors or warnings.

2. I'm unsure what is supposed to go in the domains files such as 
.../prospect-goodstuff/domains. By trial and error I've found adding 
both domain.com and www.domain.com works well enough but I really 
want to match domain.com and anything.domain.com. I found some 
documentation that suggested I put .domain.com in the domains file 
but that doesn't appear to match anything at all.


As the name implies, that file contains the list of domains.

Regards,
-sm 
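(To make the terse answer above a bit more concrete: a squidGuard domains file normally takes one bare domain per line, and an entry matches that domain plus every host under it, which covers the domain.com / anything.domain.com case from the question. The names below are placeholders:)

```
# matches example.com, www.example.com, foo.bar.example.com, ...
example.com
# likewise matches every host under this domain
intranet.example.net
```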



Re: [squid-users] Transparent SQUID and VLAN

2006-07-27 Thread Henrik Nordstrom
tor 2006-07-27 klockan 20:18 -0300 skrev Charles Regan:
 I want to use a transparent squid proxy (only web traffic).
 This is easy I know; the problem is that I am using VLANs on my network.
 My router has multiple subinterfaces configured, one for each VLAN.
 The transparent bridge has subinterfaces configured for each VLAN.
 Will it be possible to use a transparent squid proxy with this kind of setup?

Yes. The VLANs shouldn't make much difference. Effectively the same
thing as having many network interfaces.

Regards
Henrik
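(On Linux, the interception rule is typically the same whether the inbound interface is physical or a VLAN subinterface; a sketch with hypothetical interface names and a local Squid listening on port 3128:)

```
# redirect HTTP from each VLAN subinterface to the local Squid
iptables -t nat -A PREROUTING -i eth0.10 -p tcp --dport 80 -j REDIRECT --to-port 3128
iptables -t nat -A PREROUTING -i eth0.20 -p tcp --dport 80 -j REDIRECT --to-port 3128
```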

