[squid-users] Deny facebook chat access for my network

2010-07-12 Thread Harish Pokharel
I have tried many approaches to deny Facebook chat access using Squid, but I
am still unsuccessful. Does anyone have an idea for denying access to only
the chat section of Facebook?

Regards,
Harish


[squid-users] squidGuard Stopped

2010-07-12 Thread squidACL

Good Day

I use squidGuard for filtering and it works well, but roughly every two days
squidGuard stops.

I ran squidGuard -C all and squid -k reconfigure.

What can I do to keep squidGuard running?

I would be thankful if you could help me with this issue.

2010-07-12 09:36:24 [12169] New setting: dbhome: /var/squidGuard/blacklists
2010-07-12 09:36:24 [12169] New setting: logdir: /var/log/squid
2010-07-12 09:36:24 [12169] init domainlist
/var/squidGuard/blacklists/adult/domains
2010-07-12 09:36:24 [12169] loading dbfile
/var/squidGuard/blacklists/adult/domains.db
2010-07-12 09:36:24 [12166] init urllist
/var/squidGuard/blacklists/hacking/urls
2010-07-12 09:36:24 [12166] loading dbfile
/var/squidGuard/blacklists/hacking/urls.db
2010-07-12 09:36:24 [12166] init domainlist
/var/squidGuard/blacklists/games/domains
2010-07-12 09:36:24 [12166] loading dbfile
/var/squidGuard/blacklists/games/domains.db
2010-07-12 09:36:24 [12166] init urllist
/var/squidGuard/blacklists/games/urls
2010-07-12 09:36:24 [12166] loading dbfile
/var/squidGuard/blacklists/games/urls.db
2010-07-12 09:36:24 [12166] init domainlist
/var/squidGuard/blacklists/audio-video/domains
2010-07-12 09:36:24 [12166] loading dbfile
/var/squidGuard/blacklists/audio-video/domains.db
2010-07-12 09:36:24 [12166] init urllist
/var/squidGuard/blacklists/audio-video/urls
2010-07-12 09:36:24 [12166] loading dbfile
/var/squidGuard/blacklists/audio-video/urls.db
2010-07-12 09:36:24 [12166] init domainlist
/var/squidGuard/blacklists/redirector/domains
2010-07-12 09:36:24 [12166] loading dbfile
/var/squidGuard/blacklists/redirector/domains.db
2010-07-12 09:36:24 [12166] init urllist
/var/squidGuard/blacklists/redirector/urls
2010-07-12 09:36:24 [12166] loading dbfile
/var/squidGuard/blacklists/redirector/urls.db
2010-07-12 09:36:24 [12168] init domainlist
/var/squidGuard/blacklists/warez/domains
2010-07-12 09:36:24 [12168] loading dbfile
/var/squidGuard/blacklists/warez/domains.db
2010-07-12 09:36:24 [12168] init urllist
/var/squidGuard/blacklists/warez/urls
2010-07-12 09:36:24 [12168] loading dbfile
/var/squidGuard/blacklists/warez/urls.db
2010-07-12 09:36:24 [12168] init domainlist
/var/squidGuard/blacklists/chat/domains
2010-07-12 09:36:24 [12168] loading dbfile
/var/squidGuard/blacklists/chat/domains.db
2010-07-12 09:36:24 [12167] init domainlist
/var/squidGuard/blacklists/aggressive/domains
2010-07-12 09:36:24 [12167] loading dbfile
/var/squidGuard/blacklists/aggressive/domains.db
2010-07-12 09:36:24 [12167] init urllist
/var/squidGuard/blacklists/aggressive/urls
2010-07-12 09:36:24 [12167] loading dbfile
/var/squidGuard/blacklists/aggressive/urls.db
2010-07-12 09:36:24 [12167] init domainlist
/var/squidGuard/blacklists/blog/domains
2010-07-12 09:36:24 [12167] loading dbfile
/var/squidGuard/blacklists/blog/domains.db
2010-07-12 09:36:24 [12167] init urllist
/var/squidGuard/blacklists/blog/urls
2010-07-12 09:36:24 [12167] loading dbfile
/var/squidGuard/blacklists/blog/urls.db
2010-07-12 09:36:24 [12167] init domainlist
/var/squidGuard/blacklists/hacking/domains
2010-07-12 09:36:24 [12167] loading dbfile
/var/squidGuard/blacklists/hacking/domains.db
2010-07-12 09:36:24 [12167] init urllist
/var/squidGuard/blacklists/hacking/urls
2010-07-12 09:36:24 [12167] loading dbfile
/var/squidGuard/blacklists/hacking/urls.db
2010-07-12 09:36:24 [12169] init urllist
/var/squidGuard/blacklists/adult/urls
2010-07-12 09:36:24 [12169] loading dbfile
/var/squidGuard/blacklists/adult/urls.db
2010-07-12 09:36:24 [12169] init domainlist
/var/squidGuard/blacklists/publicite/domains
2010-07-12 09:36:24 [12169] loading dbfile
/var/squidGuard/blacklists/publicite/domains.db
2010-07-12 09:36:24 [12169] init urllist
/var/squidGuard/blacklists/publicite/urls
2010-07-12 09:36:24 [12169] loading dbfile
/var/squidGuard/blacklists/publicite/urls.db
2010-07-12 09:36:24 [12169] init domainlist
/var/squidGuard/blacklists/warez/domains
2010-07-12 09:36:24 [12169] loading dbfile
/var/squidGuard/blacklists/warez/domains.db
2010-07-12 09:36:24 [12169] init urllist
/var/squidGuard/blacklists/warez/urls
2010-07-12 09:36:24 [12169] loading dbfile
/var/squidGuard/blacklists/warez/urls.db
2010-07-12 09:36:24 [12170] New setting: dbhome: /var/squidGuard/blacklists
2010-07-12 09:36:24 [12170] New setting: logdir: /var/log/squid
2010-07-12 09:36:24 [12170] init domainlist
/var/squidGuard/blacklists/adult/domains
2010-07-12 09:36:24 [12170] loading dbfile
/var/squidGuard/blacklists/adult/domains.db
2010-07-12 09:36:24 [12170] init urllist
/var/squidGuard/blacklists/adult/urls
2010-07-12 09:36:24 [12170] loading dbfile
/var/squidGuard/blacklists/adult/urls.db
2010-07-12 09:36:24 [12166] init domainlist
/var/squidGuard/blacklists/strong_redirector/domains
2010-07-12 09:36:24 [12166] loading dbfile
/var/squidGuard/blacklists/strong_redirector/domains.db
2010-07-12 09:36:24 [12166] init urllist
/var/squidGuard/blacklists/strong_redirector/urls
2010-07-12 09:36:24 [12166] 
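One common cause in setups like this (an assumption; the truncated log above
does not confirm it) is squidGuard dropping into emergency mode because one of
the blacklist .db files is unreadable by the user Squid runs as. Rebuilding the
databases, fixing ownership, and reconfiguring often helps; the squid user and
group names below are a guess for this system:

squidGuard -C all
chown -R squid:squid /var/squidGuard/blacklists
squid -k reconfigure

Checking /var/log/squid/squidGuard.log for the phrase "going into emergency
mode" would confirm whether this is what is happening.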

Re: [squid-users] Deny facebook chat access for my network

2010-07-12 Thread Struzik Wojciech
Try this:

http://www.buttonclicker.com/2010/06/04/block-facebook-chat-using-squid-acl/


Maybe this will be helpful too:
http://www.geekride.com/index.php/block-gmail-chat-gtalk-google-talk-squid-proxy/


On Mon, Jul 12, 2010 at 10:00 AM, Harish Pokharel
harish.pokha...@gmail.com wrote:
 I have tried many approaches to deny Facebook chat access using Squid, but I
 am still unsuccessful. Does anyone have an idea for denying access to only
 the chat section of Facebook?

 Regards,
 Harish



[squid-users] Check for size of webpage

2010-07-12 Thread Rene Wijninga
Hi,

We are using Squid as a reverse proxy, and it works like a breeze.
We recently also deployed Squid as a reverse proxy for our Citrix
environment to load-balance our Provisioning servers. It works perfectly, but...

Sometimes the XML file sent by the provisioning servers has a length of
zero bytes and is corrupt. This has nothing to do with Squid: when we
connect to the provisioning servers directly, we get the same result.

What I would like to do is have Squid inspect the length of the file. My
three questions are:

1) Can Squid inspect the length of the XML page?
2) Can Squid inspect the XML page for certain content?
3) Can Squid load-balance with detection of non-responsive servers, or of
servers that give back an XML file with length 0?

Sorry if this has been asked before, I just couldn't find it.

Regards,
Rene

[squid-users] Problem accessing site with variable in URL

2010-07-12 Thread Baird, Josh
Hi,

I have a Squid 2.6STABLE-21 (EL5) forward proxy that is having problems
with one site:

http://gw.vtrenz.net/?DPO95NI5KU

It looks like Squid is dropping the text in the URL after the '?',
causing the remote website to return incorrect data:

1278944523.919   1223 172.26.103.175 TCP_MISS/500 1723 GET
http://gw.vtrenz.net/? - DIRECT/74.112.68.36 text/html

Is it normal for access_log to drop query strings like this, or is Squid
really requesting the URL without the text after the '?'? Is anyone else
able to reproduce this?

Thanks,

Josh


[squid-users] ntlm locking user accounts in 2003 AD

2010-07-12 Thread Stacker Hush
Hello to all,

I'm having a problem with this environment:
Squid 2.7.STABLE7
Samba 3.4.7

Squid.conf
auth_param ntlm program /usr/bin/ntlm_auth
--helper-protocol=squid-2.5-ntlmssp
auth_param ntlm children 5
auth_param ntlm keep_alive on

auth_param basic program /usr/bin/ntlm_auth
--helper-protocol=squid-2.5-ntlmssp
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours

smb.conf

workgroup = domain
netbios name = NETSERVER
server string = PROXY SERVER
load printers = no
log file = /var/log/samba/log.%m
max log size = 500
winbind trusted domains only = yes
realm = domain.ltd
security = ads
auth methods = winbind
password server = Server.domain.ltd
winbind separator = +
encrypt passwords = yes
winbind cache time = 3600
winbind enum users = yes
winbind enum groups = yes
winbind use default domain = false
idmap uid = 1-2
idmap gid = 1-2
local master = no
os level = 233
domain master = no
preferred master = no
domain logons = no
wins server = 10.0.0.249, 10.0.0.250
dns proxy = no
ldap ssl = no
load printers = no
template shell = /sbin/nologin


The problem is that when a user requests web pages, I get a lot of 680
(logon) events in the Windows Security event log, seconds apart, and
sometimes the user account gets locked. I suppose the account is locked
because the user generates a lot of authentication requests.

Is there a way to fix this?

Thanks,
Stacker
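One observation, offered as a guess rather than a confirmed diagnosis: in the
squid.conf quoted above, the basic scheme is launched with
--helper-protocol=squid-2.5-ntlmssp. Per the Samba ntlm_auth documentation, the
basic-auth helper normally uses the squid-2.5-basic protocol, and a mismatch
there can make every basic-auth attempt fail and show up as repeated logon
events against the domain controller. A corrected fragment would look like:

auth_param basic program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-basic
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours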



Re: [squid-users] Check for size of webpage

2010-07-12 Thread Amos Jeffries

Rene Wijninga wrote:

Hi,

We are using Squid as a reverse proxy, and it works like a breeze.
We recently also deployed Squid as a reverse proxy for our Citrix
environment to load-balance our Provisioning servers. It works perfectly, but...

Sometimes the XML file sent by the provisioning servers has a length of
zero bytes and is corrupt. This has nothing to do with Squid: when we
connect to the provisioning servers directly, we get the same result.

What I would like to do is have Squid inspect the length of the file. My
three questions are:

1) Can Squid inspect the length of the XML page?


Squid already verifies that the Content-Length: header matches the body size.
There is nothing else that can be done.



2) Can Squid inspect the XML page for certain content?


No. Though eCAP/ICAP plugins can.
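For completeness, wiring Squid 3.1 to an ICAP response-modification service
looks roughly like the fragment below. The service name, port, and URI path
are placeholders, and the ICAP server itself (which would do the actual XML
inspection) is a separate piece of software:

icap_enable on
icap_service xml_check respmod_precache bypass=0 icap://127.0.0.1:1344/respmod
adaptation_access xml_check allow all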


3) Can Squid load-balance with detection of non-responsive servers, or of
servers that give back an XML file with length 0?


Non-responsive servers: yes.
Sending zero-length files: no; zero is normally a valid file length.

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.5


Re: [squid-users] Problem accessing site with variable in URL

2010-07-12 Thread Amos Jeffries

Baird, Josh wrote:

Hi,

I have a Squid 2.6STABLE-21 (EL5) forward proxy that is having problems
with one site:

http://gw.vtrenz.net/?DPO95NI5KU

It looks like Squid is dropping the text in the URL after the '?',
causing the remote website to return incorrect data:

1278944523.919   1223 172.26.103.175 TCP_MISS/500 1723 GET
http://gw.vtrenz.net/? - DIRECT/74.112.68.36 text/html

Is it normal for access_log to drop query strings like this, or is Squid
really requesting the URL without the text after the '?'? Is anyone else
able to reproduce this?


Squid does not normally log the query string, since it can be many KB long.
It still gets passed along in the transaction, though.

Configure:  strip_query_terms off
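In context, the relevant squid.conf fragment would be as follows; the log path
shown is the Squid default and may differ on this system:

strip_query_terms off
access_log /var/log/squid/access.log squid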

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.5


Re: [squid-users] Deny facebook chat access for my network

2010-07-12 Thread Luis Daniel Lucio Quiroz
On Monday 12 July 2010 08:57:36, Struzik Wojciech wrote:
 Try this:
 
 http://www.buttonclicker.com/2010/06/04/block-facebook-chat-using-squid-acl/
 
 
 Maybe this will be helpful too:
 http://www.geekride.com/index.php/block-gmail-chat-gtalk-google-talk-squid-proxy/
 
 
 On Mon, Jul 12, 2010 at 10:00 AM, Harish Pokharel
 
 harish.pokha...@gmail.com wrote:
  I have tried many approaches to deny Facebook chat access using Squid, but
  I am still unsuccessful. Does anyone have an idea for denying access to
  only the chat section of Facebook?
  
  Regards,
  Harish
You may define an ACL like this:
acl FBchat url_regex -i ^http://0.channel..\.facebook.com

From my analysis, this ACL is hit every time you try to send a message.
However, don't forget that FB already has Jabber support, so you also need
to block Jabber, though that is easier to do with a firewall.
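Wired into a configuration, the suggestion above would look something like
the fragment below. The regex is copied verbatim from the post, and the
placement of the deny rule relative to other http_access rules is an
assumption (deny rules must come before any broader allow):

acl FBchat url_regex -i ^http://0.channel..\.facebook.com
http_access deny FBchat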

LD


[squid-users] Define some users only can access to some website

2010-07-12 Thread Donny Christiaan
Dear Expert,

I'm looking for a Squid configuration that allows certain users to access
only listed URLs.

Let's say I would like user donny to be able to access only:
www.yahoo.com
www.google.com
www.freshmeat.net

So user donny can access only the three URLs listed above, nothing more.

I'm using:
- Squid Cache: Version 2.6.STABLE6
- mysql_auth
- SquidGuard

Best Regards,
Donny Christiaan.


Re: [squid-users] Define some users only can access to some website

2010-07-12 Thread Scott Horsley



On 13/07/10 1:16 PM, Donny Christiaan dchristi...@gmail.com wrote:

 Dear Expert,
 
 I'm looking for a Squid configuration that allows certain users to access
 only listed URLs.
 
 Let's say I would like user donny to be able to access only:
 www.yahoo.com
 www.google.com
 www.freshmeat.net
 
 So user donny can access only the three URLs listed above, nothing more.
 
 I'm using:
 - Squid Cache: Version 2.6.STABLE6
 - mysql_auth
 - SquidGuard
 
 Best Regards,
 Donny Christiaan.
 

Hi Donny,

This looks very similar to an email you sent to this list last week.

This setup should work.

acl my_sites dstdomain www.yahoo.com www.google.com www.freshmeat.net
acl restricted_users proxy_auth donny

http_access allow restricted_users my_sites
http_access deny restricted_users

This setup should allow restricted_users to access my_sites, then deny
restricted_users from accessing anything else.

References:
http://www.visolve.com/squid/squid30/accesscontrols.php#acl
http://www.visolve.com/squid/squid30/accesscontrols.php#http_access

This is, however, untested in a real configuration, but it seems fairly
logical. It is also certainly not the only approach to this problem.
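Putting the pieces together with authentication, a fuller sketch might look
like the following. The mysql_auth helper path is a guess, since the original
post does not show it, and proxy_auth ACLs only work once an auth_param
scheme is configured:

auth_param basic program /usr/libexec/mysql_auth
auth_param basic children 5
acl my_sites dstdomain www.yahoo.com www.google.com www.freshmeat.net
acl restricted_users proxy_auth donny
http_access allow restricted_users my_sites
http_access deny restricted_users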

Scott

