[squid-users] Re: WCCP, Squid, ASA, HTTP redirect

2010-01-19 Thread samk
See Thread at: http://www.techienuggets.com/Detail?tx=33825 Posted on behalf of 
a User

WCCP2 on a Cisco ASA can redirect port 443.

Don't use the web-cache option; use a dynamic service number instead.
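A hedged sketch of what that can look like (the service number 70, router address, and interface name below are illustrative placeholders, not from this thread):

```
# squid.conf: register a dynamic WCCPv2 service covering port 443
wccp2_router 10.0.0.1
wccp2_service dynamic 70
wccp2_service_info 70 protocol=tcp flags=dst_ip_hash priority=240 ports=443
```

```
! ASA: redirect the same numbered service group on the inside interface
wccp 70
wccp interface inside 70 redirect in
```

The standard web-cache service (service 0) is defined as port 80 only, which is why HTTPS redirection needs a dynamic service.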

In Response To: 

I've googled and seen some stuff, but nothing that I can really make sense of.

We have successfully deployed (and it's working) 2 Squid transparent proxy
servers, both using WCCP to an ASA and acting as a failover pair (if Squid dies
on one proxy, the other one starts taking the redirects from the ASA). The only
problem is that we can't figure out how to get HTTPS requests redirected from
the ASA to the proxy (using WCCP). Does anyone know how this can be done? Do I
need to use dynamic services instead of the standard web-cache service for
WCCP? (I've tried, without success.)

I really can't imagine that all this WCCP with a web-cache cannot work with
HTTPS (that would suck).

- Nick



[squid-users] Re: squid fails

2008-05-01 Thread samk
See Thread at: http://www.techienuggets.com/Detail?tx=29460 Posted on behalf of 
a User

To ALL,

Dear Sir,



I have configured a Virtual IP in my firewall to map an external public IP to
the local internal Domino server, but while connecting to http://210.7.71.137/
through a browser it shows the following error:


ERROR

The requested URL could not be retrieved

While trying to retrieve the URL: http://210.7.71.137/

The following error was encountered:

    Connection to 210.7.71.137 Failed

The system returned:

    (110) Connection timed out

The remote host or network may be down. Please try the request again.

Your cache administrator is root.

Generated Thu, 01 May 2008 07:34:27 GMT by bsdel-dhcp1.business-standard.com
(squid/2.6.STABLE4)


Please help,

Rajib Patra,

CSE , 011-09313182571



In Response To: 

On Fri, 2008-03-21 at 04:42 +0100, troxlinux wrote:
> The following error was encountered:
>
>     Connection to 216.92.7.36 Failed
>
> The system returned:
>
>     (110) Connection timed out

This says that Squid started a connection to 216.92.7.36, but after 2
minutes (or whatever timeout your OS uses) the OS gave up on completing
the connection and reported "Error 110, connection timed out" back to
Squid.

So my first guess would be a (temporary?) networking error, or that the
requested site is in fact down.
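For context, a small Python sketch (hypothetical, not part of the thread) showing the same OS-level timeout Squid is reporting:

```python
import errno
import socket

# Squid's "(110) Connection timed out" is the OS errno ETIMEDOUT
# (110 on Linux) surfacing through a failed connect(). A minimal
# illustration of that failure mode:
def try_connect(host: str, port: int, timeout: float = 3.0) -> str:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "connected"
    except socket.timeout:
        return f"timed out after {timeout:.0f}s (no response to SYN)"
    except OSError as e:
        return f"failed: errno {e.errno} ({e.strerror})"

# The symbolic name behind error number 110 on Linux:
print(errno.errorcode[errno.ETIMEDOUT])  # ETIMEDOUT
```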

Regards
Henrik




[squid-users] Re: YouTube and other streaming media (caching)

2008-05-27 Thread samk
See Thread at: http://www.techienuggets.com/Detail?tx=32811 Posted on behalf of 
a User

We have about 600 users behind a squid 2.6.STABLE20 proxy, and YouTube
represents a big chunk of our bandwidth. Will this method work in 2.6, or is
2.7 needed? We tried to use 3.0 for a while, but suffered a proxy auth bug,
and when that was fixed, it was still unstable, so I went back to 2.6.

Thanks.

In Response To: 

On Thu, Apr 17, 2008 at 08:11:51AM +0800, Adrian Chadd wrote:
> The problem with caching Youtube (and other CDN content) is that
> the same content is found at lots of different URLs/hosts. This
> unfortunately means you'll end up caching multiple copies of the
> same content and (almost!) never see hits.
> 
> Squid-2.7 -should- be quite stable. I'd suggest just running it from
> source. Hopefully Henrik will find some spare time to roll 2.6.STABLE19
> and 2.7.STABLE1 soon so 2.7 will appear in distributions.

Thanks Adrian.  FYI I got this to work with 2.7 (latest) based off the
instructions you provided earlier.  Here is my final config and the
perl script used to generate the storage URL:

  http_port 3128
  append_domain .esri.com
  acl apache rep_header Server ^Apache
  broken_vary_encoding allow apache
  maximum_object_size 4194240 KB
  maximum_object_size_in_memory 1024 KB
  access_log /usr/local/squid/var/logs/access.log squid

  # Some refresh patterns including YouTube -- although YouTube probably needs
  # to be adjusted.
  refresh_pattern ^ftp:           1440    20%     10080
  refresh_pattern ^gopher:        1440    0%      1440
  refresh_pattern -i \.flv$       10080   90%     99 ignore-no-cache override-expire ignore-private
  refresh_pattern ^http://sjl-v[0-9]+\.sjl\.youtube\.com 10080 90% 99 ignore-no-cache override-expire ignore-private
  refresh_pattern get_video\?video_id 10080 90% 99 ignore-no-cache override-expire ignore-private
  refresh_pattern youtube\.com/get_video\? 10080 90% 99 ignore-no-cache override-expire ignore-private
  refresh_pattern .               0       20%     4320

  acl all src 0.0.0.0/0.0.0.0
  acl esri src 10.0.0.0/255.0.0.0
  acl manager proto cache_object
  acl localhost src 127.0.0.1/255.255.255.255
  acl to_localhost dst 127.0.0.0/8
  acl SSL_ports port 443
  acl Safe_ports port 80  # http
  acl Safe_ports port 21  # ftp
  acl Safe_ports port 443 # https
  acl Safe_ports port 70  # gopher
  acl Safe_ports port 210 # wais
  acl Safe_ports port 1025-65535  # unregistered ports
  acl Safe_ports port 280 # http-mgmt
  acl Safe_ports port 488 # gss-http
  acl Safe_ports port 591 # filemaker
  acl Safe_ports port 777 # multiling http
  acl CONNECT method CONNECT
  # Some Youtube ACL's
  acl youtube dstdomain .youtube.com .googlevideo.com .video.google.com .video.google.com.au
  acl youtubeip dst 74.125.15.0/24 
  acl youtubeip dst 64.15.0.0/16
  cache allow youtube
  cache allow youtubeip
  cache allow esri

  # These are from http://wiki.squid-cache.org/Features/StoreUrlRewrite
  acl store_rewrite_list dstdomain mt.google.com mt0.google.com mt1.google.com mt2.google.com
  acl store_rewrite_list dstdomain mt3.google.com
  acl store_rewrite_list dstdomain kh.google.com kh0.google.com kh1.google.com kh2.google.com
  acl store_rewrite_list dstdomain kh3.google.com
  acl store_rewrite_list dstdomain kh.google.com.au kh0.google.com.au kh1.google.com.au
  acl store_rewrite_list dstdomain kh2.google.com.au kh3.google.com.au

  # This needs to be narrowed down quite a bit!
  acl store_rewrite_list dstdomain .youtube.com

  storeurl_access allow store_rewrite_list
  storeurl_access deny all

  storeurl_rewrite_program /usr/local/bin/store_url_rewrite

  http_access allow manager localhost
  http_access deny manager
  http_access deny !Safe_ports
  http_access deny CONNECT !SSL_ports
  http_access allow localhost
  http_access allow esri
  http_access deny all
  http_reply_access allow all
  icp_access allow all
  coredump_dir /usr/local/squid/var/cache

  # YouTube options.
  quick_abort_min -1 KB

  # This will block other streaming media.  Maybe we don't want this, but using
  # it for now.
  hierarchy_stoplist cgi-bin ?
  acl QUERY urlpath_regex cgi-bin \?
  cache deny QUERY

And here is the store_url_rewrite script.  I added some logging:

  #!/usr/bin/perl

  use IO::File;
  use IO::Socket::INET;
  use IO::Pipe;

  $| = 1;

  $fh = new IO::File("/tmp/debug.log", "a");

  $fh->print("Hello!\n");
  $fh->flush();

  while (<>) {
  chomp;
  #print LOG "Orig URL: " . $_ . "\n";
  $fh->print("Orig URL: " . $_ . "\n");
  if (m/kh(.*?)\.google\.com(.*?)\/(.*?) /) {
          print "http://keyhole-srv.google.com" . $2 . ".SQUIDINTERNAL/" . $3 . "\n";
          # print STDERR "KEYHOLE\n";
  } elsif (m/mt(.*?)\.google\.com(.*?)\/(.*?) /) {
          print "http://map-srv.google.com" . $2 . ".SQUIDINTERNAL/" . $3 . "\n";
  } else {
          print $_ . "\n";   # pass other URLs through unchanged
  }
  }
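For readers not fluent in perl, the same normalization can be sketched in Python (a hypothetical re-implementation for illustration only; the regexes are simplified relative to the script above):

```python
import re

# Collapse the many mirror hostnames (kh0..kh3, mt0..mt3, etc.) onto one
# canonical ".SQUIDINTERNAL" store key, so identical content fetched from
# different mirrors is cached only once.
def store_url(url: str) -> str:
    m = re.match(r'http://kh\d*\.google\.com(\.au)?(:\d+)?/(.*)', url)
    if m:
        return "http://keyhole-srv.google.com.SQUIDINTERNAL/" + m.group(3)
    m = re.match(r'http://mt\d*\.google\.com(\.au)?(:\d+)?/(.*)', url)
    if m:
        return "http://map-srv.google.com.SQUIDINTERNAL/" + m.group(3)
    return url  # unmatched URLs pass through unchanged
```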

[squid-users] Re: missing cachemgr.cgi

2008-08-17 Thread samk
See Thread at: http://www.techienuggets.com/Detail?tx=31484 Posted on behalf of 
a User

apt-get install squid-cgi


In Response To: 

Dear Squid-users,

This might be silly, but I cannot seem to find cachemgr.cgi on my Ubuntu 7.10.
I tried to find it using find / -name 'cachemgr.cgi', with no luck. Did I make
a mistake when compiling the Squid source?

any help would be appreciated,

thanks in advance,

Regards,
Yudi




  




[squid-users] Cannot get squid 2.6 in reverse-proxy to "not send cache when peer is dead"

2008-10-16 Thread samk
See Thread at: http://www.techienuggets.com/Detail?tx=56772 Posted on behalf of 
a User

All,

I really need help here, and this has got to be a real simple problem, just not 
easy to lay out for you all.

I am using Squid 2.6 as a reverse proxy for our webservers.
Our webservers get rebooted every night, and during that downtime, we send 
users to a "sorry server".
We are using a Cisco CSS device to route the traffic to the sorry server, when 
it detects that the webservers are down.

The problem I am having is that when the webservers go down, the Squid server
delivers content from its cache instead of a 404.
This is causing half-loaded webpages instead of the SORRY SERVER page.


###CONFIG##
cache_peer 12.xxx.xxx.xxx parent 80 0 no-query originserver name=foo
acl sites_foo dstdomain www1.foobar.com www.foobar.com
cache_peer_access foo allow sites_foo
cache_peer_access foo deny all

acl foo_networks src 12.xxx.xxx.xxx/27
http_access allow foo_networks

http_port 12.xxx.xxx.xxx:80 accel defaultsite=www1.foobar.com
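No fix appears in the thread; one hedged idea (an untested sketch relying on Squid 2.6's documented refresh_pattern behaviour and cache_peer monitor options, if your build has them) is to force revalidation so cache hits cannot paper over a dead origin:

```
# Sketch only: make cached objects stale immediately, so every request
# is revalidated with the origin; when the origin is down, the request
# fails instead of being served half from cache.
refresh_pattern . 0 0% 0

# Optionally let Squid probe the origin so it marks the peer dead quickly
# (monitorurl/monitorinterval are cache_peer options; the probe URL here
# is a placeholder).
cache_peer 12.xxx.xxx.xxx parent 80 0 no-query originserver name=foo \
    monitorurl=http://www1.foobar.com/alive.html monitorinterval=30
```

Note that forcing revalidation on every request sacrifices most of the caching benefit, so this is a trade-off rather than a clean fix.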





[squid-users] squid running problem with aufs

2008-11-14 Thread samk
See Thread at: http://www.techienuggets.com/Detail?tx=60849 Posted on behalf of 
a User

FATAL: Bungled squid.conf line 2: cache_dir aufs 
Squid Cache (Version 3.0.STABLE10): Terminated abnormally.
CPU Usage: 0.005 seconds = 0.001 user + 0.004 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 0


When I change aufs to ufs, the problem is solved!
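For reference, "Bungled" at that line usually means cache_dir aufs was given without its required arguments, or the binary was built without aufs support (the build's --enable-storeio list must include aufs). A minimal sketch, with placeholder path and sizes:

```
# cache_dir <type> <directory> <size-in-MB> <L1-dirs> <L2-dirs>
cache_dir aufs /usr/local/squid/var/cache 1024 16 256
```

If the same full line still fails, check squid -v output to confirm aufs was compiled in.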





[squid-users] Re: Accessing OWA2007 reverse-proxied by ISA Server

2008-11-22 Thread samk
See Thread at: http://www.techienuggets.com/Detail?tx=58290 Posted on behalf of 
a User

I think there is something wrong in this OWA server setup.
Check with nslookup whether OWA2007 has round-robin DNS active:

C:\>nslookup mail.telecomitalia.it
Server:  dns1.tiscaliia.it
Address:  213.205.32.10

Non-authoritative answer:
Name:    mail.telecomitalia.it
Addresses:  156.54.233.103, 156.54.233.102

If it is active, as in my example, use this in your squid configuration (squid.conf):
balance_on_multiple_ip off

A round-robin configuration for an OWA front-end is the wrong
solution, because OWA is a session-based web application.

Lodovico Bertolini 


In Response To: 

Hi folks,

I'm experiencing some trouble accessing an OWA 2007 server, located
behind an ISA reverse proxy, through our Squid 2.6ST18 proxy.

When I try to access it, IE or Firefox keeps waiting for data to
transfer; the page stays blank and it never times out or shows
any other error.

Squid's log shows lines suggesting that some traffic is transferred:

1225194824.927   1042 10.1.103.104 TCP_MISS/200 3088 CONNECT OBFUSCATED_URL:443 - DIRECT/84.14.218.217 -
1225194825.050    118 10.1.103.104 TCP_MISS/200 1056 CONNECT OBFUSCATED_URL:443 - DIRECT/84.14.218.217 -
1225194825.185    133 10.1.103.104 TCP_MISS/200 4524 CONNECT OBFUSCATED_URL:443 - DIRECT/84.14.218.217 -
1225194825.367    168 10.1.103.104 TCP_MISS/200 2371 CONNECT OBFUSCATED_URL:443 - DIRECT/84.14.218.217 -
1225194849.719  24520 10.1.103.104 TCP_MISS/200 155 CONNECT OBFUSCATED_URL:443 - DIRECT/84.14.218.217 -

A direct connection (I mean bypassing our Squid proxy) to this website works fine.

As I had a similar problem some time ago on another OWA installation,
I tried deactivating the "Accept-Encoding" header in the Squid proxy, but
the problem remains the same.
Our Squid installation is quite basic: we don't authenticate our
clients, and only a small cache is maintained on the fly.

For those who would give a try, I pasted the URL at
http://pastebin.com/m44904a19 to prevent it from showing up in mailing
lists archives, as this server belongs to a partner.
Please do not paste it in your replies.

Thanks for any help.

Momo