Re: [squid-users] Firefox road warrior autoconfig

2009-06-13 Thread Amos Jeffries

Jose Ildefonso Camargo Tolosa wrote:

Hi!

On 6/14/09, Yan Seiner  wrote:

This is not really a squid question, but it is related and I can't find any
info on this.

 How does one go about configuring firefox to autoconfig a proxy when at
home and not when on the road?

 In other words, for a laptop, how do I use a proxy in a location where it's
required and not elsewhere without reconfiguring the browser?


If you are the sysadmin for the net that requires the proxy, you could
use proxy autoconfiguration.  This would require a combination of DHCP
and/or DNS (depending on the network configuration).  I used to suffer
from this: I move from client to client, every client has its own proxy
config, and I'm used to reconfiguring the browser for every place I go.



Which is all documented here:
http://wiki.squid-cache.org/SquidFaq/ConfiguringBrowsers
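
For the home-vs-road case specifically, a minimal proxy.pac along these
lines can work; the subnet and proxy name below are placeholders for the
actual home network:

  function FindProxyForURL(url, host) {
    // Use the proxy only when the laptop holds an address on the
    // (hypothetical) home subnet; go direct everywhere else.
    if (isInNet(myIpAddress(), "192.168.1.0", "255.255.255.0"))
      return "PROXY proxy.home.lan:3128";
    return "DIRECT";
  }

Firefox then loads it via "Automatic proxy configuration URL", or
discovers it through WPAD when "Auto-detect proxy settings" is enabled.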

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE15
  Current Beta Squid 3.1.0.8 or 3.0.STABLE16-RC1


Re: [squid-users] acl maxconn per file or url

2009-06-13 Thread Amos Jeffries

MonzT wrote:

Jeff Pang :

You could run another, separate Apache instance for those huge files,
and set "MaxClients" in httpd.conf to a reasonable value.


Thanks Jeff for replying...
but I meant people downloading large files from the internet, sucking up
as much bandwidth as they can, not files from our internal server.




maxconn only works on IPs.

collapsed_forwarding limits outbound connections to one per URL and
shares the result with as many clients as are requesting it.

delay pools can be configured alongside maxconn and object-type ACLs to
throttle files over a certain size down to a low bandwidth. This has the
added benefit of catching multiple files from different sources under
the one download cap.
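
A sketch of that combination in squid.conf, assuming Squid 2.7 (where
collapsed_forwarding is available); the extension list, connection cap,
and rates are placeholders to adjust:

  # cap any single client IP at a handful of concurrent connections
  acl manyconns maxconn 8
  http_access deny manyconns

  # merge identical concurrent requests into one upstream fetch
  collapsed_forwarding on

  # throttle likely-large object types into one slow shared pool
  acl bigfiles urlpath_regex -i \.(iso|zip|rar|exe|avi|mpg)$
  delay_pools 1
  delay_class 1 1
  delay_parameters 1 64000/64000
  delay_access 1 allow bigfiles
  delay_access 1 deny all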



Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE15
  Current Beta Squid 3.1.0.8 or 3.0.STABLE16-RC1


Re: [squid-users] [Repost] Querying and Extraction from a Squid Cache Directory

2009-06-13 Thread Amos Jeffries

Genaro Flores wrote:
Reposting this in the hope that someone considers it. Even if you don't 
have a definite answer, or the answer is negative, please do give me a 
short reply so that I know the question has been considered by someone. 
Thanks again.


1) It's a weekend.

2) Yes, it's getting to people; if anyone has an answer they will post it.

Amos




Dear List,

I am using the latest stable release of the native NT port of Squid. I
would like to know if there are tools for querying an existing cache
directory structure and for extracting desired original objects sans
headers. I was directed to ufsdump and cossdump on #sq...@freenode.net
but those don't seem to be available with the NT port, and I couldn't find
online documentation for them, so I am also at a loss as to whether or not
they perform the tasks just described and whether NT ports exist. Please
kindly inform me of the existence and state of any such tools, preferably
for NT systems. Links to ufsdump documentation will also be somewhat
helpful.

Thanks in advance.



--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE15
  Current Beta Squid 3.1.0.8 or 3.0.STABLE16-RC1


Re: [squid-users] squid 3.0 stable 14 terminates abnormally

2009-06-13 Thread Amos Jeffries

goody goody wrote:

The subject squid version, running on FreeBSD 7, dies and the following 
message is displayed.

assertion failed: HttpHeader.cc:1196: "Headers[id].type == ftInt64"


After searching the mailing list I found Amos's answer to Wong, advising 
an upgrade to 15 or changes in src/HttpHeader.cc.

Trying Method-1: apply the latest patch.

Now I have downloaded the squid-3.0.STABLE15.patch and changed the pwd to 
the source files from which I had previously installed the STABLE14 
version, but when I apply this patch using the command 
patch < /path/squid-3.0.STABLE15.patch, it successfully applies some 
hunks and then stops and says "Hmm...  The next patch looks like a 
unified diff to me...".

So can anybody tell me what I should do to continue?


Hmm, not sure why that is failing.
The minimal patch on STABLE14 to get the headers going again is:
http://www.squid-cache.org/Versions/v3/3.0/changesets/b9001.patch

It's applied with "patch -p0 < b9001.patch" from the top of the source
tree.

On trying Method-2: changes in src/HttpHeader.cc.

After changing the said line, i.e.

{"Max-Forwards", HDR_MAX_FORWARDS, ftInt}, 
to become

{"Max-Forwards", HDR_MAX_FORWARDS, ftInt64},

I don't know what to do further to make squid pick up the change. Should 
I run "make clean && make && make install" and would that be it?

Thanks in advance.
.Goody.



Yes, that should be sufficient.
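
In full, the rebuild looks something like the sketch below; the source
path is hypothetical, and the restart step depends on how your system
starts squid:

  cd /usr/local/src/squid-3.0.STABLE14   # hypothetical source path
  patch -p0 < b9001.patch                # or edit src/HttpHeader.cc by hand
  make clean && make && make install
  # restart the running squid so the new binary is picked up, e.g.
  squid -k shutdown
  squid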

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE15
  Current Beta Squid 3.1.0.8 or 3.0.STABLE16-RC1


Re: [squid-users] acl maxconn per file or url

2009-06-13 Thread MonzT

Jeff Pang :

You could run another, separate Apache instance for those huge files,
and set "MaxClients" in httpd.conf to a reasonable value.


Thanks Jeff for replying...
but I meant people downloading large files from the internet, sucking up 
as much bandwidth as they can, not files from our internal server.






Re: [squid-users] acl maxconn per file or url

2009-06-13 Thread Jeff Pang

MonzT:

Hi,
I want to know if there is any way to limit the maximum connections per 
FILE or URL, not just per client IP.



You could run another, separate Apache instance for those huge files,
and set "MaxClients" in httpd.conf to a reasonable value.
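
With the prefork MPM that is a fragment like the one below; the number
is a placeholder:

  # hypothetical httpd.conf fragment for the dedicated big-file Apache
  <IfModule prefork.c>
      MaxClients 10
  </IfModule>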


--
Jeff Pang
DingTong Technology
www.dtonenetworks.com


[squid-users] acl maxconn per file or url

2009-06-13 Thread MonzT

Hi,
I want to know if there is any way to limit the maximum connections per 
FILE or URL, not just per client IP.
The usual 'maxconn' ACL limits connections by IP, which is not what we 
want in most cases, because some browsers make multiple connections to 
fetch a whole page, and that doesn't affect the bandwidth since those 
connections are for small objects.
The problem is when someone downloads a 1.5 GB file with 16 connections 
using IDM or FlashGet or something.

I think limiting connections per file or requested URL would be best, so 
I was wondering if there is any way to achieve this, or any hint for 
editing the source code.


PS: I know about delay pools, but I don't want to sacrifice bandwidth; I 
just want users to share the complete bandwidth fairly.
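
For reference, a class-2 delay pool comes close to "share complete
bandwidth fairly" without a flat cap: it splits an aggregate into per-IP
buckets. A sketch, with placeholder rates:

  delay_pools 1
  delay_class 1 2
  # aggregate unlimited (-1/-1); each client IP restored at ~128 KB/s
  delay_parameters 1 -1/-1 128000/128000
  delay_access 1 allow all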


Thanks,
Monz




Re: [squid-users] Firefox road warrior autoconfig

2009-06-13 Thread Jose Ildefonso Camargo Tolosa
Hi!

On 6/14/09, Yan Seiner  wrote:
> This is not really a squid question, but it is related and I can't find any
> info on this.
>
>  How does one go about configuring firefox to autoconfig a proxy when at
> home and not when on the road?
>
>  In other words, for a laptop, how do I use a proxy in a location where it's
> required and not elsewhere without reconfiguring the browser?

If you are the sysadmin for the net that requires the proxy, you could
use proxy autoconfiguration.  This would require a combination of DHCP
and/or DNS (depending on the network configuration).  I used to suffer
from this: I move from client to client, every client has its own proxy
config, and I'm used to reconfiguring the browser for every place I go.
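
Concretely, auto-detection (WPAD) can be fed from DNS, with a host named
wpad in the local domain serving /wpad.dat, or from DHCP option 252. A
sketch for ISC dhcpd, with a placeholder URL:

  # hypothetical dhcpd.conf fragment advertising the PAC URL
  option wpad-url code 252 = text;
  option wpad-url "http://192.0.2.10/wpad.dat";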

>
>  --Yan
>
>  --
>  Yan Seiner
>
>


[squid-users] Firefox road warrior autoconfig

2009-06-13 Thread Yan Seiner
This is not really a squid question, but it is related and I can't find 
any info on this.


How does one go about configuring firefox to autoconfig a proxy when at 
home and not when on the road?


In other words, for a laptop, how do I use a proxy in a location where 
it's required and not elsewhere without reconfiguring the browser?


--Yan

--
Yan Seiner 





[squid-users] compiling squid in windows

2009-06-13 Thread Vicks

Dear friends,

I installed all the applications as given on the page for compiling 
squid on Windows, but I don't know which file is to be executed or how 
to compile the squid sources after that. Can anyone help?

Thanks,

bye





[squid-users] squid 3.0 stable 14 terminates abnormally

2009-06-13 Thread goody goody

The subject squid version, running on FreeBSD 7, dies and the following 
message is displayed.

assertion failed: HttpHeader.cc:1196: "Headers[id].type == ftInt64"


After searching the mailing list I found Amos's answer to Wong, advising 
an upgrade to 15 or changes in src/HttpHeader.cc.

Trying Method-1: apply the latest patch.

Now I have downloaded the squid-3.0.STABLE15.patch and changed the pwd to 
the source files from which I had previously installed the STABLE14 
version, but when I apply this patch using the command 
patch < /path/squid-3.0.STABLE15.patch, it successfully applies some 
hunks and then stops and says "Hmm...  The next patch looks like a 
unified diff to me...".

So can anybody tell me what I should do to continue?

On trying Method-2: changes in src/HttpHeader.cc.

After changing the said line, i.e.

{"Max-Forwards", HDR_MAX_FORWARDS, ftInt}, 
to become
{"Max-Forwards", HDR_MAX_FORWARDS, ftInt64},

I don't know what to do further to make squid pick up the change. Should 
I run "make clean && make && make install" and would that be it?

Thanks in advance.
.Goody.



  


[squid-users] [Repost] Querying and Extraction from a Squid Cache Directory

2009-06-13 Thread Genaro Flores
Reposting this in the hope that someone considers it. Even if you don't 
have a definite answer, or the answer is negative, please do give me a 
short reply so that I know the question has been considered by someone. 
Thanks again.



Dear List,

I am using the latest stable release of the native NT port of Squid. I
would like to know if there are tools for querying an existing cache
directory structure and for extracting desired original objects sans
headers. I was directed to ufsdump and cossdump on #sq...@freenode.net
but those don't seem to be available with the NT port, and I couldn't find
online documentation for them, so I am also at a loss as to whether or not
they perform the tasks just described and whether NT ports exist. Please
kindly inform me of the existence and state of any such tools, preferably
for NT systems. Links to ufsdump documentation will also be somewhat
helpful.

Thanks in advance.