[squid-users] Issue with squid-2.5STABLE7

2005-01-12 Thread Deepa D
Hi,
   I am encountering a weird problem with
squid-2.5STABLE7. Squid is configured to run in
redirector mode with caching disabled completely. The
system works fine for some time, and later, in spite of
the net connection being fine, the following error
message is displayed for all URLs :-
   Unable to determine IP address from host name for
www.google.com 
   The dns server returned : Timeout
   The cache was not able to resolve the url presented
in the URL. Check if the address is correct.

   But at the same time, using host to resolve the URL
works fine. We didn't encounter these problems
with squid-2.5STABLE5 and 6. Is this a known bug in
this version, or have we goofed up some configuration?
   Kindly help me resolve this issue asap.
   Regards and TIA,
 Deepa


   




RE: [squid-users] Issue with squid-2.5STABLE7

2005-01-12 Thread Deepa D
Hi,
  Thanks for the response.
  When running squid under strace, it was showing -1,
EAGAIN (Resource temporarily unavailable). Also, the
host command, which I think uses the same DNS servers
(/etc/resolv.conf on Linux), was resolving the URLs
correctly.
  When we reverted back to squid-2.5.STABLE5, the
problem was resolved, though.
  Kindly let me know what the problem could have been.
  Regards and TIA,
 Deepa
 


-- Elsen Marc [EMAIL PROTECTED] wrote: 
 
  
  
  Hi,
 I am encountering a weird problem with
  squid-2.5STABLE7. Squid is configured to be run in
 the
  redirector mode with caching disabled completely.
 The
  system works fine for sometime and later inspite
 of
  the net connection being file, the following error
 msg
  is displayed for all urls :-
 Unable to determine IP address from host name
 for
  www.google.com 
 The dns server returned : Timeout
 The cache was not able to resolve the url
 presented
  in the URL. Check if the address is correct.
  
 But at the same time using host to resolve the
 url
  is working fine. We didn't encounter these
 problems
  with squid-2.5STABLE5 and 6. Is this a known bug
 in
  this version or have we goofed up some
 configuration?
 Kindly help me resolve this issue asap.
  
 
 Seems like you may have a problem with your specified
 DNS resolver(s), as Squid indicates.
 Post access.log entries for these non-working URLs at
 those time(s). This will probably confirm it.
 
 Make sure the specified DNS resolvers are functional
 at all times.
 
 M.
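
For what it is worth, squid 2.5 normally does its DNS
lookups with its own internal resolver rather than through
the same path the host command uses, so the two can behave
differently even against the same /etc/resolv.conf servers.
A minimal squid.conf sketch for checking that side of
things (directive names are from squid.conf.default; the
second nameserver and the timing values shown are only
placeholders):

# point the internal resolver at explicit nameservers
dns_nameservers 10.10.10.10 10.10.10.11
# be a bit more patient while debugging
dns_retransmit_interval 10 seconds
dns_timeout 2 minutes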
  




Re: [squid-users] cookies with redirector

2004-11-28 Thread Deepa D
Hi All,
Sorry for such a delayed response.
The redirector is only rewriting the URL, but even
so, cookies are not retained while the URL rewriting is
happening.
Kindly tell me what the problem could be.
Regards and TIA,
  Deepa


 --- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
 On Fri, 29 Oct 2004, Deepa D wrote:
 
 I am using squid-2.5STABLE5 with redirectors. I
  have a requirement wherein when the redirector
 returns
  back a different url for squid to service the
 cookies
  received from the browser should also be passed to
  that request.
 
 Redirectors only change the URL, not cookies.
 
 Make sure your redirector do not return a browser
 redirect, it should just 
 rewrite the URL.
 
 Regards
 Henrik
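
To illustrate the rewrite-only behaviour Henrik describes,
a minimal helper might look like the sketch below (a
hypothetical example, not the actual redirector used here;
the blocked host and the replacement URL are only
illustrations). Squid 2.5 hands the helper one request per
line in the form "URL client_ip/fqdn ident method";
answering with a plain rewritten URL, or an empty line for
"no change", makes Squid fetch that URL itself and forward
the browser's original request headers, Cookie included,
whereas sending the browser an HTTP redirect instead makes
it start a fresh request of its own.

#include <stdio.h>
#include <string.h>

int main(void) {
    char line[8192];
    setbuf(stdout, NULL);      /* answers must not sit in a buffer */
    while (fgets(line, sizeof(line), stdin) != NULL) {
        char url[4096] = "";
        /* first whitespace-separated token on the line is the URL */
        if (sscanf(line, "%4095s", url) == 1 &&
            strstr(url, "blocked.example.com") != NULL) {
            /* rewrite: Squid fetches this URL with the original headers */
            fputs("http://localhost:8080/contentfilter/login1.jsp\n", stdout);
        } else {
            fputs("\n", stdout);   /* empty line: leave the URL untouched */
        }
    }
    return 0;
}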
  




Re: [squid-users] cookies with redirector

2004-10-30 Thread Deepa D
Hi,
   Thanks for the reply Henrik.
   Could you kindly elaborate on the difference between
a browser redirect and rewriting the URL alone.
   Regards and TIA,
  Deepa


 -- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
 On Fri, 29 Oct 2004, Deepa D wrote:
 
 I am using squid-2.5STABLE5 with redirectors. I
  have a requirement wherein when the redirector
 returns
  back a different url for squid to service the
 cookies
  received from the browser should also be passed to
  that request.
 
 Redirectors only change the URL, not cookies.
 
 Make sure your redirector do not return a browser
 redirect, it should just 
 rewrite the URL.
 
 Regards
 Henrik
  




[squid-users] redirector exit

2004-10-29 Thread Deepa D
Hi All,
   I am using Squid-2.5STABLE5. I need a clarification
regarding one scenario - what happens to a request
that squid has assigned to a redirector when the
redirector exits for some reason? Will squid process
it as if a '\n' had been returned by the redirector,
will the request be lost, or will squid assign the
same request to another redirector?
   Could someone kindly help me understand this
scenario better.
   Regards and TIA,
 Deepa

 




[squid-users] cookies with redirector

2004-10-29 Thread Deepa D
Hi All,
I am using squid-2.5STABLE5 with redirectors. I
have a requirement wherein, when the redirector returns
a different URL for squid to service, the cookies
received from the browser should also be passed on
with that request.
This doesn't seem to be happening now. Could
somebody kindly tell me how to do this.
Regards and TIA,
   Deepa






RE: [squid-users] bypass redirector for some urls

2004-10-25 Thread Deepa D
Hi,
   Thanks for the response.
   I am using squid as a transparent, non-caching proxy
server. I could configure squid.conf to use
redirector_access with dstdomain ACLs listed in a file,
or with the domains mentioned in the config file itself,
but I am unable to do the same for url_regex. I am
pasting a sample below :-
   acl nr_urls url_regex ^http://www\.rediff\.com$
   acl nr_urls1 url_regex ^http://www\.yahoo\.com$
   redirector_access deny nr_urls nr_urls1
   
   Could somebody kindly tell me what is wrong with my
configuration - is the URL regex pattern wrong? If
so, kindly mail me a sample.
   Secondly, kindly advise whether matching on
dstdomain is safer or whether specifying the URL
pattern is better.
   Kindly mail back asap.
   Regards and TIA,
  Deepa


 --- Elsen Marc [EMAIL PROTECTED] wrote: 
 
  
  
  Hi All,
 I am using SQUID-2.5STABLE5 and have a
 requirement
  to bypass redirector for a few urls only. Could
  someone kindly tell me how to do this asap.
  
  Check the 'redirector_access' directive in
 squid.conf.default
  (and comments).
 
  M.
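
One possible reason the url_regex patterns above never
match is the trailing '$' anchor: the URL Squid checks
normally carries at least a trailing slash or a path after
the host name, so '^http://www\.rediff\.com$' matches
nothing. A sketch of both alternatives, using the same two
sites purely as examples:

# dstdomain is usually the simpler choice for whole-site matches
acl nr_sites dstdomain www.rediff.com www.yahoo.com
redirector_access deny nr_sites

# or, if a regex is really wanted, anchor on the slash instead of '$'
# acl nr_urls url_regex ^http://www\.rediff\.com/
# redirector_access deny nr_urls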
  




[squid-users] bypass redirector for some urls

2004-10-21 Thread Deepa D
Hi All,
   I am using SQUID-2.5STABLE5 and have a requirement
to bypass the redirector for a few URLs only. Could
someone kindly tell me how to do this asap.
   Regards and TIA,
  Deepa





Re: [squid-users] Zombie sockets getting created

2004-10-08 Thread Deepa D
Hi,
   Thanks for the reply.
   Could you kindly elaborate as to what this cron job
would do.
   Regards and TIA,
 Deepa


 --- Shardul Adhikari [EMAIL PROTECTED]
wrote: 
 HI 
 You can try this 
 May be you can put in a cron or so 
 ps -auxwww |grep squid  |grep -v grep  | awk '{print
 $2}'|xargs kill 
 
 On Thu, 7 Oct 2004 15:18:07 +0100 (BST), Deepa D
 [EMAIL PROTECTED] wrote:
  Hi All,
 We are encountering a wierd issue with
  Squid-2.5STABLE5.  After some time, more than 7 to
 8
  hrs we see that the number of redirector clients
 are
  increasing( and this is more or less a sudden
  development) and most of them are not associated
 with
  any process. That is, we see lots of zombie
 sockets
  getting created. Because of this, squid fails to
  service requests after a while and only a restart
 of
  squid seems to solve this problem.
 Squid is being used as a transparent proxy with
  caching disabled and the redirector count is set
 to 5.
  But that count increased to as much as 64.
 The system configuration where squid is
 installed
  is as follows :-
 Linux 2.4.18-14 GNU/Linux
 Intel(R) Pentium(R) 4 CPU 2.40GHz
 RAM 512 MB
 Kindly tell me what the problem could be and
 plz
  suggest a solution as well.
 Regards and TIA,
   Deepa
  
  
 


 
  




Re: [squid-users] Zombie sockets getting created

2004-10-08 Thread Deepa D
Thanks for the reply, Henrik. You are right, that
is not what I was looking for. What purpose would that
have served?
As per your earlier suggestions, I checked my
code to see whether EOF was not being handled properly,
but everything seems to be fine at that end. Also, -k
rotate and -k reconfigure are not being used as of
now.
Could you kindly suggest any other way in which
such a situation could arise.
   Regards and TIA,
Deepa


 --- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
 On Fri, 8 Oct 2004, Deepa D wrote:
 
Could you kindly elaborate as to what this cron
 job
  would do.
 
  ps -auxwww |grep squid  |grep -v grep  | awk
 '{print
  $2}'|xargs kill
 
 This would kill Squid and all Squid owned processes
 each time run. Better 
 expressed as killall squid; pkill -u squid.
 
 Not sure this is what you are looking for.
 
 Regards
 Henrik
  




[squid-users] Zombie sockets getting created

2004-10-07 Thread Deepa D
Hi All,
   We are encountering a weird issue with
Squid-2.5STABLE5. After some time, more than 7 to 8
hours, we see that the number of redirector clients is
increasing (and this is more or less a sudden
development) and most of them are not associated with
any process. That is, we see lots of zombie sockets
getting created. Because of this, squid fails to
service requests after a while, and only a restart of
squid seems to solve the problem.
   Squid is being used as a transparent proxy with
caching disabled and the redirector count is set to 5.
But that count increased to as much as 64. 
   The system configuration where squid is installed
is as follows :-
   Linux 2.4.18-14 GNU/Linux
   Intel(R) Pentium(R) 4 CPU 2.40GHz
   RAM 512 MB
   Kindly tell me what the problem could be and plz
suggest a solution as well.
   Regards and TIA,
 Deepa

   
   


   
   




Re: [squid-users] Zombie sockets getting created

2004-10-07 Thread Deepa D
Hi,
   Thanks for the reply Henrik.
   I will try to explain the problem better :-
   Squid is running with our redirectors and the count
mentioned in the conf file is 5. It works fine for
quite some time, and later we notice that the number of
redirectors has increased to 64. When we run the
netstat -p command, we notice that the PID/program name
column for most of the redirectors is listed as
'-', implying that they are not associated with any
process.
After this squid fails to service any requests - no
error response is reported to the browser either, but
all the requests are stuck. As we had disabled all
squid logging, I am unable to give you any log-related
info. I have enabled cache logging now, though.
   Our redirector is a TCP client that connects to a
TCP server running locally and passes the URL
information to it. The server in turn processes the URL
and returns a response saying whether the URL should
be serviced or not, and this info is in turn handed
over to squid.
   Kindly elaborate on the redirector being broken.
Please tell me what could be happening here.
   Please help me understand the problem and find a
solution.
   Regards and TIA,
 Deepa
   

 --- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
 On Thu, 7 Oct 2004, Deepa D wrote:
 
We are encountering a wierd issue with
  Squid-2.5STABLE5.  After some time, more than 7 to
 8
  hrs we see that the number of redirector clients
 are
  increasing( and this is more or less a sudden
  development) and most of them are not associated
 with
  any process. That is, we see lots of zombie
 sockets
  getting created.
 
 Can you explain with other words what you are
 seeing? I am not sure I 
 understand what you try to say here..
 
 What is number of redirector clients?
 
 What is a zombie socket?
 
 
 If you see that the number of redirector children to
 Squid increases and 
 that their parent process is not Squid then your
 redirector is broken.
 
  Because of this, squid fails to
  service requests after a while and only a restart
 of
  squid seems to solve this problem.
 
 Is there anything relevant said in cache.log?
 
 Regards
 Henrik
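
If the redirector is a long-running stdin/stdout helper,
one part worth double-checking is its exit path, so a
skeleton of it is sketched below (a hypothetical helper,
not the real cf_client): when Squid closes the helper's
stdin, fgets() returns NULL and the process has to close
its own TCP connection and exit; a helper that keeps
running past that point is one possible way stray
redirector processes and sockets can accumulate.

#include <stdio.h>
#include <unistd.h>

int main(void) {
    char line[8192];
    int server_fd = -1;    /* connection to the local filtering server */
    /* ... connect to the local TCP server here, filling in server_fd ... */
    setbuf(stdout, NULL);
    while (fgets(line, sizeof(line), stdin) != NULL) {
        /* forward the line to server_fd, read the verdict, then answer;
           this sketch simply leaves every URL unchanged */
        fputs("\n", stdout);
    }
    /* EOF from Squid: clean up and leave instead of lingering */
    if (server_fd >= 0)
        close(server_fd);
    return 0;
}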
  




Re: [squid-users] Re: Squid in no cache mode

2004-06-20 Thread Deepa D
Hi, 
   Thanks for the reply.
   The h/w and system config is as follows :-
x86 Intel processor: 1.7 GHz
Memory : 256 MB DDR RAM
Swap : 512 MB
800 MHz FSB
HDD : 40GB
RedHat Linux 8, Kernel : 2.4.18-14
Squid: 2.5.STABLE5 with redirector
Configure options : --enable-storeio=null
--enable-linux-netfilter
We have a web server with many applications and an
email client, besides squid, running on the system. The
redirector communicates with a remote server / local
cache and determines the category of the URL, which in
turn is checked against a local database before the
response is returned to squid. After analysing many
tests' results, we have come to the conclusion that
the redirector is consuming around 16% of the total
response time. I am unable to attach the squid.conf
file. The main options that I have changed are as
follows :-
   cache_dir null /usr/local/squid/var
   redirect_program
/usr/local/proj/cfilter/cfserver/cf_client
   redirect_children 5
   redirect_rewrites_host_header off
   half_closed_clients off
   httpd_accel_host virtual
   httpd_accel_port 80
   httpd_accel_with_proxy on
   httpd_accel_uses_host_header on
   logfile_rotate 0
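   The 16% figure above can be cross-checked against the
cachemgr "redirector" page, which reports per-helper
request counts and average service times (a sketch only,
assuming a default source install; adjust the path, port
and page name for your build):

   /usr/local/squid/bin/squidclient -p 3128 mgr:redirector
   /usr/local/squid/bin/squidclient -p 3128 mgr:info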
   

   Kindly suggest some measures to improve the
performance.  Thanks for your time.
Regards and TIA,
   Deepa

   
   
   



--- Adam Aube [EMAIL PROTECTED] wrote:  Deepa D
wrote:
 
  I am running squid-2.5.STABLE5 in the no cache
 mode.
 
  I have done some tests and find that the response
 times are quite
  high(averaging to about 16secs per request but
 goes upto even 1.5
  mins in a few cases). I have logged the squid
 times from the
  access.log file and notice that squid's times are
 the
  major contributor to this.
 
  Could you kindly suggest some measures that I
 could use to improve
  squid's performance in the no cache mode.
 
 Could you give us some info on your hardware, your
 system setup, and your
 squid.conf (without blank lines or comments)?
 
 Adam
  




Re: [squid-users] Help needed wrt testing

2004-06-11 Thread Deepa D
Hi,
   Kindly clarify this doubt for me :-
   Is the time difference between the two times
entered in the access.log file the actual or the
approximate time taken by squid to process that
request? If not, then how can I compute the time taken
by squid to process a request once it has received the
response back from the redirector - in debug mode,
what are the keywords that I could look for?
Kindly mail back asap.
Regards and TIA,
Deepa
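
In Squid's native access.log format there is only one
timestamp per line; the second field is the elapsed time
in milliseconds that Squid took for the whole transaction,
so per-request times can be pulled straight from the log.
A sketch, assuming the default log format and the log path
used elsewhere in this setup:

awk '{ print $2 " ms  " $7 }' /usr/local/squid/var/logs/access.log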
 

 --- Deepa D [EMAIL PROTECTED] wrote:  Hi,
I need to know the time taken by squid to service
 each url request. I am running squid in the no
 cache
 mode and hence the cache related statistics is not
 required. I am using redirectors with squid. I can
 get
 the details for the redirectors but not for squid.
 Could somebody kindly suggest another tool besides
 squidtimes that could give me the detailed time
 stats
 rather than the averages.
 Regards and TIA,
 Deepa
 
 
 --- Deepa D [EMAIL PROTECTED] wrote:  Hi,
 I want to use wget command in spider mode to
  test
  only squid(without redirector). I tried it with
 the
  proxy=on feature but I am getting connection
 refused
  for each url request.
 Could somebody kindly tell me how to go abt
  conducting this test.
 Regards and TIA,
Deepa
   
  
 


 






[squid-users] Help needed wrt testing

2004-06-10 Thread Deepa D
Hi,
   I want to use the wget command in spider mode to test
only squid (without the redirector). I tried it with the
proxy=on feature, but I am getting connection refused
for each URL request.
   Could somebody kindly tell me how to go about
conducting this test.
   Regards and TIA,
  Deepa
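
wget is normally pointed at a proxy either through the
http_proxy environment variable or with -e; a sketch
(the proxy address, port and test URL are placeholders,
and --spider only checks that the page is reachable):

http_proxy=http://127.0.0.1:3128/ wget --spider -nv http://www.example.com/
# or, equivalently
wget -e use_proxy=yes -e http_proxy=http://127.0.0.1:3128/ --spider http://www.example.com/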
 




Re: [squid-users] Help needed wrt testing

2004-06-10 Thread Deepa D
Hi,
   I need to know the time taken by squid to service
each URL request. I am running squid in the no-cache
mode, and hence the cache-related statistics are not
required. I am using redirectors with squid. I can get
the details for the redirectors but not for squid.
Could somebody kindly suggest another tool besides
squidtimes that could give me detailed time stats
rather than averages.
Regards and TIA,
Deepa


--- Deepa D [EMAIL PROTECTED] wrote:  Hi,
I want to use wget command in spider mode to
 test
 only squid(without redirector). I tried it with the
 proxy=on feature but I am getting connection refused
 for each url request.
Could somebody kindly tell me how to go abt
 conducting this test.
Regards and TIA,
   Deepa
  
 






RE: [squid-users] FATAL: logfileWrite

2004-06-02 Thread Deepa D
 Note: forwarded message attached. 


---BeginMessage---
Hi,
   I would appreciate it if anybody could tell me why
such an error is being caused by the code change that
I have made, as I don't get this FATAL error every time
I run squid. What does the logfileWrite fatal error
mean?
   Regards and TIA,
   Deepa

--- Elsen Marc [EMAIL PROTECTED] wrote:  
  
  
  Hi,
 I made the following changes to helper.c's
 Enqueue
  method :-
  //if (hlp->stats.queue_size > hlp->n_running * 2)
  if (hlp->stats.queue_size > hlp->n_running * 4)
      fatalf("Too many queued %s requests (%d on %d)",
          hlp->id_name, hlp->stats.queue_size, hlp->n_running);
 Now,I am getting the following errors while
 running
  squid :-
  
  2004/06/02 10:47:25| WARNING: All redirector
 processes
  are busy.
  2004/06/02 10:47:25| WARNING: up to 2 pending
 requests
  queued
  2004/06/02 10:47:25| Consider increasing the
 number of
  redirector processes to at least 4 in your config
  file.
  2004/06/02 10:47:29| comm_write:
 fd_table[7].rwstate
  != NULL
  2004/06/02 10:47:34| comm_write:
 fd_table[6].rwstate
  != NULL
  2004/06/02 10:54:39| WARNING: All redirector
 processes
  are busy.
  2004/06/02 10:54:39| WARNING: up to 7 pending
 requests
  queued
  2004/06/02 10:54:39| Consider increasing the
 number of
  redirector processes to at least 9 in your config
  file.
  2004/06/02 10:54:55| storeDirWriteCleanLogs:
  Starting...
  2004/06/02 10:54:55|   Finished.  Wrote 0 entries.
  2004/06/02 10:54:55|   Took 0.0 seconds (   0.0
  entries/sec).
  FATAL: logfileWrite:
  /usr/local/squid/var/logs/access.log: (0) Success
  
  Squid Cache (Version 2.5.STABLE5): Terminated
  abnormally.
  CPU Usage: 2.410 seconds = 1.287 user + 1.123 sys
  Maximum Resident Size: 0 KB
  Page faults with physical i/o: 346
  Memory usage for squid via mallinfo():
  total space in arena:2304 KB
  Ordinary blocks: 2290 KB 84
 blks
  Small blocks:   0 KB  0
 blks
  Holding blocks:   192 KB  1
 blks
  Free Small blocks:  0 KB
  Free Ordinary blocks:  13 KB
  Total in use:2482 KB 108%
  Total free:13 KB 1%
  2004/06/02 10:54:58| Starting Squid Cache version
  2.5.STABLE5 for i686-pc-linux-gnu...
  2004/06/02 10:54:58| Process ID 21729
  2004/06/02 10:54:58| With 1024 file descriptors
  available
  
Please tell me if the code changes that I made
 is in
  any way responsible for this error. If no, then
 kindly
  tell me what what the problem could be. 
  
  
  Your question is remarkable because at the
 beginning of your post you state:
  'now I am getting the following errors when running
 squid'
 
  Doesn't that include the answer to your question
 automatically?
 
  M. 


---End Message---


Re: [squid-users] Too many queued redirector requests

2004-06-01 Thread Deepa D
Hi,
   Thanks for the response, Hendrik.
   I looked into helper.c, and in the Enqueue and
StatefulEnqueue methods I noticed that a fatal error
is thrown when the helper's queue size is
more than twice the number of running redirectors.
   Firstly, I want to know whether I could change this to
a multiple of 3 or 4. Will it affect the functioning
anywhere else? Secondly, what corrective measure is
being applied by restarting squid in the case of such
fatal errors?
   Please help me understand this better.
   Regards and TIA,
   Deepa
  

--- Hendrik_Voigtländer [EMAIL PROTECTED]
wrote:  Hi
 
 Cache takes a considreable amt of memory which
 our
  system cannot afford, hence the choice of disbling
 the
  cache.
 
 Probably it is a better idea to get one or more of
 cheap intel-based 
 machines, 1GB should be affordable with this
 machines.
 
 The redirector that I am using is a self made
 one
  for content filtering. It communicates with a
 remote
  server to obtain the details of the content of the
  url. This communication alone could take a max of
  2000msecs.
 
 We have a servicetimes something like 300ms when
 under load, hit service 
 time is about 5ms. Adam Aube told me:
 Anything under ~ 1 seconds is probably fine for
 misses, and even up to 
 2 seconds depending on congestion and latency on
 your link.
 In your case the servicetime will be more than 2secs
 as you still need 
 to fetch that object after passing the redirector.
 That is awfully long.
 
  I cannot avoid this though. What is the average
 time
  for a redirector?
 
 Not sure about that as squidguards processing time
 is not measurable on 
 my machine: 0ms
 I would say that less than 10ms is very good, less
 than 100ms acceptable 
 - just a wild guess.
 
 What is the maximum queue length for the
  redirectors ? Is this a configurable parameter?
 
 No idea.
 
  I have
  seen FATAL errors being thrown for 22 on 10
  redirectors. Isn't this a very small number?
 
 It depends on the number of requests, not the
 clients. Ebay is a good 
 example for a site which use a lot of small objects,
 thus causing 
 clients to issue a lot of small requests in a short
 amount of time.
 
 Plz tell me if I can change this parameter some
  place and correct the problem. 
 Regards and TIA,
Deepa
 
 No idea either, but I think that would be turning
 the wrong knob anyway.
 IMHO you need to improve that processing time. Can
 you implement some 
 sort of caching?
 
 I dont know how perfect your remote content filter
 is and how crtitical 
 bypassed request are, but this may be another
 approach: Enable bypass 
 and increase the no of redirectors to find the
 minimum percentage of 
 bypassed requests.
 
 Regards, Hendrik Voigtländer 




[squid-users] FATAL: logfileWrite

2004-06-01 Thread Deepa D
Hi,
   I made the following changes to helper.c's Enqueue
method :-
   //if (hlp->stats.queue_size > hlp->n_running * 2)
   if (hlp->stats.queue_size > hlp->n_running * 4)
       fatalf("Too many queued %s requests (%d on %d)",
           hlp->id_name, hlp->stats.queue_size, hlp->n_running);
   Now, I am getting the following errors while running
squid :-

2004/06/02 10:47:25| WARNING: All redirector processes
are busy.
2004/06/02 10:47:25| WARNING: up to 2 pending requests
queued
2004/06/02 10:47:25| Consider increasing the number of
redirector processes to at least 4 in your config
file.
2004/06/02 10:47:29| comm_write: fd_table[7].rwstate
!= NULL
2004/06/02 10:47:34| comm_write: fd_table[6].rwstate
!= NULL
2004/06/02 10:54:39| WARNING: All redirector processes
are busy.
2004/06/02 10:54:39| WARNING: up to 7 pending requests
queued
2004/06/02 10:54:39| Consider increasing the number of
redirector processes to at least 9 in your config
file.
2004/06/02 10:54:55| storeDirWriteCleanLogs:
Starting...
2004/06/02 10:54:55|   Finished.  Wrote 0 entries.
2004/06/02 10:54:55|   Took 0.0 seconds (   0.0
entries/sec).
FATAL: logfileWrite:
/usr/local/squid/var/logs/access.log: (0) Success

Squid Cache (Version 2.5.STABLE5): Terminated
abnormally.
CPU Usage: 2.410 seconds = 1.287 user + 1.123 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 346
Memory usage for squid via mallinfo():
total space in arena:2304 KB
Ordinary blocks: 2290 KB 84 blks
Small blocks:   0 KB  0 blks
Holding blocks:   192 KB  1 blks
Free Small blocks:  0 KB
Free Ordinary blocks:  13 KB
Total in use:2482 KB 108%
Total free:13 KB 1%
2004/06/02 10:54:58| Starting Squid Cache version
2.5.STABLE5 for i686-pc-linux-gnu...
2004/06/02 10:54:58| Process ID 21729
2004/06/02 10:54:58| With 1024 file descriptors
available

  Please tell me if the code changes that I made are in
any way responsible for this error. If not, then kindly
tell me what the problem could be.

  Regards and TIA,
  Deepa





Re: [squid-users] Too many queued redirector requests

2004-05-31 Thread Deepa D
Hi,
   Thanks for the response.
   The cache takes a considerable amount of memory, which
our system cannot afford, hence the choice of disabling
the cache.
   The redirector that I am using is a self-made one
for content filtering. It communicates with a remote
server to obtain the details of the content of the
URL. This communication alone could take a maximum of
2000 msecs.
I cannot avoid this, though. What is the average time
for a redirector?
   What is the maximum queue length for the
redirectors? Is this a configurable parameter? I have
seen FATAL errors being thrown for 22 on 10
redirectors. Isn't this a very small number?
   Please tell me if I can change this parameter
somewhere and correct the problem.
   Regards and TIA,
  Deepa


--- Hendrik_Voigtländer [EMAIL PROTECTED]
wrote:  Hi,
 
  I have been following this thread of mails and
 I
  have a problem with the number of redirectors too.
  I am using squid-2.5.STABLE5 configured to
 work
  with caching disabled.
 Just curious: Any reason to disable caching?
 
I am using a redirector that
  does some url filtering using a local database.
 
 What is the purpose? Filtering offending content?
 Selfmade redirector? What kind of database?
 
  I have
  timed the redirector - it takes anywhere between
 42
  msecs to 2000 msecs at times to process. I am
 running
  squid on a system with the configuration -
  Linux-2.4.18-14, P4, 512 MB RAM. We have other
  applications like a web server, etc that are also
  running on this system.
 
 2000msecs? Thats a lot...
 
  You had mentioned in ur earlier mail that ur
  system is so configured that 80% of the requests
 are
  handled by the first redirector, 10% by the second
 and
  so on. Could you kindly elaborate as to how
 this
  done - or is it the way squid works? If this is
 the
  case, then adding more redirectors shd not solve
 my
  problem. 
 
 The number of requests handled by each redirector as
 described in the 
 posts is more a phenomenon, not a configuration
 issue.
 - Squidguard is very fast and IMHO the request are
 not distributed round 
 robin, but to the first idle redirector
 - light load - the first redirector is always ready
 to serve the 
 request, all the others have never a chance to
 answer.
 - increased load - sometime the first redirector is
 busy, some requests 
 are answered by the 2nd redirector
 - even higher load -  sometime both 1st and 2nd
 redirectors are busy, 
 the 3th gets a chance to do some work - and so on
 with the others if 
 load increases.
 - load peak - all redirectors get something to do,
 some request are 
 queued and the warning message occurs, but never a
 FATAL error as the 
 queue lenght never gets high (long?) enough.
 
 I think your problem is different as your
 redirectors are quite slow, 
 the queues are growing to long which causes the
 fatal error.
 Increasing the number of redirectors should help a
 bit in your case, but 
   probably the database is to slow to handle all the
 requests. If the 
 database is the problem, increasing the no of
 redirector means 
 increasing the processing time as well - this means
 the number of 
 request which can be processed in a given amount of
 time will not 
 increase much.
 
  I tried conducting some simple load tests with
  squid using the redirectors. 
 1 redirector worked fine for 5 simultaneous
 browser
  clients(that is w/o throwing a FATAL error and
  restarting), 
 2 redirectors worked fine for 14 browser
 clients
but subsequent tests showed that even with 5
  redirector clients,20 browser clients cud not be
  handled simultaneously. I don't want to enable
  redirctor bypass though.
 I am failing to understand this behaviour. I
 would
  be thankful if u cud spare some time to explain
 what
  cud be happening here and tell me a solution for
 it.
 Regards and TIA,
 Deepa
 
  
   --- Hendrik_Voigtländer
 [EMAIL PROTECTED]
  wrote:  Hi,
  
 E250 with how many processor of what type?
 Probably you have an performance problem with the
 sleepycat berkeley-db.
 If all your processors are busy all the time
 increasing the number of 
 redirectors won't help.
 
 In this case I would switch over to Linux with an
 Intel machine. We have 
 replaced our old E450 with an HP/Compaq ML370
 (decent machine, but not 
 high end) with a significant improve in squid
 perfomance. I gave up on 
 compiling squidguard on solaris, to much hassle
 and
 to much load 
 probably as well for the 168MHz(!) processors.
 With debian no problem at all, 'apt-get install
 squidguard' and done...
 
 I really like the idea of using multiple cheap
 machine with 
 loadbalancing and failover, but IMHO you need to
 use
 automatic proxy 
 configuration to achieve this. I would use server
 hardware for this, but 
 something cheaper than HP/Compaq, for instance
 Supermicro.
 
 Get cachemgr.cgi running, it is really useful to
 look at squid  
 squidguards status.
 
 #  TAG: redirector_bypass
 #   When 

Re: [squid-users] Too many queued redirector requests

2004-05-30 Thread Deepa D
Hi Hendrik,
I have been following this thread of mails and I
have a problem with the number of redirectors too.
I am using squid-2.5.STABLE5 configured to work
with caching disabled. I am using a redirector that
does some URL filtering using a local database. I have
timed the redirector - it takes anywhere between 42
msecs and 2000 msecs at times to process. I am running
squid on a system with the configuration
Linux-2.4.18-14, P4, 512 MB RAM. We have other
applications, like a web server, that are also
running on this system.
You had mentioned in your earlier mail that your
system is so configured that 80% of the requests are
handled by the first redirector, 10% by the second and
so on. Could you kindly elaborate on how this is
done - or is it just the way squid works? If this is
the case, then adding more redirectors should not solve
my problem.
I tried conducting some simple load tests with
squid using the redirectors.
   1 redirector worked fine for 5 simultaneous browser
clients (that is, without throwing a FATAL error and
restarting),
   2 redirectors worked fine for 14 browser clients,
but subsequent tests showed that even with 5
redirector clients, 20 browser clients could not be
handled simultaneously. I don't want to enable
redirector bypass, though.
   I am failing to understand this behaviour. I would
be thankful if you could spare some time to explain what
could be happening here and tell me a solution for it.
   Regards and TIA,
   Deepa
   

 --- Hendrik_Voigtländer [EMAIL PROTECTED]
wrote:  Hi,
 
 E250 with how many processor of what type?
 Probably you have an performance problem with the
 sleepycat berkeley-db.
 If all your processors are busy all the time
 increasing the number of 
 redirectors won't help.
 
 In this case I would switch over to Linux with an
 Intel machine. We have 
 replaced our old E450 with an HP/Compaq ML370
 (decent machine, but not 
 high end) with a significant improve in squid
 perfomance. I gave up on 
 compiling squidguard on solaris, to much hassle and
 to much load 
 probably as well for the 168MHz(!) processors.
 With debian no problem at all, 'apt-get install
 squidguard' and done...
 
 I really like the idea of using multiple cheap
 machine with 
 loadbalancing and failover, but IMHO you need to use
 automatic proxy 
 configuration to achieve this. I would use server
 hardware for this, but 
 something cheaper than HP/Compaq, for instance
 Supermicro.
 
 Get cachemgr.cgi running, it is really useful to
 look at squid  
 squidguards status.
 
 #  TAG: redirector_bypass
 #   When this is 'on', a request will not go
 through the
 #   redirector if all redirectors are busy.  If
 this is 'off'
 #   and the redirector queue grows too large,
 Squid will exit
 #   with a FATAL error and ask you to increase
 the number of
 #   redirectors.  You should only enable this if
 the redirectors
 #   are not critical to your caching system.  If
 you use
 #   redirectors for access control, and you
 enable this option,
 #   then users may have access to pages that
 they should not
 #   be allowed to request.
 #
 redirector_bypass on
 
 As you may have noticed it is impossible to filter
 100% of all unwanted 
 stuff, bypassing in high load situations won't make
 things much worse.
 
 Keep an eye on the redirector stats in cachemgr how
 many requests are 
 actually bypassing squidguard. In our setup it is
 less than 1%.
 
 Regards, Hendrik.
 
 
 Merid Tilahun wrote:
  Thanx Hendrik
  I am running squid on solaris 8, sun enterprise
 250
  machine. I have more that 500 users connect at
 peak
  hour. 
  I never got around to configure cachemanager.cgi,
 I
  will look in to that.
  I use squidguard to filter porn, and it seems to
 be
  working but it is affecting the servicetime badly.
 I
  run around 20 redirector processes, I have been
  increasing constantly.
  I deactivated squidguard for a while and I got not
  messages, but I need squidGuard to block the porn.
  What is redirector bypass and how do i enable it?
  
  --- Hendrik Voigtlaender
 [EMAIL PROTECTED]
  wrote:
  
 Hi,
 
 I just checked my logs:
 We get 40...50req/sec for about 10 hours a day
 with
 nearly no traffic 
 during the night, this levels out to a daily
 average
 of 1000req/min.
 Number of users are reported with about
 1000...1500,
 but I guess that we 
 have never more than 150 clients connect at the
 same
 time.
 We have 15 redirectors running, average redirector
 service time is not 
 measurable, servicetimes are good. In cachemgr.cgi
 you can check how 
 many request are handled by each redirector.
 In our setup, more than 80% of the requests are
 handled by the first 
 one. The second gets about 10%, all the other are
 used only during peak 
 periods.  Redirectors #15 is idle nearly the whole
 time.
 In top only a a few (2...3) squidguards are
 showing
 enough activity to 
 be noticed, but not much. Machine is Intel 2,4GHz,
 2x36GB SCSI 
 cache_dir, 2GB RAM, 

[squid-users] Optimum squid clients

2004-05-28 Thread Deepa D
Hi,
   I am conducting some simple load tests on
squid-2.5.STABLE5 using my redirector program, which
does some URL filtering that involves interacting
with a local database. I have observed that squid
works fine with one redirector and 5 simultaneous
client requests. Are there some available diagnostics
that I could refer to, to check whether my test
results are approximately correct?
   Secondly, is there some kind of formula or data
that I could use to calculate how many redirector
clients can handle how many simultaneous browser
requests.
   Regards and TIA,
  Deepa
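
As a rough back-of-the-envelope sketch using the figures
quoted earlier in these threads (and ignoring Squid's own
overhead): if one redirector call can take up to 2000
msecs, a single helper finishes at most about 0.5 lookups
per second in that worst case, so N redirect_children can
absorb only roughly N/2 such requests per second before
the queue starts to grow; and once the queue exceeds twice
the number of running helpers, Squid aborts with the "Too
many queued" fatal error. With redirect_children 5 that
is a ceiling of about 2-3 worst-case requests per second
and a queue limit of 10.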
  




Re: [squid-users] Re: Plz help... redirector problem

2004-05-18 Thread Deepa D
Hi,
   Thanks for the response. Our vendor has confirmed
that the loopback interface is not firewalled, nor
is there an implicit rule for it.
   I have a doubt here - if it were firewalled,
wouldn't it have caused a problem right at the
beginning? Why is this problem occurring after random
intervals of time? Kindly clarify this doubt.
   I set the debug levels to 50,0 50,3 54,0 54,1 to
trace whether ipcCreate was failing, but no error
messages appear there. Please let me know if there
could be any other possible causes for the error
message - WARNING: 'Cannot run' redirector client.

   Regards and TIA,
  Deepa


 --- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
On Thu, 13 May 2004, Deepa D wrote:
 
  We have a firewall running but there is no policy
  that has been added for the loopback interface.
 
 Make sure that there is no implicit rule which block
 traffic on the
 loopback interface on your server. Communication
 must be allowed on the
 loopback interface for Squid to work properly.
 
  Squid is being started as root and the the
  cache_effective_user is the default user 'nobody'.
 
 Ok,
 
  Changing it to 'root' threw a fatal error.
 
 For good reasons.
 
  Kindly elaborate on what the problem could be.
 
 Try manually starting the helper as your
 cache_effective_user. man su if
 you do not know how.
 
 Regards
 Henrik
  




[squid-users] Plz help... redirector problem

2004-05-13 Thread Deepa D
Hi,
   The redirector code is a simple TCP client that
connects to a local TCP server and passes on the data
that it receives from squid. It was working fine for
the past few days, but now suddenly I am getting the
following messages in cache.log even though the local
TCP server is running fine :-
   WARNING: redirector #3 (FD 8) exited
2004/05/13 14:53:24| WARNING: redirector #4 (FD 9)
exited
2004/05/13 14:53:24| WARNING: redirector #5 (FD 10)
exited
2004/05/13 14:53:24| WARNING: redirector #6 (FD 11)
exited
2004/05/13 14:53:25| WARNING: redirector #2 (FD 7)
exited
2004/05/13 14:53:25| Too few redirector processes are
running2004/05/13 14:53:25| Starting new helpers
2004/05/13 14:53:25| helperOpenServers: Starting 10
'cf_client' processes
2004/05/13 14:53:25| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:25| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:25| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:25| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:25| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:25| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:25| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:25| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:25| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:25| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:25| WARNING: redirector #7 (FD 12)
exited
2004/05/13 14:53:25| Too few redirector processes are
running2004/05/13 14:53:25| storeDirWriteCleanLogs:
Starting...
2004/05/13 14:53:25|   Finished.  Wrote 0 entries.
2004/05/13 14:53:25|   Took 0.0 seconds (   0.0
entries/sec).
FATAL: The redirector helpers are crashing too
rapidly, need help!

Squid Cache (Version 2.5.STABLE5): Terminated
abnormally.
CPU Usage: 0.039 seconds = 0.016 user + 0.023 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 346
Memory usage for squid via mallinfo():
total space in arena:2204 KB
Ordinary blocks: 2189 KB 14 blks
Small blocks:   0 KB  0 blks
Holding blocks:   192 KB  1 blks
Free Small blocks:  0 KB
Free Ordinary blocks:  14 KB
Total in use:2381 KB 108%
Total free:14 KB 1%
2004/05/13 14:53:28| Starting Squid Cache version
2.5.STABLE5 for i686-pc-linux-gnu...
2004/05/13 14:53:28| Process ID 15872
2004/05/13 14:53:28| With 1024 file descriptors
available
2004/05/13 14:53:28| Performing DNS Tests...
2004/05/13 14:53:28| Successful DNS name lookup
tests...
2004/05/13 14:53:28| DNS Socket created at 0.0.0.0,
port 32779, FD 4
2004/05/13 14:53:28| Adding nameserver 10.10.10.10
from /etc/resolv.conf
2004/05/13 14:53:28| helperOpenServers: Starting 10
'cf_client' processes
2004/05/13 14:53:28| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:28| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:28| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:28| WARNING: Cannot run
'/usr/local/xnap/cfilter/cfclient/cf_client' process.
2004/05/13 14:53:28| Unlinkd pipe opened on FD 15
2004/05/13 14:53:28| Swap maxSize 0 KB, estimated 0
objects
2004/05/13 14:53:28| Target number of buckets: 0
2004/05/13 14:53:28| Using 8192 Store buckets
2004/05/13 14:53:28| Max Mem  size: 8192 KB
2004/05/13 14:53:28| Max Swap size: 0 KB
2004/05/13 14:53:28| Using Least Load store dir
selection
2004/05/13 14:53:28| chdir:
/usr/local/squid/var/cache: (2) No such file or
directory
2004/05/13 14:53:28| Current Directory is
/usr/local/squid/sbin
2004/05/13 14:53:28| Loaded Icons.
2004/05/13 14:53:28| commBind: Cannot bind socket FD
14 to *:3128: (98) Address already in use
FATAL: Cannot open HTTP Port
Squid Cache (Version 2.5.STABLE5): Terminated
abnormally.
CPU Usage: 0.027 seconds = 0.016 user + 0.012 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 330
Memory usage for squid via mallinfo():
total space in arena:2052 KB
Ordinary blocks: 2042 KB  2 blks
Small blocks:   0 KB  1 blks
Holding blocks:   192 KB  1 blks
Free Small blocks:  0 KB
Free Ordinary blocks:   9 KB
Total in use:2234 KB 109%
Total free: 9 KB 0%
2004/05/13 14:53:31| Starting Squid Cache version
2.5.STABLE5 for i686-pc-linux-gnu...
2004/05/13 14:53:31| Process ID 15885
2004/05/13 14:53:31| With 1024 file descriptors
available
2004/05/13 14:53:31| Performing DNS Tests...

[squid-users] Transparent proxy doubt

2004-05-08 Thread Deepa D
Hi All,
 I am using squid to run as a transparent proxy
for traffic heading to port 80. From the documentation
I understand that in such a case the HTTP accelerator
runs on port 80, and hence I can't reuse the same port
for my local web server (e.g. Jetty).
 But I want to run Jetty on port 80. How can I
manage such a case?
 Secondly, how will HTTPS traffic and traffic
heading to other ports be handled using the
transparent proxy feature?
Kindly help me understand this better.

  Regards and TIA,
  Deepa
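
One common arrangement for this (a sketch only; the
interface name and server address are placeholders, and
the squid.conf lines mirror the ones already used in this
setup) is to keep Squid listening on 3128 and let iptables
divert only the port-80 traffic that is not addressed to
the box itself, so a local server such as Jetty can still
bind port 80 for its own applications:

# squid.conf: Squid does not need to own port 80
http_port 3128
httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on

# divert forwarded port-80 traffic, but not traffic aimed at this host
# (older iptables versions wrote the negation as "-d ! 192.168.1.1")
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 \
         ! -d 192.168.1.1 -j REDIRECT --to-ports 3128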

 




[squid-users] A weird problem

2004-05-06 Thread Deepa D
Hi,
   Squid 2.5.STABLE5 was working fine with the
redirector for quite some time, but then suddenly one
day the requests from squid were not reaching the
redirector. Squid's log (in debug mode) said that
it had passed on the request to the redirector
(helperSubmit message), but the redirector was not
logging any message and the squid request was
getting hung.
   I tried reinstalling the whole thing, but the
problem persisted. A fresh installation on a similar
system configuration worked, though.
   Squid is set up on Red Hat Linux 8.0 with gcc 3.2.
The OS was not tampered with in the meanwhile either.
Could you kindly tell me what the problem could be.
Does Squid have any system-level dependencies?

  Regards,
 Deepa





[squid-users] Plz help... urls problem

2004-03-01 Thread Deepa D
Hi,
I have collected some info from squid and the
redirector program wrt the malformed urls problem. It
is a huge mail, kindly spare some time to read
through. 
Making a premise, the redirector code comprises of
two modules - the client is the one that is started by
squid. It fwds the requests to the redirector server
that takes care of processing the request. It then
sends the response to the client which inturn fwds the
response to the squid.
   The bug is that malformed urls are getting
generated. The squid log for debug_options ALL,1 61,9
33,5 is also pasted below :-

1)
 Client :-
 12304 Thu Feb 26 18:11:04 2004:  Client - read from
stdin =  http://www.dictionary.com/
10.10.10.106/bhadra - GET
   12305 Thu Feb 26 18:11:04 2004:  Client - wrote to
server =  http://www.dictionary.com/
10.10.10.106/bhadra - GET
   12306 Thu Feb 26 18:11:04 2004:  Client - the value
read from server = 
http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106
10.10.10.106 mani GET

 Server :-
 72060  Thu Feb 26 18:11:04 2004:  Server - the value
read =  http://www.dictionary.com/ 10.10.10.106/bhadra
- GET
  72061
  72062 Thu Feb 26 18:11:04 2004:  In loadInBuff, buff
=  http://www.dictionary.com/ 10.10.10.106/bhadra -
GET
  72098 Thu Feb 26 18:11:04 2004:  in_buff.url = 
http://www.dictionary.com/
  72100 Thu Feb 26 18:11:04 2004:  Server - wrote to
client = 
http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106
10.10.10.106 mani GET
  
 Cache.log :-
  2004/02/26 18:11:04| parseHttpRequest: Method is
'GET'
  2004/02/26 18:11:04| parseHttpRequest: URI is
'http://www.dictionary.com/'
  2004/02/26 18:11:04| clientSetKeepaliveFlag:
http_ver = 1.0
  2004/02/26 18:11:04| clientSetKeepaliveFlag: method
= GET
  2004/02/26 18:11:04| The request GET
http://www.dictionary.com/ is ALLOWED, because it
matched 'all'
  2004/02/26 18:11:04| redirectStart:
'http://www.dictionary.com/'
  2004/02/26 18:11:04| redirectHandleRead:
{http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106
10.10.10.106 mani GET}
  2004/02/26 18:11:04| clientRedirectDone:
'http://www.dictionary.com/'
result=http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106
 
 2) 
 Client :-
 12307 Thu Feb 26 18:11:04 2004:  Client - read from
stdin = 
http://update.messenger.yahoo.com/msgrcli.html
10.10.10.109/sharavathi - GET
   12308 Thu Feb 26 18:11:04 2004:  Client - wrote to
server = 
http://update.messenger.yahoo.com/msgrcli.html
10.10.10.109/sharavathi - GET
   12309 Thu Feb 26 18:11:05 2004:  Client - read from
stdin =  ñ^RBñ^RBww.dictionary.com/css/console.css
10.10.10.106/bhadra - GET
   12310 Thu Feb 26 18:11:05 2004:  Client - wrote to
server = 
ñ^RBñ^RBww.dictionary.com/css/console.css
10.10.10.106/bhadra - GET
   12311 Thu Feb 26 18:11:05 2004:  Client - the value
read from server =
   12312 Thu Feb 26 18:11:05 2004:  Client - the value
read from server = 
http://localhost:8080/contentfilter/login1.jsp?url=(ñ^RBñ^RBww.dictionary.com/css/console.css)ip=10.10.10.106
10.10.10.106 mani GET
 
 Server :-
 72708  Thu Feb 26 18:11:04 2004:  Server - the value
read =  http://update.messenger.yahoo.com/msgrcli.html
10.10.10.109/sharavathi - GET
  72710 Thu Feb 26 18:11:04 2004:  In loadInBuff, buff
=  http://update.messenger.yahoo.com/msgrcli.html
10.10.10.109/sharavathi - GET
  72134 Thu Feb 26 18:11:04 2004:  in_buff.url = 
http://update.messenger.yahoo.com/msgrcli.html
  72135 Thu Feb 26 18:11:04 2004:  After doAuth,
in_buff.url =  http://update.me   
ssenger.yahoo.com/msgrcli.html
  Thu Feb 26 18:11:05 2004:  Allowed , wrote to client
=

 Cache.log :-
 2004/02/26 18:11:04| parseHttpRequest: Method is
'GET'
 2004/02/26 18:11:04| parseHttpRequest: URI is
'http://update.messenger.yahoo.com/msgrcli.html'
 2004/02/26 18:11:04| clientSetKeepaliveFlag: http_ver
= 1.0
 2004/02/26 18:11:04| clientSetKeepaliveFlag: method =
GET
 2004/02/26 18:11:04| The request GET
http://update.messenger.yahoo.com/msgrcli.html is
ALLOWED, because it matched 'all'
 2004/02/26 18:11:04| The request GET
http://update.messenger.yahoo.com/msgrcli.html is
ALLOWED, because it matched 'all'
 2004/02/26 18:11:04| redirectStart:
'http://update.messenger.yahoo.com/msgrcli.html'
 2004/02/26 18:11:04| clientSendMoreData:
http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106,
3881 bytes
 2004/02/26 18:11:04| clientSendMoreData: FD 16
'http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106',
out.offset=0 
 2004/02/26 18:11:04| clientBuildReplyHeader: can't
keep-alive, unknown body size
 2004/02/26 18:11:04| clientSendMoreData: Appending
3584 bytes after 297 bytes of headers
 2004/02/26 18:11:04| The reply for GET
http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106
is ALLOWED, because it matched 'all'
 

Re: [squid-users] Re: Malformed Urls

2004-02-22 Thread Deepa D
Hi,
   Thanks for the hint.
   I am pasting below the strace output for both squid
and the redirector :-

Squid's trace :-
read(16, GET http://us.f411.mail.yahoo.co;..., 4095)
= 2297
 read(6, \n\n, 8192)   = 2
 write(6,
h\312)\10\334\261\22Bs.f411.mail.yahoo.com/li...,
87) = 87
 write(14, GET /lib_web/inbox_views_scripts...,
2366) = 2366
 read(14, HTTP/1.1 304 Not Modified\r\nDate:...,
87380) = 114
 write(16, HTTP/1.0 304 Not Modified\r\nDate:...,
152) = 152
 write(5, 1077518955.788   1581 10.10.10.1..., 142)
= 142
 read(16, 0x83cef48, 4095)   = -1 EAGAIN
(Resource temporarily unavailable)

Redirector's trace :-

read(0,
h\312)\10\334\261\22Bs.f411.mail.yahoo.com/li...,
4096) = 87
write(3, Mon Feb 23 12:19:14 2004:  Clien..., 142) =
142
write(3,
h\312)\10\334\261\22Bs.f411.mail.yahoo.com/li...,
87) = 87
write(4, Mon Feb 23 12:19:14 2004:  Clien..., 142) =
142
read(3, \n, 1000) = 1
write(4, Mon Feb 23 12:19:14 2004:  Clien..., 67) =
67
write(1, \n, 1)   = 1
write(1, \n, 1)   = 1
read(0,
\334\261\22B\334\261\22Bs.js1.yimg.com/us.yimg.c...,
4096) = 78
write(3, Mon Feb 23 12:19:15 2004:  Clien..., 133) =
133
write(3,
\334\261\22B\334\261\22Bs.js1.yimg.com/us.yimg.c...,
78) = 78
write(4, Mon Feb 23 12:19:15 2004:  Clien..., 133) =
133
read(3, \n, 1000) = 1
write(4, Mon Feb 23 12:19:15 2004:  Clien..., 67) =
67
write(1, \n, 1)   = 1
write(1, \n, 1)   = 1

   From the above, I notice that squid is writing a
malformed URL which is in turn being read by the
redirector thread.
   If my inference is right, then please tell me what
the solution for this problem is. Otherwise, tell me
where the problem could be.
   Another problem that I face is that, once such a
malformed URL appears, all the subsequent requests
also fail.

   Regards and TIA,
  Deepa


--- Henrik Nordstrom [EMAIL PROTECTED] wrote:   
  Secondly, I did an strace on the redirector
 program
  but am not getting any output - plz tell me how to
 do
  it.
 
 You need to strace the first redirector as this is
 the one most likely to 
 receive the request.
 
 To make it easier to locate the correct redirector
 process limit the 
 number of redirectors to 1 during the test.
 
 Regards
 Henrik
  




Re: [squid-users] Re: Malformed Urls

2004-02-21 Thread Deepa D
 Hi,
   Thanks for the response. 
   As the redirector threads are started by squid, I
did a strace on it.
   I noticed the following in the strace output of squid
when malformed URLs are being generated :-
 Case 1)
write(12, 1077360367.332 RELEASE -1 FF..., 158)
= 158
read(16, GET http://java.sun.com/notfound;..., 4095)
= 365
read(6, \n\n, 8192)   = 2
write(6,
\330\313(\10\334\261\22Bava.sun.com/notfound.jsp...,
111) = 111
write(17, GET /notfound.jsp?requrl=/css/co..., 444)
= 444
read(17, HTTP/1.1 404 Not found\r\nServer: ...,
87380) = 781
write(16, HTTP/1.0 404 Not Found\r\nServer: ...,
814) = 814
read(16, , 4095)  = 0
read(17, !-- BEGIN VNV1 MASTHEAD COMPONE..., 87380)
= 1460
read(16, , 4095)  = 0
write(16, !-- BEGIN VNV1 MASTHEAD COMPONE..., 1460)
= 1460
read(17, \ \ //td\ntd valign=\top\ cla...,
87380) = 1460
read(16, , 4095)  = 0
write(16, \ \ //td\ntd valign=\top\ cla...,
1460) = -1 EPIPE (Broken pipe)
--- SIGPIPE (Broken pipe) ---
write(5, 1077360368.030693 10.10.10.1..., 123) =
123
read(16, , 87380) = 0
write(12, 1077360368.031 RELEASE -1 FF..., 160)
= 160
read(16, GET http://java.sun.com/images/1;..., 4095)
= 315
read(6, http://localhost:8080/contentfil;..., 8192) =
178
write(6, \3408\10://java.sun.com/images/1x1.g...,
61) = 61
write(17, GET /contentfilter/login1.jsp?ur..., 511)
= 511
read(18, GET http://216.239.57.99/search?;..., 4095)
= 385
read(6, http://localhost:8080/contentfil;..., 8192) =
128
write(6, http://216.239.57.99/search?clie;..., 182) =
182
write(19, GET /contentfilter/login1.jsp?ur..., 440)
= 440
read(17, HTTP/1.1 200 OK\r\nDate: Sat, 21 F...,
87380) = 3879
write(16, HTTP/1.0 200 OK\r\nDate: Sat, 21 F...,
3912) = 3912
read(17,td \n   ...,
87380) = 1349
write(16,td \n   ...,
1349) = 1349
read(16, 0x83d89a8, 4095)   = -1
ECONNRESET (Connection reset by peer)

Case 2)
write(12, 1077360407.457 RELEASE -1 FF..., 174)
= 174
read(16, GET http://localhost:8080/conten;..., 4095)
= 348
read(6, http://localhost:8080/contentfil;..., 8192) =
141
write(6,
\334\261\22B\334\261\22Bocalhost:8080/contentfil...,
78) = 78
write(17, GET /contentfilter/login1.jsp?ur..., 489)
= 489
read(17, HTTP/1.1 200 OK\r\nDate: Sat, 21 F...,
87380) = 3746
write(16, HTTP/1.0 200 OK\r\nDate: Sat, 21 F...,
3779) = 3779
read(17,td \n   ...,
87380) = 1349
write(16,td \n   ...,
1349) = 1349
read(17, , 87380) = 0
write(5, 1077360407.540 44 10.10.10.1..., 138) =
138
read(16, 0xbffe7e90, 87380) = -1 EAGAIN
(Resource temporarily unavailable)
write(12, 1077360407.540 RELEASE -1 FF..., 178)
= 178
read(16, GET http://localhost:8080/conten;..., 4095)
= 347
read(6, http://localhost:8080/contentfil;..., 8192) =
145
write(6,
\350g\10\334\261\22Bocalhost:8080/contentfil...,
77) = 77
write(17, GET /contentfilter/login1.jsp?ur..., 494)
= 494
read(17, HTTP/1.1 200 OK\r\nDate: Sat, 21 F...,
87380) = 3746
write(16, HTTP/1.0 200 OK\r\nDate: Sat, 21 F...,
3779) = 3779
read(17,td \n   ...,
87380) = 1349
write(16,td \n   ...,
1349) = 1349
read(16, 0x83cc5a8, 4095)   = -1
ECONNRESET (Connection reset by peer) 

   From this I observe that squid is reading twice and
writing twice. Why does this happen? Is it because the
client is writing to it twice?
   Secondly, I did an strace on the redirector program
but am not getting any output - please tell me how to
do it.

  Regards and TIA,
   Deepa
 
  

--- Henrik Nordstrom [EMAIL PROTECTED] wrote:  On
Fri, 20 Feb 2004, Deepa D wrote:
 
  Hi All,
 Using strace I have noticed the following
 problem
  :-
  For the mentioned url, I get the HTTP/1.0 204
  response at one time. But when the sam request is
  given again, the request is serviced and the page
  displays properly. 
   read(16, GET
 http://www.google.com/url?sa;...,
  4095) = 674
  write(6, http://www.google.com/url?sa=Ts;...,
 121) =
  121
  read(6, \n\n, 8192)   = 2
 
 I asked you to strace the redirector, not Squid.
 
 as for the 204 question, see output of ngrep or
 other network sniffer to 
 see if the traffic between Squid and the server
 differs in any way between 
 the two requests.
 
 But what can be said is that in the above the URL is
 sent correctly to the 
 redirector (the write 6, ...) and if this got logged
 as a malformed URL by 
 your redirector then the redirector is broken.
 
 Regards
 Henrik
  




[squid-users] Re: Malformed Urls

2004-02-20 Thread Deepa D
Hi All,
   Using strace, I have noticed the following problem :-
For the mentioned URL, I get the HTTP/1.0 204
response at one time. But when the same request is
given again, the request is serviced and the page
displays properly.
 read(16, "GET http://www.google.com/url?sa"..., 4095) = 674
write(6, "http://www.google.com/url?sa=Ts"..., 121) = 121
read(6, "\n\n", 8192) = 2
write(17, "GET /url?sa=T&start=5&url=http%3"..., 751) = 751
read(17, "HTTP/1.0 204 No Content\r\nCache-c"..., 87380) = 173
write(16, "HTTP/1.0 204 No Content\r\nCache-C"..., 206) = 206
write(5, "1077176973.319   1100 10.10.10.1"..., 116) = 116
write(12, "1077176973.319 RELEASE -1 FF"..., 149) = 149
read(16, 0x82937e0, 4095) = -1 EAGAIN (Resource temporarily unavailable)
read(16, 0xbffe7fc0, 87380) = -1 ECONNRESET (Connection reset by peer)

   Please let me know why this happens.
   
  Regards and TIA,
Deepa


 --- Deepa D [EMAIL PROTECTED] wrote:  Hi,
   Thanks for the info.
   I am trying to do an strace. In the meanwhile, I am
 sending the log function for your perusal :-
  void log(char *filename, char *msg, char *msg1) {
    FILE *log;
    char *date_str = NULL;
    log = fopen(filename, "at");
    if(log == NULL){
      return;
    }
    date_str = getDate();
    fprintf(log, "%s: %s %s\n", date_str, msg, msg1);
    fflush(log);
    free(date_str);
    fclose(log);
  }

  char *getDate(void) {
    time_t tp;
    char *ascitime;
    char *s;
    tp = (time_t)time(NULL);
    ascitime = (char *)ctime(&tp);
    s = (char *)malloc(sizeof(char) * (strlen(ascitime)+1));
    /* no use writing an error message, because this
       function will keep getting called! */
    if(s == NULL) {
      exit(3);
    }
    strcpy(s, ascitime);
    s[strlen(ascitime) - 1] = '\0';
    return s;
  }
 
    Another problem that I am now facing is that even
 though the redirector program is writing a new line
 to stdout (I can see this from the log message), squid
 is redirecting the page to the redirect url.
    Please tell me if you know the solution to this
 problem.
 
Regards and TIA,
   Deepa
 
  --- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
 On Wed, 18 Feb 2004, Deepa D wrote:
  
   Hi,
 Thanks for the response. 
 access.log is listing the urls correctly.
 Sample
  :-
  
 

http://in.yimg.com/i/in/adv/hp/pbhp_84x28_blu_yahoo.gif
   
 The redirector code is as follows :-
   
   char buff[MAX_BUFF] = "";
   setbuf(stdout, NULL);
   memset(buff, '\0', MAX_BUFF);

   while(fgets(buff, MAX_BUFF, stdin) != NULL) {
     log(LOG_INFO, "Client - read from stdin = ", buff);
   }
   
  
  What does the log function look like?
  
  Also try strace/truss of the redirector process to
  verify that what it 
  logs matches what it reads from Squid.
  
  Regards
  Henrik





[squid-users] Re: Malformed urls

2004-02-18 Thread Deepa D
Hi,
  Thanks for the response. 
  access.log is listing the urls correctly. Sample :-
http://in.yimg.com/i/in/adv/hp/pbhp_84x28_blu_yahoo.gif

  The redirector code is as follows :-

char buff[MAX_BUFF] = "";
setbuf(stdout, NULL);
memset(buff, '\0', MAX_BUFF);

while(fgets(buff, MAX_BUFF, stdin) != NULL) {
  log(LOG_INFO, "Client - read from stdin = ", buff);
}

The log displays the value read from stdin as follows
:-
 
ܱBܱBn.yimg.com/i/in/adv/hp/pbhp_84x28_blu_yahoo.gif
  Ø^ܱBn.yimg.com/i/in/adv/java/ct_012004.js

  Such behaviour is not consistent either. Please tell
me how this problem could be solved.

  Regards and TIA,
 Deepa


 --- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
On Wed, 18 Feb 2004, Deepa D wrote:
 
  Hi,
 Since the past two days, squid is passing lots
 of
  malformed urls to the redirector program. Samples
 are
  as follows :-

 

^Hý5^Hñ^RBongs.tamilsongs.net/songs/17/eahawtawkea/1599.rm
 

^Hý5^Hñ^RBongs3.tamilsongs.net/songs/17/eahawtawkea/1599.rm
 

^Hý5^Hñ^RBongs.tamilsongs.net/songs/17/eahawtawkea/1601.rm
 

^Hý5^Hñ^RBongs3.tamilsongs.net/songs/17/eahawtawkea/1601.rm
 

^Hý5^Hñ^RBongs.tamilsongs.net/songs/17/eahawtawkea/2473.rm
 

hÃ!^Hñ^RBww.google.com/search?sourceid=navclientq=javascript%3Ahistory
 
 
 What does access.log say, and how do you know this
 is what Squid passed to 
 the redirector and not the redirector who reads the
 URL wrongly?
 
 Regards
 Henrik
  




[squid-users] Re: Malformed urls

2004-02-18 Thread Deepa D
Hi,
  Thanks for the info.
  I am trying to do an strace. In the meanwhile, I am
sending the log function for your perusal :-
void log(char *filename, char *msg, char *msg1) {
  FILE *log;
  char *date_str = NULL;
  log = fopen(filename, "at");
  if(log == NULL){
    return;
  }
  date_str = getDate();
  fprintf(log, "%s: %s %s\n", date_str, msg, msg1);
  fflush(log);
  free(date_str);
  fclose(log);
}

char *getDate(void) {
  time_t tp;
  char *ascitime;
  char *s;
  tp = (time_t)time(NULL);
  ascitime = (char *)ctime(&tp);
  s = (char *)malloc(sizeof(char) * (strlen(ascitime)+1));
  /* no use writing an error message, because this
     function will keep getting called! */
  if(s == NULL) {
    exit(3);
  }
  strcpy(s, ascitime);
  s[strlen(ascitime) - 1] = '\0';
  return s;
}

   Another problem that I am now facing is that even
though the redirector program is writing a new line
to stdout (I can see this from the log message), squid
is redirecting the page to the redirect url.
   Please tell me if you know the solution to this problem.

   Regards and TIA,
  Deepa

 --- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
On Wed, 18 Feb 2004, Deepa D wrote:
 
  Hi,
Thanks for the response. 
access.log is listing the urls correctly. Sample
 :-
 

http://in.yimg.com/i/in/adv/hp/pbhp_84x28_blu_yahoo.gif
  
The redirector code is as follows :-
  
  char buff[MAX_BUFF] = "";
  setbuf(stdout, NULL);
  memset(buff, '\0', MAX_BUFF);

  while(fgets(buff, MAX_BUFF, stdin) != NULL) {
    log(LOG_INFO, "Client - read from stdin = ", buff);
  }
  
 
 What does the log function look like?
 
 Also try strace/truss of the redirector process to
 verify that what it 
 logs matches what it reads from Squid.
 
 Regards
 Henrik
  




[squid-users] Malformed urls

2004-02-17 Thread Deepa D
Hi,
   For the past two days, squid has been passing lots of
malformed urls to the redirector program. Samples are
as follows :-
  
^Hý5^Hñ^RBongs.tamilsongs.net/songs/17/eahawtawkea/1599.rm
^Hý5^Hñ^RBongs3.tamilsongs.net/songs/17/eahawtawkea/1599.rm
^Hý5^Hñ^RBongs.tamilsongs.net/songs/17/eahawtawkea/1601.rm
^Hý5^Hñ^RBongs3.tamilsongs.net/songs/17/eahawtawkea/1601.rm
^Hý5^Hñ^RBongs.tamilsongs.net/songs/17/eahawtawkea/2473.rm
hÃ!^Hñ^RBww.google.com/search?sourceid=navclientq=javascript%3Ahistory
   Kindly tell me a solution for this problem.

   IE version is 5.0
   The squid.conf is as follows :-

 http_port 3128
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
acl all src 0/0
no_cache deny all
cache_dir null /usr/local/squid/var/
debug_options ALL,1 61,9 33,5
redirect_program
/usr/local/omniqube/cfilter/cfclient/cf_client
redirect_children 10 
redirect_rewrites_host_header off
authenticate_ttl 10 minutes
authenticate_ip_ttl 10 minutes
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow all
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
acl our_networks src 9.0.0.0/24 
http_access allow our_networks
http_access deny all
http_reply_access allow all
icp_access allow all
httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on
coredump_dir /usr/local/squid/var/cache
 
  Regards and TIA,
  Deepa
 





Re: [squid-users] Run time errors

2004-02-13 Thread Deepa D
Hi,
   Thanks for the response.
   I have a doubt regarding the browser setting - if
it were purely a browser setting, then why don't I get
such error messages when I configure my browser not to
use the proxy?
   Secondly, the uri_whitespace param is set to the
default value, that is strip. Kindly let me know what
the setting should be.
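
   For reference, the directive in question is a single
line in squid.conf. With the default in place it reads as
below; other accepted values are allow, encode, chop and
deny.

uri_whitespace strip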

  Regards and TIA,
 Deepa
 

--- Muthukumar [EMAIL PROTECTED] wrote:  
  similar errors occur.One of them, that I got in
  mail.yahoo.com's inbox page is as follows :-
 Error :- A Runtime Error has occurred.
   Do you wish to Debug
  Line 1
  Error: Syntax error
Yes No
 
 Ok.
 This is the browser's problem.
 Open Internet Explorer - click Tools - Internet
 Options - Advanced. Make sure "Disable script
 debugging" is ticked and "Display a notification about
 every script error" is unticked, then Apply and
 OK.
 
 The squid is configured to use a redirector
  program. At times malformed urls are passed to the
  client by Squid. One sample is :-
  ?B?Bs.i1.yimg.com/us.yimg.com/i/reg2.css
  ?B?Bs.i1.yimg.com/us.yimg.com/i/mc/mc.js
 
 Ok.
 What is the setting for the uri_whitespace tag in the
 squid.conf file?
 
 Regards,
 Muthukumar.
  




[squid-users] Run time errors

2004-02-12 Thread Deepa D
Hi,
   I am having a problem with squid. When I visit a
few websites like www.mail.yahoo.com through the squid
proxy, some run time error popups occur, whereas these
don't appear when not using the proxy. 
   I am currently using squid-2.5.STABLE4.
   Please let me know why this problem occurs and if it
can be solved by some means.

  Regards and TIA,
 Deepa  




Re: [squid-users] Squid redirect problem

2004-01-31 Thread Deepa D
Hi,
   Sorry for the delay.
   I have pasted the code to this mail.
   Kindly let me know if there is any problem in it.

int sig_hup = 0;

int main (int argc, char *argv[]) {

  int sd, rc;
  struct sockaddr_in localAddr, servAddr;
  struct hostent *h;

  char buff[MAX_BUFF] = "";
  int nread = 0;

  h = gethostbyname("127.0.0.1");
  if(h == NULL) {
    log(LOG_ERROR, "Client - unknown host\n");
    exit(1);
  }

  servAddr.sin_family = h->h_addrtype;
  memcpy((char *) &servAddr.sin_addr.s_addr,
         h->h_addr_list[0], h->h_length);
  servAddr.sin_port = htons(SERVER_PORT);

  while(!sig_hup) {

    if(fgets(buff, MAX_BUFF, stdin) != NULL) {

      /* create socket */
      sd = socket(AF_INET, SOCK_STREAM, 0);
      if(sd < 0) {
        log(LOG_ERROR, "Client - cannot open socket\n");
        exit(1);
      }

      /* bind any port number */
      localAddr.sin_family = AF_INET;
      localAddr.sin_addr.s_addr = htonl(INADDR_ANY);
      localAddr.sin_port = htons(0);

      rc = bind(sd, (struct sockaddr *) &localAddr,
                sizeof(localAddr));
      if(rc < 0) {
        log(LOG_ERROR, "Client - cannot bind to SERVER_PORT\n");
        exit(1);
      }

      rc = connect(sd, (struct sockaddr *) &servAddr,
                   sizeof(servAddr));
      if(rc < 0) {
        log(LOG_ERROR, "Client - could not connect to the server\n");
        exit(1);
      }

      if(strchr(buff, '\n') == NULL) {
        /* Append a new line character so that the data
           gets flushed. */
        strcat(buff, "\n");
      }
      if(write(sd, buff, strlen(buff)) < 0) {
        log(LOG_ERROR, "Client - error while writing to server.\n");
      } else {
        logger1(LOG_INFO, "Client - wrote to server = ", buff);
      }

      memset(buff, '\0', MAX_BUFF);
      nread = read(sd, buff, MAX_BUFF);
      puts(buff);
      fflush(stdout);
      close(sd);
    }

    // The client exits when it reads an EOF from stdin.
    exit(0);

  }
  *buff = '\0';
  return 0;

}

void contentfilter_HUP(int signal) {

  /* exciting bit of code this one :-) */
  sig_hup = 1;
  log(LOG_INFO, "sigHUP received. Reconfiguring\n");
}


Regards,
   Deepa 

 --- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
On Mon, 19 Jan 2004, Deepa D wrote:
 
   FATAL: The redirector helpers are crashing
 too
  rapidly, need help!
 But, when I don't do exit(0) in my C code once 
  read from stdin returns null, I don't have this
  problem.
 
 Then I think you are exiting even if the read does
 not return EOF.
 
 If you want you can post your helper source and we
 take a look.
 
 Regards
 Henrik
  




Re: [squid-users] Squid redirect problem

2004-01-19 Thread Deepa D
Hi,
   The fatal message is
 FATAL: The redirector helpers are crashing too
rapidly, need help!
   But when I don't do exit(0) in my C code once
read from stdin returns null, I don't have this
problem. But when the squid server shuts down, the
redirector processes are not getting killed.

   Regards,
 Deepa   
   


 --- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
On Sat, 17 Jan 2004, Deepa D wrote:
 
 The problem with your solution is - I get
 warning
  messages that the redirectors are crashing and the
  helperOpenServers restarts an equal number again.
 
 Then most likely your helpers are crashing. The
 example is correct and is 
 how all existing helpers to Squid operate in
 principle.
 
  But after sometime squid just stops with a fatal
 error message.
 
 What does the error message say?
 
 Regards
 Henrik
  




Re: [squid-users] Squid redirect problem

2004-01-17 Thread Deepa D
Hi,
   Thanks for the response.
   The problem with your solution is that I get warning
messages that the redirectors are crashing and
helperOpenServers restarts an equal number again. But
after some time squid just stops with a fatal error
message.
   Any other solution please!

   Regards,
Deepa   

--- Kinkie [EMAIL PROTECTED] wrote:  Deepa D
[EMAIL PROTECTED] writes:
 
  Hi,
 Squid is not closing the redirect children when
 it 
  is shutdown.Why is it so? Do I have to incorporate
  some code in the redirector program for this?
 
 [...]
 
 The redirectors should suicide when they receive EOF
 on stdin.
 
 In PERL code:
 
 #!/usr/bin/perl
 init();
 while (<>) {
   do_work($_);
 }
 exit(0);
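 
 Since the helper discussed in this thread is written in C,
 a minimal equivalent skeleton might look like the following.
 This is only a sketch, not the actual helper; a blank reply
 line is the usual way of telling Squid to leave the URL
 unchanged.
 
 #include <stdio.h>
 
 int main(void) {
     char line[8192];
 
     setbuf(stdout, NULL);      /* reply to Squid immediately */
 
     /* one request per line; leave the loop only on EOF */
     while (fgets(line, sizeof(line), stdin) != NULL) {
         fputs("\n", stdout);   /* blank line = no rewrite */
     }
     return 0;                  /* stdin closed: Squid shut down */
 }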
 
 
 -- 
   kinkie (kinkie-squid [at] kinkie [dot] it)
   Random fortune, unrelated to the message:
   Why are you doing this to me?
   Because knowledge is torture, and there must be
 awareness before
 there is change.
   -- Jim Starlin, Captain Marvel, #29 




[squid-users] Squid redirect problem

2004-01-16 Thread Deepa D
Hi,
   Squid is not closing the redirect children when it
is shut down. Why is that? Do I have to incorporate
some code in the redirector program for this?

   Regards and TIA,
   Deepa




[squid-users] Help needed...Authentication changes

2003-09-22 Thread Deepa D
Hi All,
I am using squid-2.5.STABLE4 in no-cache mode. It
is configured to use pam_auth for authentication.
The requirement that I have is as follows :-
The first time a request comes from a particular
IP, squid should send back a 407 Proxy Authentication
Required message. The browser will prompt the user with
a login page and subsequently validate the user and
cache the login info. But for subsequent requests within
a certain time period (which can be read from a
configuration file based on the user group), the proxy
should not send back a 407 Proxy Authentication
Required message, and hence the browser should not
prompt the login page for that time period.
 Kindly tell me how this can be achieved in squid.
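
   For reference, the Squid-side knobs involved look
roughly like this in squid.conf. This is only a sketch:
the helper path and TTL values are illustrative, and
whether these directives can express the exact per-IP,
per-group time window described above is a separate
question.

auth_param basic program /usr/lib/squid/pam_auth
auth_param basic credentialsttl 2 hours
authenticate_ip_ttl 10 minutes

acl password proxy_auth REQUIRED
http_access allow password
http_access deny all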

   Regards and TIA,
  Deepa

   




Re: [squid-users] Authentication related query

2003-09-19 Thread Deepa D
Hi,
   No,the browser is configured to use the proxy.

  Regards and TIA,
  Deepa
   

 --- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
On Fri, 19 Sep 2003, Deepa D wrote:
 
  The squid is configured to use pam_auth as a
 basic
  auth helper and the cache is disabled. For every
 url
  request a popup window appears asking for user
 name
  and password.
 
 Are you attempting to set up authentication in a
 transparently intercepting proxy? This is not possible
 to do due to the nature of intercepting port 80.
 
 To use authentication you MUST have the browser
 configured to use the 
 proxy.
 
 Regards
 Henrik
  




Re: [squid-users] Plz help - Connection closing - why?

2003-09-18 Thread Deepa D
Hi,
   A firewall rule has been set that redirects all the
http traffic for port 80 to another port. Our program,
which is listening on this port, reformats the url and
forwards it to the proxy. It also receives the
proxy's response and passes it on to the browser.
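
   For context, such a rule on a Linux box typically looks
something like the line below. This is illustrative only;
the interface name and the interception port are
assumptions.

iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-ports 8080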
   Regards,
  Deepa

--- Robert Collins [EMAIL PROTECTED] wrote: 
On Thu, 2003-09-18 at 15:07, Deepa D wrote:
  Hi,
  Thanks for the response Robert.I will try and
  explain the setup once again :-
  All the http traffic from the browser is being
  redirected to a program that reformats the url and
 
 Please define 'redirected'. It can mean many things.
 
 Rob
 
 -- 
 GPG key available at:
 http://members.aardvark.net.au/lifeless/keys.txt.
 

 ATTACHMENT part 2 application/pgp-signature
name=signature.asc
 




[squid-users] Plz help - Connection closing - why?

2003-09-17 Thread Deepa D
Hi All,
I am working with the latest squid-2.5.STABLE4. It
has been configured to disable caching. The browser has
not been configured to use the proxy.
The first time a request comes to the proxy, it
goes through correctly - if it needs to be redirected,
even that happens. But on subsequent attempts, the
"page cannot be displayed" page gets displayed.
Using squid -k debug, I notice the following
in the cache.log.
In the first case -
After getting a HTTP HDR REPLY (200 OK), there is
a "Connection: Keep-Alive" msg followed by
creating entry ... near: 'Connection:Keep-Alive'
But on subsequent attempts I get HTTP HDR REPLY (200
OK) for the same site with Connection: close, and later
a clientBuildReplyHeader: can't keep-alive, unknown
body size.

   Kindly tell me why this is happening and what
should be done to solve the problem.

   Regards and TIA,
  Deepa
   




Re: [squid-users] Plz help - Connection closing - why?

2003-09-17 Thread Deepa D
Hi,
   I feel the following is happening: the browser is
closing much before the proxy can send back a
response. There is an intermediate program that is
trapping the requests from the browser and redirecting
them to the proxy. I tried sending a 100 Continue
message from the intermediate program, but it doesn't
work. I am working with the IE 5.5 browser.
   Please tell me how I can make the browser wait to
receive the response from the proxy.
   Regards and TIA,
   Deepa


 --- Deepa D [EMAIL PROTECTED] wrote:  Hi
All,
 I am working with the latest
 squid-2.5.STABLE4.It
 has been configured to disable cache.The browser has
 not been configured to use the proxy.
 The first time a request comes to the proxy , it
 goes through correctly - if it needs to be
 redirected
 , even that happens. But on subsequent attempts, the
 "page cannot be displayed" page gets displayed.
 Using the squid -k debug , I notice the
 following
 in the cache.log.
 In the first case - 
 After getting a HTTP HDR REPLY(200 OK) , there
 is
 a Connection: Keep-Alive msg  followed by 
 creating entry ... near: 'Connection:Keep-Alive'
 But the subsequent times I get HTTP HDR
 REPLY(200
 OK) for the same site with Connection:close.And
 later
 a clientBuildReplyHeader: can't keep-alive, unknown
 body size.
 
Kindly tell me why this is happening and what
 should be done to solve the problem.
 
Regards and TIA,
   Deepa

 






Re: [squid-users] Plz help - Connection closing - why?

2003-09-17 Thread Deepa D
Hi,
Thanks for the response Robert. I will try and
explain the setup once again :-
All the http traffic from the browser is being
redirected to a program that reformats the url,
sends it to the proxy for further processing and in
turn conveys the proxy's response back to the
browser. The problem that I am facing is that the
browser displays a "page not found" message even
before the program can convey the proxy's response
back to the browser.
At the proxy end (in the cache.log), I see that the
url is getting processed and a GOT HTTP REPLY HDR
coming in from the remote web server with 200 OK,
Connection: Close. But this data is not being sent back
to the program. I assume this is because the browser
has already closed its connection. But at the same time
I find the program is still waiting for squid's
response - it doesn't close its connection with squid.
As per your suggestion, I will read the HTTP/1.0
specs. Kindly tell me if you can infer anything else as
to what and where the problem could be.
   Regards and TIA,
   Deepa


 --- Robert Collins [EMAIL PROTECTED] wrote: 
On Wed, 2003-09-17 at 23:29, Deepa D wrote:
  Hi,
 I feel the following is happening as the
 browser is
  closing much before the proxy can send back a
  response.There is an intermediate program that is
  trapping the requests from the browser and
 redirecting
  them to the proxy.I tried sending a 100 Continue
  message from the intermediate program , but it
 doesn't
  work.I am working with IE 5.5 browser.
 Please tell me how I can make the browser wait
 to
  receive the response from the proxy.
 Regards and TIA,
 Deepa
 
 Connection closing is a perfectly normal part of
 operation. Squid is
 http/1.0 with a few 1.1 features - but it doesn't
 support Continue. 
 
 I'm a little unclear on what is going on here, but
 it sounds like:
 1) You are writing a proxy of some sort, using squid
 to get the data
 from the internet.
 2) You are having a problem where the connection
 from squid to your
 proxy is closing.
 
 What's not clear is /when/ it is closing. As I said
 above, closing the
 connection is normal for http operation. If you are
 coding to the
 http/1.1 specification, you should read the section
 on HTTP/1.0
 persistent connection interoperation, and probably
 the HTTP/1.0 spec
 too.
 
 Rob
 
 -- 
 GPG key available at:
 http://members.aardvark.net.au/lifeless/keys.txt.
 

 ATTACHMENT part 2 application/pgp-signature
name=signature.asc
 




[squid-users] Redirection problem

2003-09-16 Thread Deepa D
Hi,
   The squid proxy is passing the url to the redirect
program, which in turn is writing a redirect url that
is read back by squid. The redirect url happens to be
a local url (a simple html page on the local
webserver). But the redirected page doesn't get
displayed. The cache.log displays different messages
for different trials :-

1) ClientProcessRequest: TCP_MISS for
http://10.10.10.5/mypage.html
ClientBuildReplyHeader: can't keep-alive,unknown body
size

2) -- NO: entry older than client

3) A TCP_HIT happens and the clientWriteComplete: FD
21 transfer is DONE 

  But in none of the above cases does the page display.

  Kindly tell me what the problem could be.

  Regards and TIA,
  Deepa






RE: [squid-users] Authentication problem

2003-09-15 Thread Deepa D
Hi,
   Yes, we need to screen all the url requests even if
the client machines are not configured to use a proxy.
Kindly mail me any solutions that we could use to
overcome this problem.
  Regards and TIA,
  Deepa
 
 --- Adam Aube [EMAIL PROTECTED] wrote:  
   The browsers are not configured to use the
 proxy -
  hence the pam_auth of the squid proxy cannot be
 used
  for authentication.
 
 Is there a particular reason you're using a
 transparent proxy?
 
 Adam 




[squid-users] Bypass authentication for local urls

2003-09-15 Thread Deepa D
Hi,
   The squid proxy has been configured to authenticate
using pam_auth. But I have a requirement where this
authentication has to be bypassed when requests come
for local urls (the pages hosted by our local
webserver). Kindly mail me how to go about doing this.
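
   For reference, a sketch of how this is usually
expressed in squid.conf. The local web server address
here is only an example, and the rest of the access
rules are illustrative.

acl localpages dst 10.10.10.5/255.255.255.255
acl password proxy_auth REQUIRED

http_access allow localpages    # local pages bypass authentication
http_access allow password
http_access deny all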
  Regards and TIA,
 Deepa
 




Re: [squid-users] Plz help - Redirection not happening

2003-09-12 Thread Deepa D
Hi,
   Thanks for the info Henrik. As per the logs, the
reason why the redirection is failing is as follows :-
   In the redirection program, the code is
interacting with another server (using its SDK). During
this process, one SDK function is writing some
comments to stdout (we can't alter that function
though). These are being read by the squid proxy and
the redirection is not happening as desired.
As this interaction happens only in the beginning,
I want to take care of this by making squid
send some dummy url to the redirect program only once
when it starts up, where as expected it will fail. We
have tried trapping stdout (and stderr) during that
process but it doesn't help.
Kindly tell me how to go about doing this, or please
suggest an alternative solution that will stop the
proxy from reading the function's comments and make it
read only the url written.
   Regards and TIA,
   Deepa
   
--- Henrik Nordstrom [EMAIL PROTECTED] wrote:  On
Thursday 11 September 2003 06.01, Deepa D wrote:
 
I am using squid-2.5.STABLE1 for redirection
 in
  my setup.The redirect program is writing the
  redirect_url , src address , identity and method
  to the stdout , but the redirection is not
 happening.I
  have tried with (printf, fflush )combination ,
  (sprintf , fprintf , fflush)combination
 
 Should work..
 
 try
 
 debug_options ALL,1 61,9 33,5
 
 this should give you a clear picture of what is
 going on in cache.log.
 
 Regards
 Henrik
 
 -- 
 Donations welcome if you consider my Free Squid
 support helpful.

https://www.paypal.com/xclick/business=hno%40squid-cache.org
 
 If you need commercial Squid support or cost
 effective Squid or
 firewall appliances please refer to MARA Systems AB,
 Sweden
 http://www.marasystems.com/, [EMAIL PROTECTED] 




Re: [squid-users] Plz help - Redirection not happening

2003-09-12 Thread Deepa D
Hi Henrik,
Thanks a lot for the solution. It works :-).

  Regards,
 Deepa


 --- Henrik Nordstrom [EMAIL PROTECTED] wrote: 
fre 2003-09-12 klockan 08.29 skrev Deepa D:
 
  interacting with another server(using its
 SDK).During
  this process , one SDK function is writing some
  comments to the stdout(We can't alter that
 function
  though).These are being read by the squid proxy
 and
  the redirection is not happening as desired.
 
 Then change your program to redirect stdout to
 stderr before
 initializing the SDK, keeping the original stdout
 fdhandle for return
 values to Squid only.
 
 If on unix, this exercise is as simple as follows:
 
// create another out filehandle from stdout
FILE *out = fdopen(dup(1), "w");
 
// redirect stdout to stderr
dup2(2, 1);
 
// Disable buffering on output channel
setbuf(out, NULL);
 
 
 then write your redirector output to out instead of
 stdout.
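 
 Putting the pieces together, the helper's main loop might
 then look something like this. This is only a sketch; the
 SDK initialisation call is hypothetical, and the blank
 reply line is the usual "leave the URL unchanged" answer.
 
 #include <stdio.h>
 #include <unistd.h>
 
 int main(void) {
     FILE *out;
     char line[8192];
 
     /* keep a private handle on the real stdout for replies to
        Squid, then point fd 1 at stderr so SDK chatter cannot
        reach Squid */
     out = fdopen(dup(1), "w");
     dup2(2, 1);
     setbuf(out, NULL);
 
     /* init_sdk();  hypothetical: its start-up noise now lands
        on stderr */
 
     while (fgets(line, sizeof(line), stdin) != NULL) {
         fputs("\n", out);   /* blank line = no rewrite */
     }
     return 0;
 }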
 
 Regards
 Henrik
 

 -- 
 Donations welcome if you consider my Free Squid
 support helpful.

https://www.paypal.com/xclick/business=hno%40squid-cache.org
 
 Please consult the Squid FAQ and other available
 documentation before
 asking Squid questions, and use the squid-users
 mailing-list when no
 answer can be found. Private support questions is
 only answered
 for a fee or as part of a commercial Squid support
 contract.
 
 If you need commercial Squid support or cost
 effective Squid and
 firewall appliances please refer to MARA Systems AB,
 Sweden
 http://www.marasystems.com/, [EMAIL PROTECTED]
  




[squid-users] Plz help - Redirection not happening

2003-09-10 Thread Deepa D
 Hi All,
  I am using squid-2.5.STABLE1 for redirection in
my setup. The redirect program is writing the
redirect_url, src address, identity and method
to stdout, but the redirection is not happening. I
have tried the (printf, fflush) combination, the
(sprintf, fprintf, fflush) combination and the
(sprintf, puts, fflush) combination, but nothing seems
to be triggering the proxy to accept the new redirect
url. It works occasionally but not most of the time.
Is there any option that squid should be configured
with for redirection to work?
  Kindly help me solve this problem.
Regards and TIA,
 Deepa
 




[squid-users] Redirection not happening

2003-09-09 Thread Deepa D
Hi All,
 I am using squid-2.5.STABLE1 for redirection in my
setup. The redirect program is writing the
redirect_url, src address, identity and method to
stdout, but the redirection is not happening. I have
tried the (printf, fflush) combination, the (sprintf,
fprintf, fflush) combo and the (sprintf, puts, fflush)
combo, but nothing seems to be triggering the proxy to
accept the new redirect url. Kindly help me solve this
problem.
 Secondly, the proxy is sending out malformed urls
(some junk characters are getting prefixed to the
url). How can I take care of this?
 How can I prevent the proxy from sending the urls of
the popups that accompany any url to the redirect
program?
 Also, could somebody kindly tell me how to make the
proxy always pass the url with the domain name itself,
and not replace the domain name by its IP, before it
passes it on to the redirector program.
 Any suggestions are most welcome.
  Regards and TIA,
Deepa


