On 19/08/2014 3:42 a.m., nuhll wrote:
> Just to clarify my problem: I don't use it as a transparent proxy! I
> distribute the proxy with my DHCP server and a .pac file, so it gets used on
> all machines with "automatic proxy detection".
>
Your earlier config file posted contained:
http_port 192.168.
Thanks for no help, but could you please stop spamming then?
--
View this message in context:
http://squid-web-proxy-cache.1019090.n4.nabble.com/ONLY-Cache-certain-Websites-tp4667121p4667247.html
Sent from the Squid - Users mailing list archive at Nabble.com.
http://www.squid-cache.org/Doc/config/cache/
On 03/08/14 10:25, nuhll wrote:
Seems like "acl all src all" fixed it. Thanks!
One problem is left: is it possible to only cache certain websites, while the rest
should just be redirected?
Just to clarify my problem: I don't use it as a transparent proxy! I
distribute the proxy with my DHCP server and a .pac file, so it gets used on
all machines with "automatic proxy detection".
What is pnp? Do you mean UPnP? It's enabled. I don't understand RU. If I were
able to read and understand it, why do you think I would post it here? Just so
that you can tell me that's the answer?!
On 16/08/2014 8:02 a.m., nuhll wrote:
> I got nearly everything working, except Battle.net. This problem seems to be
> known, but I don't know how to fix it.
>
> http://stackoverflow.com/questions/24933962/squid-proxy-blocks-battle-net
That post displays a perfectly working proxy transaction. No sign of an
error.
I got nearly everything working, except Battle.net. This problem seems to be
known, but I don't know how to fix it.
http://stackoverflow.com/questions/24933962/squid-proxy-blocks-battle-net
https://forum.pfsense.org/index.php?topic=72271.0
Hello,
thanks for your help.
I fixed the slow issue myself: I forgot to add nameservers, so Squid was
using the local DNS, which of course fakes some IPs. I added the
dns_nameservers directive and it is fast again now.
root@debian-server:~# cat /proc/version
Linux version 3.2.0-4-amd64 (debian-ker...@l
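For reference, the fix described above corresponds to Squid's dns_nameservers
directive. A minimal sketch (the resolver addresses are illustrative
placeholders, not values from this thread):

```
# Query these resolvers directly instead of the local DNS
# (addresses are example placeholders - substitute your own)
dns_nameservers 8.8.8.8 8.8.4.4
```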
On 12/08/2014 7:57 a.m., nuhll wrote:
> Thanks for your help.
>
> But I'm going crazy. =)
>
> The internet is painfully slow. I don't see any errors in the logs, and some
> services (Battle.net) are not working.
>
> /etc/squid3/squid.conf
> debug_options ALL,1 33,2
> acl domains_cache dstdomain "/etc/squid/lists/domains_cache"
Thanks for your help.
But I'm going crazy. =)
The internet is painfully slow. I don't see any errors in the logs, and some
services (Battle.net) are not working.
/etc/squid3/squid.conf
debug_options ALL,1 33,2
acl domains_cache dstdomain "/etc/squid/lists/domains_cache"
cache allow domains_cache
acl localnet
Well, English is not my native language either, but that does not hurt much :)
1. Define an access list (a text file with the domains you want to cache, one
domain per line):
acl domains_cache dstdomain "/etc/squid/lists/domains_cache.txt"
2. Define a rule that allows caching for these domains, which is done with the
cache directive: cache allow domains_cache
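Putting the two steps together, a minimal squid.conf sketch (the list path is
taken from step 1 above; the final "cache deny all" line is a common pattern,
not something the post itself states):

```
# Step 1: ACL naming the domains to cache, one per line in the file
acl domains_cache dstdomain "/etc/squid/lists/domains_cache.txt"

# Step 2: store responses only for matching domains
cache allow domains_cache
cache deny all
```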
Thanks for your answer.
I'll try to get it working, but I'm not sure how. I don't understand this "acl"
system. I know there are a lot of tutorials out there, but none in my mother
tongue, so I'm not able to fully understand such expert things.
Could you maybe show me at least one example of how to get it working?
Piece of cake:
always_direct deny acl_not_direct
always_direct allow all
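In context, that advice might look like the sketch below. The acl_not_direct
name comes from the lines above, but its definition and the file path are
assumptions; note also that always_direct only changes behaviour when
cache_peer parents are configured.

```
# Hypothetical ACL of sites that should NOT go direct
# (i.e. should pass through the cache hierarchy)
acl acl_not_direct dstdomain "/etc/squid/lists/cache_these.txt"

# Listed sites may use a parent cache; everything else goes direct
always_direct deny acl_not_direct
always_direct allow all
```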
On 05.08.2014 23:19, nuhll wrote:
Thanks, but it's not possible to make a list of all the websites I might
visit but don't want to cache xD.
Is there no way to send ALL websites direct EXCEPT only some websites?
Thanks, but it's not possible to make a list of all the websites I might
visit but don't want to cache xD.
Is there no way to send ALL websites direct EXCEPT only some websites?
You should create an access list with the sites that you don't want to cache,
like:
always_direct allow acl_direct_sites
always_direct allow all will make ALL requests go directly, bypassing the cache.
Also see the cache deny directive.
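A sketch of that suggestion, assuming acl_direct_sites is defined as a
dstdomain list (the file path is an assumption):

```
# Sites whose requests should bypass any parent cache
acl acl_direct_sites dstdomain "/etc/squid/lists/direct_sites.txt"
always_direct allow acl_direct_sites

# Optionally also stop Squid from storing their responses
cache deny acl_direct_sites
```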
On 04.08.2014 22:25, nuhll wrote:
always_direct allow all
and then
always_direct allow all
and then my other code, or do I need to add it before?
always_direct directive
On 04.08.2014 22:15, nuhll wrote:
Hello,
you are right, I don't mean a redirect like a 301.
I mean Squid should not touch the website or connection and should just send it
directly to the website, except for some websites which I want to cache.
How do I achieve this?
Hello,
you are right, I don't mean a redirect like a 301.
I mean Squid should not touch the website or connection and should just send it
directly to the website, except for some websites which I want to cache.
How do I achieve this?
On 3/08/2014 9:25 p.m., nuhll wrote:
> Seems like "acl all src all" fixed it. Thanks!
>
> One problem is left: is it possible to only cache certain websites, while the
> rest should just be redirected?
The "cache" directive is used to tell Squid any transactions to be
denied storage (deny matches). The re
Seems like "acl all src all" fixed it. Thanks!
One problem is left: is it possible to only cache certain websites, while the
rest should just be redirected?
On 3/08/2014 3:07 a.m., nuhll wrote:
> I'm not able to fix it.
>
> Normal websites work, but I can't get it to cache (or even allow access to)
> Windows Update or Kaspersky.
>
> What am I doing wrong?
>
> 2014/08/02 17:05:35| The request GET
> http://dnl-16.geo.kaspersky.com/updaters/updater.xml is DENIED, because it
I'm not able to fix it.
Normal websites work, but I can't get it to cache (or even allow access to)
Windows Update or Kaspersky.
What am I doing wrong?
2014/08/02 17:05:35| The request GET
http://dnl-16.geo.kaspersky.com/updaters/updater.xml is DENIED, because it
matched 'localhost'
2014/08/02 17:
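The fix that worked earlier in this thread was redefining the all ACL
("acl all src all"). For reference, a sketch of stock-style access rules that
avoid this kind of surprising deny (the localnet range is an assumption;
adjust it to your LAN):

```
# Standard-style ACL definitions
acl all src all
acl localhost src 127.0.0.1/32
acl localnet src 192.168.0.0/16   # assumed LAN range

# Order matters: allow rules must come before the final deny
http_access allow localhost
http_access allow localnet
http_access deny all
```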