Maybe the reason it's not caching downloads is that you're using a
download manager that tells squid to bypass the cache?
(I had that problem with DownThemAll and had to search online for a
solution.)
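For reference, one way a download manager defeats caching is by issuing HTTP Range requests, which squid forwards without caching by default. A possible squid.conf workaround (a sketch, not tested against your setup) is to raise range_offset_limit so squid fetches the whole object:

```
# Fetch the entire object even when the client asks only for a byte range,
# so the download becomes cacheable. -1 means no limit; note this can waste
# bandwidth if clients abort large transfers partway through.
range_offset_limit -1
```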
On Wed, Jan 12, 2011 at 1:16 PM, Amos Jeffries wrote:
>
> On 11/01/11 23:58, p...@
I just reformatted my server (some ppp errors) and installed squid. I
restored the cache since it was on a separate partition.
However, now I'm getting errors.
assertion failed: store_swapout.cc:317: "mem->swapout.sio == self"
On this error, squid restarts, which means whatever request it was
caching gets cut off.
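In case anyone hits the same thing: if the restored cache no longer matches squid's swap index (an assumption about the cause here), the usual recovery is to wipe the cache_dir and let squid rebuild it:

```
# stop squid, clear the restored cache, recreate the swap directories
squid -k shutdown
rm -rf /var/spool/squid3/*     # path is an assumption; use your cache_dir
squid -z                       # rebuild the cache directory structure
squid
```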
Also, recursion with wget can be limited. You'll notice that both
Flaviane and I used the '-l 1' switch; we both thought it's not
worth going more than one page down from any already-cached pages.
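For the record, that kind of one-level crawl looks roughly like this (a sketch; the proxy address, the helper name `warm_cache_cmd`, and the exact flags are my assumptions about the setup):

```shell
#!/bin/bash
# Sketch: build a wget command that warms the squid cache by crawling one
# level deep through the proxy. The proxy address is an assumed default.
warm_cache_cmd() {
    local proxy="${2:-http://127.0.0.1:3128}"
    # -r -l 1       : recurse, but only one level down from the start page
    # --delete-after: keep objects only in squid's cache, not on local disk
    echo wget -r -l 1 --delete-after \
         -e use_proxy=yes -e "http_proxy=$proxy" "$1"
}

# Example: prints the command instead of running it (drop the 'echo' to run)
warm_cache_cmd "http://example.com/"
```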
On Wed, Oct 6, 2010 at 12:27 PM, John Doe wrote:
>
> From: Isa
How would you do it?
With wget, the only way to have it crawl through websites is to
recurse... isn't it?
I tried experimenting, and the best I came up with was this:
>#!/bin/bash
>log="/var/log/squid3/access.log"
>
>while (true); do
>echo "reading started: `date`, log file: $log"
>s
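In case it helps, here is a minimal sketch of the kind of loop I was going for, assuming squid's default native log format (field 4 is the code/status, field 7 is the URL); the MISS filter, the helper name, and the paths are my assumptions:

```shell
#!/bin/bash
# Sketch: pull the URLs of cache misses out of a squid access.log in the
# native format: time elapsed client code/status bytes method URL ...
extract_miss_urls() {
    awk '$4 ~ /MISS/ {print $7}' "$1"
}

log="${1:-/var/log/squid3/access.log}"
# extract_miss_urls "$log"   # uncomment to run against a real log
```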
On Wed, Jul 21, 2010 at 4:57 PM, Marcus Kool wrote:
> yes.
> 1) the index is in memory and needs 10-20 MB index in memory for each GB on
> disk
I was under the impression (from the O'Reilly squid book) that recent
versions do not use up extra RAM with bigger caches.
But maybe I read it wrong?
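For scale, Marcus's 10-20 MB-per-GB figure works out like this (the 100 GB cache size is a made-up example):

```shell
#!/bin/bash
# Sketch: estimate index RAM from the 10-20 MB-per-GB rule of thumb.
cache_gb=100   # hypothetical cache_dir size in GB
echo "index RAM, low estimate:  $((cache_gb * 10)) MB"
echo "index RAM, high estimate: $((cache_gb * 20)) MB"
```

So a 100 GB cache would want roughly 1-2 GB of RAM just for the index.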
On Wed, May 26, 2010 at 3:24 PM, Amos Jeffries wrote:
> Isaac Witmer wrote:
>>
>> I'm getting an error when I attempt to view the website:
>> http://cpanel.byethost.com/index.php
>>
>> ERROR
>> The requested URL could not be retrieved
I'm getting an error when I attempt to view the website:
http://cpanel.byethost.com/index.php
ERROR
The requested URL could not be retrieved
While trying to retrieve the URL: http://cpanel.byethost.com/index.php
The following error was encountered:
Access Denied.
Access control configuration prevents your request from being allowed
at this time.
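An Access Denied like this usually means no http_access rule matched the client before the final deny. A minimal squid.conf sketch of the usual shape (the subnet is an assumption; use your own):

```
acl localnet src 192.168.0.0/16   # assumed client subnet; adjust to yours
http_access allow localnet
http_access allow localhost
http_access deny all              # keep this last
```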
On 02/05/2010 01:11 PM, Amos Jeffries wrote:
> Isaac Witmer wrote:
>> On 02/04/2010 03:41 PM, Amos Jeffries wrote:
>>> Isaac Witmer wrote:
>>>> Sorry, I did a bad job of explaining.
>>>> I had SquidGuard as a url_rewrite_program redirecting all Ubunt
I don't understand it entirely, but it seems adding a cache deny rule for
"localnet" (which was already defined for my local area network) also
helped to blacklist the 10.42.43.1 IP address.
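Concretely, the kind of rule I mean looks something like this (the acl name 'aptproxy' and matching on the single destination address are my assumptions):

```
acl aptproxy dst 10.42.43.1   # the apt-proxy address mentioned above
cache deny aptproxy           # still serve these requests, but never cache them
```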
-Isaac
On 02/02/2010 01:43 PM, Isaac Witmer wrote:
> I'm trying to run squid alongside apt-proxy.
I'm trying to run squid alongside apt-proxy. To keep things cleaner, I'm
trying to keep squid from caching apt-proxy requests, or basic Ubuntu
repositories.
I added this code to my squid.conf file:
acl ubuntu_repo dstdomain archive.ubuntu.com archive.canonical.com
security.ubuntu.com ke.archive.ub
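For completeness, the acl line alone doesn't change caching behaviour; the piece that actually stops squid caching those domains is a cache deny line referencing it, something like this (repeating only the domains legible in the snippet above):

```
acl ubuntu_repo dstdomain archive.ubuntu.com archive.canonical.com security.ubuntu.com
cache deny ubuntu_repo
```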