[SLUG] remote mail system

2008-10-22 Thread Jonathan
Hi All, I have quite a neat system set up with my email. Fetchmail downloads it via POP3 from my ISP (evil Telstra). Dovecot IMAP along with SquirrelMail allows me to access my email (nicely filtered and sorted) over the net. Problem: I can't send email using this system. I have to use a work around
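For reference, a minimal ~/.fetchmailrc for this kind of setup might look like the sketch below; the server name, credentials and the procmail delivery step are placeholders rather than anything taken from the thread.

set daemon 300                        # poll every five minutes
poll pop.example-isp.net.au proto pop3
    user "jonathan" pass "secret"
    mda "/usr/bin/procmail -d %T"     # hand each message to procmail, which files it into the Maildir Dovecot serves
    keep                              # optionally leave copies on the ISP's server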

Re: [SLUG] remote mail system

2008-10-22 Thread Robert Thorsby
On 22/10/08 18:17:21, Jonathan wrote: > Problem: > I can't send email using this system. I have to use > a work around from KMail (set up like I had it before > all this, just simple POP3 client). This means I can't > send emails from my computer over the net. I think > port 25 is blocked, so it ne
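One quick way to test the port 25 theory from the machine in question (the hostname below is only a placeholder for the ISP's mail server):

telnet mail.example-isp.net.au 25     # if this hangs or is refused, outbound port 25 is probably filtered
telnet mail.example-isp.net.au 587    # the submission port is often left open, so it is worth trying next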

Re: [SLUG] remote mail system

2008-10-22 Thread Adelle Hartley
Jonathan wrote: Hi All, I have quite a neat system set up with my email. Fetchmail downloads it via POP3 from my ISP (evil Telstra). Dovecot IMAP along with SquirrelMail allows me to access my email (nicely filtered and sorted) over the net. Problem: I can't send email using this system. I have

Re: [SLUG] remote mail system

2008-10-22 Thread Kyle
The thing is: these days, with so much spam floating around, using your own mail server to "send" mail is as much hit and miss as it is trouble to set up and maintain. There are so many orgs that blacklist an IP for no good reason that a small organisation has very little chance of getting itself

Re: [SLUG] remote mail system

2008-10-22 Thread Andrew Cowie
On Wed, 2008-10-22 at 21:19 +1100, Kyle wrote: > Alternatively set your clients up to use your mail server for SMTP, then > have your mail server relay through your ISP's SMTP server. Nothing wrong with doing it this way. Hey, email once was a store and forward protocol. :) AfC Sydney
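If the MTA on the home box happens to be Postfix (the thread never says which MTA is in use), the relay-through-the-ISP setup is roughly a one-line change in /etc/postfix/main.cf; the hostname and the SASL lines are placeholders:

# relay all outbound mail through the ISP's smarthost
relayhost = [mail.example-isp.net.au]:25

# if the ISP only accepts authenticated submission on port 587:
# relayhost = [mail.example-isp.net.au]:587
# smtp_sasl_auth_enable = yes
# smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
# smtp_sasl_security_options = noanonymous

A "postfix reload" afterwards picks up the change.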

Re: [SLUG] Fortress .... err Firewall Australia

2008-10-22 Thread bill
YES! AUSTRALIA is the pilot! Sounds like Paypal. Are we so gullible?

[SLUG] Search engine traffic dominates

2008-10-22 Thread Peter Chubb
Hi, I'm a little cheesed off. In the last three months, people have downloaded 9G per month from our website; search engines have downloaded 21G per month. Only Google generated significant traffic through search engine hits (and it downloaded less than the others, too --- around 2G per month,

[SLUG] ls lists numbers, not owner names

2008-10-22 Thread Voytek Eymont
I'm trying to fix my failed clam install, and, just noticed, when I list certain files, I get owner/group not as names, but, as numbers; what is that trying to tell me ?

# ls -al /var/log/clamav
total 188
drwxr-xr-x  2 104  105  4096 Sep  3 02:31 .
drwxr-xr-x 16 root root 4096 Oct 19 04:12 .

Re: [SLUG] Search engine traffic dominates

2008-10-22 Thread Tony Sceats
Can't you use robots.txt (or the modern equiv, is there anything newer actually?) to stop mass indexing, perhaps point it to pages you want indexed and also tell it to exclude images etc etc? On Thu, Oct 23, 2008 at 10:45 AM, Peter Chubb <[EMAIL PROTECTED]> wrote: > > Hi, > I'm a little cheesed o
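A sketch of what Tony describes, with made-up paths and a made-up bot name:

# /robots.txt, served from the web root
User-agent: *
Disallow: /images/        # keep well-behaved crawlers away from the heavy static content
Disallow: /downloads/

User-agent: SomeGreedyBot
Disallow: /               # shut one particular misbehaving crawler out entirely

Worth remembering that robots.txt is purely advisory; badly behaved crawlers simply ignore it.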

Re: [SLUG] ls lists numbers, not owner names

2008-10-22 Thread DaZZa
On Thu, Oct 23, 2008 at 10:48 AM, Voytek Eymont <[EMAIL PROTECTED]> wrote: > I'm trying to fix my failed clam install, and, just noticed, when I list > certain files, I get owner/group not as names, but, as numbers; > > what is that trying to tell me ? > > # ls -al /var/log/clamav > total 188 > dr

Re: [SLUG] ls lists numbers, not owner names

2008-10-22 Thread Daniel Pittman
"Voytek Eymont" <[EMAIL PROTECTED]> writes: > I'm trying to fix my failed clam install, and, just noticed, when I list > certain files, I get owner/group not as names, but, as numbers; > > what is that trying to tell me ? The entries in /etc/passwd and /etc/group[1] that map the UID 104 and the G

Re: [SLUG] Search engine traffic dominates

2008-10-22 Thread Rev Simon Rumble
This one time, at band camp, Peter Chubb wrote: > I'm a little cheesed off. In the last three months, people have > downloaded 9G per month from our website; search engines have > downloaded 21G per month. Only Google generated significant traffic > through search engine hits (and it downloaded

Re: [SLUG] Search engine traffic dominates

2008-10-22 Thread Mary Gardiner
On Thu, Oct 23, 2008, Tony Sceats wrote: > Can't you use robots.txt (or the modern equiv, is there anything newer > actually?) to stop mass indexing, perhaps point it to pages you want indexed > and also tell it to exclude images etc etc? As I understand it, robots.txt is still the way to do this.

Re: [SLUG] Search engine traffic dominates

2008-10-22 Thread Mary Gardiner
On Thu, Oct 23, 2008, Rev Simon Rumble wrote: > You might want to look into the Crawl-delay extension to the robots.txt > standard, which can limit by robot: > http://en.wikipedia.org/wiki/Robots.txt#Crawl-delay_directive There's also the Sitemaps protocol, in which you can suggest how frequently
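As a concrete illustration of both suggestions (the URL and the numbers are made up), robots.txt grows a couple of lines and points at a sitemap:

User-agent: *
Crawl-delay: 10          # seconds between requests; honoured by some crawlers, ignored by others
Sitemap: http://www.example.org/sitemap.xml

and each <url> entry inside that sitemap can hint at how often a page changes:

<url>
  <loc>http://www.example.org/papers/</loc>
  <changefreq>monthly</changefreq>
  <priority>0.5</priority>
</url>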

Re: [SLUG] ls lists numbers, not owner names

2008-10-22 Thread Voytek Eymont
On Thu, October 23, 2008 10:55 am, DaZZa wrote:
>> -rw-r--r-- 1 104 105 0 Jul 27 04:12 freshclam.log
> The username associated with the UID which created/owned those files
> is no longer listed in /etc/passwd. Nor the group in /etc/group.
> clamav:x:104:106:User for clamav:/var/run/hal:/b

Re: [SLUG] ls lists numbers, not owner names

2008-10-22 Thread DaZZa
On Thu, Oct 23, 2008 at 1:08 PM, Voytek Eymont <[EMAIL PROTECTED]> wrote: > On Thu, October 23, 2008 10:55 am, DaZZa wrote: > DaZZa, Daniel, > > thanks > > how to fix, can I recreate clam entry with 'mc' editor ? > or do I need to 'adduser' ? Easiest way is to just use useradd. Editing /etc/passwd
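A sketch of the useradd route, reusing the numeric IDs already on the files (the home directory and shell below are assumptions, so check what the clamav package on that system expects):

groupadd -g 105 clamav
useradd -r -u 104 -g 105 -d /var/lib/clamav -s /sbin/nologin clamav
ls -al /var/log/clamav    # the numeric owner/group should now resolve to names again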

Re: [SLUG] Search engine traffic dominates

2008-10-22 Thread Peter Chubb
> "Mary" == Mary Gardiner <[EMAIL PROTECTED]> writes: Mary> On Thu, Oct 23, 2008, Tony Sceats wrote: >> Can't you use robots.txt (or the modern equiv, is there anything >> newer actually?) to stop mass indexing, perhaps point it to pages >> you want indexed and also tell it to exclude images e