Re: virtual hosting methods

2001-11-25 Thread Martin 'pisi' Paljak
As of 1.3.22 it reads everything, .file and file~ :( Easy to fix but ain't got no time nor interest. -- Martin 'pisi' Paljak / freelancer consultant [EMAIL PROTECTED] / pisi.pisitek.com www.pisitek.com On 24 Nov 2001, Karl M. Hegbloom wrote: Frank == Frank Louwers [EMAIL PROTECTED] writes:
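
(For orientation, a hedged guess at what "it" refers to: pulling a whole directory of per-vhost config fragments in with Include, where Apache 1.3.22 parses every file it finds, including dotfiles and editor backups such as vhost.conf~. The path below is hypothetical, not from the message.)

    # httpd.conf -- one fragment per customer, all pulled in at once
    Include /etc/apache/vhosts.d/
    # Problem: a stray .customer.conf or customer.conf~ left behind by an
    # editor gets parsed too, so stale or broken directives can go live.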

Re: virtual hosting method

2001-11-25 Thread Martin 'pisi' Paljak
OK, I'll write a patch... you'll get it within an hour or so... regards, -- Martin 'pisi' Paljak / freelancer consultant [EMAIL PROTECTED] / pisi.pisitek.com www.pisitek.com On Sun, 25 Nov 2001, Martin 'pisi' Paljak wrote: As of 1.3.22 it reads everything .file and file~ :( Easy to fix but

Re: rogue Chinese crawler

2001-11-25 Thread Chris Wagner
The best way would be to block it at your router with an access list. Blocking it at the box is OK too, but that takes a little bit of your resources, and you have to do it on each box on your network that you want protected. The router block will protect your entire network in one fell swoop and
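
A minimal sketch of the two options being weighed; the 203.0.113.0/24 block is a documentation-range placeholder, not the crawler's real addresses, which would have to be read out of the access logs first.

    ! At the router (Cisco IOS style) -- one rule protects the whole network
    access-list 110 deny   ip 203.0.113.0 0.0.0.255 any
    access-list 110 permit ip any any

    # Per-host alternative (Linux iptables) -- has to be repeated on every box
    iptables -A INPUT -s 203.0.113.0/24 -p tcp --dport 80 -j DROP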

Re: virtual hosting methods

2001-11-25 Thread Gavin Hamill
On Sat, Nov 24, 2001 at 06:44:02PM -0500, Kevin J. Menard, Jr. wrote: MpP For simple masshosting I still suggest mod_vhost. Which brings me back to my original question. For simple masshosting, I would agree. But what about a system where some vhosts have CGI or SSI access for example,
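
For reference, the "simple masshosting" setup presumably means mod_vhost_alias, where a couple of lines serve every domain out of a directory named after the host; this is a sketch under that assumption, with hypothetical paths, not config taken from the thread.

    # Catch-all mass hosting with mod_vhost_alias (Apache 1.3)
    UseCanonicalName Off
    VirtualDocumentRoot /www/%0/htdocs

The catch is exactly the question above: per-vhost options such as CGI, SSI or a separate suexec user can't be switched on for only some of the hosts served by the catch-all, so those sites normally end up needing their own explicit <VirtualHost> block again.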

Re: virtual hosting methods

2001-11-25 Thread Mark Aitchison
Gavin Hamill wrote: This is my biggest problem and a significant security hole :/ I have a directory /www containing all the vhosting directories, named domain.com, etc. The entire directory tree is owned by a user called virtual, and everyone has CGI, PHP and SSI access. In this way
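
The hole being described is that every customer's CGI runs as the one shared "virtual" user, so any script can read or overwrite any other site's files. The usual fix in Apache 1.3 is suexec, giving each vhost's scripts its own UID; a sketch assuming suexec is compiled in, with made-up user names.

    <VirtualHost *>
        ServerName   customer-a.example
        DocumentRoot /www/customer-a.example
        # With suexec enabled, CGI and SSI #exec run as this user/group
        # instead of the shared "virtual" account.
        User  custa
        Group custa
    </VirtualHost>

Note that mod_php runs inside the server and is not covered by suexec; PHP would have to run as a CGI to get the same separation.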

Re: rogue Chinese crawler

2001-11-25 Thread Martin WHEELER
OK, I've now been 24 hours without a hit, so I'm presuming I've got rid of all the crawlers. Thanks for all the help and advice from both lists. Résumé: - the openfind.com(.tw) 'bots don't respect the norobots conventions, so your robots.txt is
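
Since the bot ignores robots.txt, the usual fallback is to refuse it at the web server by User-Agent, on top of any router or iptables block. A sketch for Apache 1.3 (mod_setenvif + mod_access), assuming the crawler announces itself with an "Openfind"-style User-Agent; the exact string would need to be confirmed from the access logs, and the DocumentRoot path is a placeholder.

    SetEnvIfNoCase User-Agent "openfind" bad_bot
    <Directory "/var/www">
        Order Allow,Deny
        Allow from all
        Deny from env=bad_bot
    </Directory>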

Installing PPP 2.4.0

2001-11-25 Thread Ben Hill
Hi, Sorry for the cross-posting, but I am really stuck! I am currently setting up my Debian machine to connect to my ADSL modem for Internet access. I have had everything working before, but I am having problems this time with the PPP daemon! I am trying to install the ppp-2.4.0 tarball, but
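
The message is cut off before the actual error, but for reference, a typical ppp-2.4.0 build from the tarball looks roughly like the following (a compiler and matching kernel headers are assumed; the exact failure isn't visible here). On Debian, installing the packaged ppp with apt-get is normally easier than building from source.

    tar xzf ppp-2.4.0.tar.gz
    cd ppp-2.4.0
    ./configure      # simple script that selects the Linux makefiles
    make
    make install     # as root; installs pppd and supporting files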
