mod_accel redirects

2003-07-17 Thread Philip Mak
Given this server configuration (this is a front-end lightweight
Apache, which uses mod_accel to proxy to a back-end mod_perl Apache):

ServerName www.shoujoai.com
ServerAlias shoujoai.com
AccelPass / http://127.0.0.1:8002/

and given a file called redir.asp, which contains the following:

<% $Response->Redirect('http://127.0.0.1:8002/mush/'); %>

And given that I issue the following query to the web server:

GET /redir.asp HTTP/1.1
Host: shoujoai.com

and the following response:

Location: http://www.shoujoai.com/mush/

Is there a way to make mod_accel return shoujoai.com instead of
www.shoujoai.com when it rewrites the Location header there? It's
using the default ServerName instead of the Host header that the
client requested. This causes problems when a cookie is set on an
alternative hostname but the web browser gets redirected to the main
hostname. Is there a way to make it use the Host header when rewriting
an internal redirection URL?
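
(One directive that might be relevant, assuming mod_accel builds the
rewritten Location the same way Apache builds other self-referential
URLs: with UseCanonicalName off, Apache 1.3 prefers the client-supplied
Host header over ServerName. Whether mod_accel honors it is exactly
what I don't know:)

UseCanonicalName off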


Why is my mod_perl's @INC different?

2002-12-28 Thread Philip Mak
When I use perl from the command line, my @INC is this:

$ perl -e 'print join(":", @INC)'
/usr/lib/perl5/5.6.1/i386-linux:/usr/lib/perl5/5.6.1:/usr/lib/perl5/site_perl/5.6.1/i386-linux:/usr/lib/perl5/site_perl/5.6.1:/usr/lib/perl5/site_perl/5.6.0:/usr/lib/perl5/site_perl:.

When I print @INC from a mod_perl script, it is this:

/home/mirror/global:/usr/lib/perl5/5.6.1/i686-linux:/usr/lib/perl5/5.6.1:/usr/lib/perl5/site_perl/5.6.1/i686-linux:/usr/lib/perl5/site_perl/5.6.1:/usr/lib/perl5/site_perl:.:/home/mirror/httpd/:/home/mirror/httpd/lib/perl

How did this happen? Why does my command line perl use i386-linux,
while my mod_perl uses i686-linux?

My problem is that any modules I install go into i386-linux, so they're
not accessible to mod_perl. I need to install a new module for my site,
but I can't access it from mod_perl.

I've tried adding use lib qw(/usr/lib/perl5/5.6.1/i386-linux); in
startup.pl, but then Apache does not start at all:

/usr/local/perlhttpd/bin/httpd: relocation error: 
/usr/lib/perl5/5.6.1/i386-linux/auto/Data/Dumper/Dumper.so: undefined symbol: 
perl_get_sv
/home/mirror/bin/apachectl start: httpd could not be started

Any idea what's going on here?
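
(For reference, perl -V:archname prints the configured architecture, so
comparing the command-line perl against the perl that mod_perl was built
with should confirm whether two separate builds are involved:)

$ perl -V:archname
archname='i386-linux';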



Re: Apache 2?

2002-11-30 Thread Philip Mak
On Sat, Nov 30, 2002 at 12:45:50PM -0500, Jason Czerak (Jasnik) wrote:
 Is the 'front end' and 'back end' apache servers on the 'same box'?
 My problme is that I had one web server. and I did the FE and BE bit
 (BE being on the loop back address). to free up some major resources
 since mod_perl apache gets huges. I didn't need 20meg process
 serving up 2K images :) and had about 20 to 30 smaller apache
 process doing the 'static' content serving.

Yes, that's exactly what I do.

 I have found that the memory resource problem doesn't excist with
 2.0 when you compile with 'worker' or fully threaded.  I'm running 2
 processes of apache and each of htem have like 20 threaded.
 performce seems good with just running one apache server.  didn't do
 any real load testing, but I'm sure 2.0 is going to blow 1.3.x away.

Well, there are multiple benefits to running separate frontend and
backend servers:

1. As stated above, the static HTML/GIF/JPG/etc. files don't have to
be served by the heavyweight mod_perl process.

2. If the backend is serving a large file, the frontend can retrieve
the entire file from the backend and free it up immediately, so that a
client with a slow modem will not tie up the backend for the time it
takes to download.

3. If you have different sites (presumably owned by different people)
on your server, all the backend servers can execute with different
userids so that the backend server of one site doesn't have to be able
to read the files of another site. And, everyone can change their own
server configuration.

We know that Apache 2 confers benefit #1 without needing a separate
frontend and backend. Benefit #2 seems to be planned, but isn't here
yet. ...What about benefit #3?



Apache 2?

2002-11-26 Thread Philip Mak
These days, Apache 2 has become the default version of Apache.

On my site, I run a front end Apache and a back end Apache.

Front end: Apache 1.x, has mod_accel module which is like mod_proxy,
but downloads all the data from the backend ASAP and frees it up
immediately, so that a slow modem doesn't tie up the backend

Back end: Apache 1.x with mod_perl

Here's my question:

Is it worth upgrading to Apache 2.x for either the front end or back
end? And does Apache 2.x's mod_proxy free up the backend ASAP now?



Outdated link at http://perl.apache.org/products/products.html

2002-11-26 Thread Philip Mak
I couldn't find a contact address on the modperl website, so I'm
posting this here...

On http://perl.apache.org/products/products.html there is an outdated
link to mwForum. The new URL is: http://www.mwforum.org/



Re: Apache 2?

2002-11-26 Thread Philip Mak
On Tue, Nov 26, 2002 at 11:40:00AM -0800, Grant Cooper wrote:
 What do yo mean a modem will tie up the Server? I've never heard this
 before.

Let's say you have a mod_perl page that returns a 100k document, and a
28.8k modem downloads that document.

The mod_perl process that is serving that document will be tied up
until that modem finishes downloading the document, which is
inefficient since the mod_perl processes take up a lot of memory. A
lightweight front-end proxy that loads the data from the mod_perl
process all at once and then feeds it to the modem would save memory.



Re: Apache 2?

2002-11-26 Thread Philip Mak
On Tue, Nov 26, 2002 at 03:11:47PM -0800, Grant Cooper wrote:
 Is there any documention of a HOWTO or a tutorial about a lightweight
 front-end proxy that loads the data from the mod_perl

I wrote a guide a while back on how to install mod_accel and
mod_deflate with Apache. It's for Apache 1.3.x; I don't know if it
will work with Apache 2.x.
http://www.aaanime.net/pmak/apache/mod_accel/



Re: apache mod_perl + suid question

2002-07-26 Thread Philip Mak

On Fri, Jul 26, 2002 at 06:40:31PM -0400, [EMAIL PROTECTED] wrote:
 1: The usermod command doesn't get executed. I have tried debugging
 this...by having a log file(/usr/local/apache/logs) and the mod_perl
 process does open the wrapper script..but then does nothing. It does
 not  execute the command. What am I doing wrong ?

Try '/usr/sbin/usermod' instead of 'usermod'. It may be a path issue.

Also, 'usermod' might have to be run interactively (rather than
reading from standard input), so you may have to create a virtual
terminal in order to interface with usermod. (I might be wrong on
this, and I can't elaborate further.)
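
A minimal sketch of what I mean by the path fix, with error checking
(the group and username here are made up; adjust for your setup):

    # call usermod with an absolute path and check the exit status;
    # the web server's environment usually has a very short PATH
    my @cmd = ('/usr/sbin/usermod', '-G', 'somegroup', $username);
    system(@cmd) == 0
        or die 'usermod failed with exit status ' . ($? >> 8);

Passing a list to system() also avoids the shell entirely.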



RewriteRule and AccelPass conflict

2002-06-24 Thread Philip Mak

I'm trying to add a RewriteRule, but it's not working:

RewriteEngine on
RewriteCond %{HTTP_HOST} !^www.animewallpapers.com(:80)?$
RewriteCond %{HTTP_HOST} !^64.246.28.97(:80)?$
RewriteRule ^/(.*) http://www.animewallpapers.com/$1 [L,R]

I want to make it so that if someone accesses that website via any
hostname other than www.animewallpapers.com or 64.246.28.97, then it
will redirect them to http://www.animewallpapers.com/. I copied those
rules exactly from another httpd.conf where it works.

However, the directives were being ignored. I realized the problem is
probably because of this line:

AccelPass / http://127.0.0.1:8010/

The site has a mod_perl backend running on port 8010. Apache is
proxying those connections to port 8010 before it has a chance to
invoke the RewriteRule above.

I thought of putting the RewriteRule on the backend mod_perl, but I
don't think it will work there because the HTTP_HOST would have been
changed to 127.0.0.1.

Any suggestions on how I can get the RewriteRule to take precedence
over the AccelPass?
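
(One idea I haven't tried: it has been reported that mod_accel can also
be invoked through mod_rewrite's [P] flag, like mod_proxy. If so,
dropping the AccelPass and letting mod_rewrite do both jobs in order
might work; this is untested:)

RewriteEngine on
RewriteCond %{HTTP_HOST} !^www.animewallpapers.com(:80)?$
RewriteCond %{HTTP_HOST} !^64.246.28.97(:80)?$
RewriteRule ^/(.*) http://www.animewallpapers.com/$1 [L,R]
# anything that was not redirected gets proxied to the backend
RewriteRule ^/(.*) http://127.0.0.1:8010/$1 [L,P]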



Re: RewriteRule and AccelPass conflict

2002-06-24 Thread Philip Mak

On Mon, Jun 24, 2002 at 06:12:04PM -0400, Robert Landrum wrote:
 I would think that you would need something like the following. 
 
 <Location />
 SetHandler rewrite accel
 # rewrite rules and accel rules
 </Location>
 
 Or something like that...  Your goal is to stack these handlers, so that
 rewrite happens first and accel second.

That doesn't work:

$ apachectl configtest
Syntax error on line 1011 of /usr/local/apache/conf/httpd.conf:
SetHandler takes one argument, a handler name



[OT] Re: Apache Web Server vulnerability

2002-06-21 Thread Philip Mak

On Fri, Jun 21, 2002 at 05:31:00AM -0700, Ask Bjoern Hansen wrote:
 64bit binaries are exploitable.  There are also exploits for several
 32bit systems.

Does anyone know if Red Hat Linux 7.2 on i686 is vulnerable to the
remote shell (not the DoS) exploit?



Re: Scripts and passwd

2002-05-19 Thread Philip Mak

On Sun, May 19, 2002 at 03:56:43AM -0500, [EMAIL PROTECTED] wrote:
 As for risky. Well the whole point of the script system is to add a pop mail
 box for a user. But in order to do this i have to do the following:
 
 add user to the passwd/shadow file
 add user to the virtusertable and genericstable
 recompile the sendmail config files
 
 Then and only then is the new mailbox ready for use. This is the only way I
 can think of to accomplish this via an automated web proccess. I dont even
 know if you can do it any other way with out touching the passwd/shadow
 files?

If all you want to do is give out POP3 mailboxes, you can accomplish
this by doing something at the MTA (Mail Transport Agent, aka mail
server) level.

For example, installing qmail (http://www.lifewithqmail.org/lwq.html)
with qmail-pop3d [note: qmail replaces sendmail] and VMailMgr
(http://www.vmailmgr.org/). Under this configuration, adding a new
POP3 mailbox would involve just changing files owned by a normal user
of the system (instead of root).

Advantages of my solution:
- Increased security. Everything in your mailbox system would be owned
  by an unprivileged user of the system rather than root.
- qmail/Maildir is generally higher performance than sendmail/mbox.

Disadvantages of my solution:
- You have to replace sendmail with qmail and relearn some stuff.
  Be prepared to spend a few hours figuring stuff out.

 You could migrate to a database based mail authentication solution.
 Postfix+cyrus springs to mind.

The above is also a valid way to do it, with similar advantages and
disadvantages as my solution. (Postfix replaces sendmail.)



How to cancel AccelNoPass in mod_accel

2002-04-11 Thread Philip Mak

Does anyone know how I can cancel AccelNoPass in mod_accel?

I have the following configuration in httpd.conf:

AccelNoPass ~*\.cgi$ ~*\.html$

<VirtualHost ...>
ProxyPass / http://localhost:8001/
...
</VirtualHost>

I want to make it so that inside the <VirtualHost> container, ~*\.cgi
will not be in AccelNoPass. (The reason I have ~*\.cgi in AccelNoPass
is that there are some other VirtualHosts that shouldn't have it
passed.)

Is it possible to do this?



mod_deflate problem with chunked encoding

2002-01-17 Thread Philip Mak

The following webpage on a mod_deflate enabled server is not working
correctly in some browsers:

http://www.aaanime.net/pmak/sylphiel/

If I telnet www.aaanime.net 80 and send the following commands:

GET /pmak/sylphiel/ HTTP/1.1
Host: www.aaanime.net
Accept-Encoding: gzip

then the data it sends back is partially gzipped, and partially plain
text! It is sent with "Transfer-Encoding: chunked", and the first 1 or
2 chunks are gzipped, but the rest are not.

The source code for the page is an .shtml file. Inside that .shtml
file, I have the directive <!--#exec cgi="navbar.cgi"-->. Everything
up to and including that directive is gzipped, then the rest is not.

navbar.cgi is a simple perl script that does:
print "Content-type: text/html\n\n";

followed by a few more print statements.

Any idea how to fix this problem? Do I need to provide any additional
information in particular? Thanks.



HTTP file uploads with mod_accel

2002-01-06 Thread Philip Mak

Has anyone been using mod_accel on a website that has HTTP file uploads?

I'm having trouble getting file uploads to work with Internet Explorer 5.5,
Netscape 4.7, or Opera 6 through mod_accel 1.0.10. If I access the backend
Apache directly, it works.

I can upload a 1491 byte file, but I can't upload a 13643 byte file (no
matter which web browser I use). When I try to upload the 13643 byte file
through mod_accel, the browser just keeps acting like it's loading the page
and never finishes.

I'm guessing there might be a buffering problem, but I'm not sure how to go
about finding the cause of the problem and fixing it... any suggestions?



Re: HTTP file uploads with mod_accel

2002-01-06 Thread Philip Mak

On Sun, Jan 06, 2002 at 04:16:00PM +0200, Issac Goldstand wrote:
 I use it with uploads and it all works fine.  What I still haven't 
 tested is the UPLOAD_HOOK functionality of Apache::Request under it, but 
 I'll get around to that shortly.

Hmm, I wonder if it's dependent on the CPAN module used to parse the file
uploads.

I'm using MwfCGI.pm (distributed with mwForum), which is a somewhat
modified version of CGI::Minimal.



Fixed (Re: HTTP file uploads with mod_accel)

2002-01-06 Thread Philip Mak

Never mind, I'm an idiot. I just took a look at the error_log of my
frontend and the problem became clear.

[Sun Jan  6 09:42:04 2002] [error] [client 206.173.36.189] (13)Permission denied: 
accel: can't create tempfile /usr/local/apache/cache/tmpFtYxlf



AccelPass interferes with RedirectPermanent

2002-01-04 Thread Philip Mak

The following configuration:

RedirectPermanent /~arcimpulse http://arcimpulse.shoujoai.com
AccelPass / http://127.0.0.1:8002/

did not work as I expected. Instead of being redirected, /~arcimpulse gets
passed to port 8002 (except for URLs that match AccelNoPass), so I had to
put the RedirectPermanent in the configuration for the Apache on port
8002 as well.

Is this a bug, or a feature?



DirectoryAccelNoPass in mod_accel

2001-12-31 Thread Philip Mak

Is there a way to specify an AccelNoPass directive (from mod_accel) that
only affects a certain directory?

For example, consider the following scenario:

AccelPass /~user1/ http://127.0.0.1:8001/
AccelNoPass ~*\.gif$ ~*\.jpg$

AccelPass /~user2/ http://127.0.0.1:8002/
AccelNoPass ~*\.gif$

Someone might want to specify separate AccelNoPass settings for those two
directories. It doesn't seem to work when I put it in a <Directory>
container, though; I get an "AccelNoPass not allowed here" error.

(I don't actually need this functionality at this point and I think it's
an obscure case, but I felt it was worth pointing out.)




Re: mod_accel reverse proxying?

2001-12-28 Thread Philip Mak

On Fri, 28 Dec 2001, Igor Sysoev wrote:

 Yes, it doesn't. It's difficult to figure proxied URL parts in mod_rewrite
 so I have to make explicit directive to specify reverse rewrite.
 I will make it today or tomorrow.

Great!

 I think it should have reverse syntax:

 AccelReverse  http://127.0.0.1:8001/   /

 Or not ? Of course it complicates porting from mod_proxy to mod_accel
 but I think it's clearer then ProxyPassReverse syntax.

I don't think either order is clearer than the other, but since
ProxyPassReverse has it as "/ http://127.0.0.1:8001/", my personal opinion
is that AccelPassReverse should use the same order, to avoid
confusion.




mod_accel reverse proxying?

2001-12-27 Thread Philip Mak

Does mod_accel have a reverse proxying directive (similar to the
ProxyPassReverse directive in mod_proxy) in order to make redirects work?

I believe the AccelPass directive automatically handles reverse
proxying, but what if I used RewriteRule instead:

RewriteRule ^(.*)\.asp$ http://127.0.0.1:8001/$1.asp [L,P]

That does not set up reverse proxying for me...




Re: [modperl site design challenge] and the winner is...

2001-12-23 Thread Philip Mak

I took a look at the winning design at
http://domm.zsi.at/modperl-site-domm/ and I see a significant problem for
people in 800x600 resolution (which is pretty common still, and I use it):

A horizontal scrollbar appears at the bottom of the screen.

Can't this be fixed? Also, the left sidebar seems to be wider than it
needs to be. The logo is 150 pixels, so I think it only has to be that
wide.

I have attached a screenshot from 800x600 resolution in Opera 6. (The same
thing happens in Internet Explorer 5.5.)

I took a look at the style sheet
(http://domm.zsi.at/modperl-site-domm/style.css) and changed a few numbers
such that the left sidebar takes up less space, and made it so that the
page is allowed to compress smaller such that it'll even fit in 640x480
(about 10% of internet users have that resolution last I checked, although
this proportion may be smaller for programmers).

I attached the changed style sheet; I think it's better this way. You can
see how it looks online at http://sg1.indexthis.net/~pmak/modperl.html.
Under smaller resolutions the horizontal scrollbar is gone, but it'll look
pretty much the same on 1024x768.

BTW, kudos to the designer on making that page without using <TABLE> tags
(which prevent incremental rendering)! I've tried to figure out how to do
that myself before but didn't manage to do so for pages this complicated.
(If the person viewing the page doesn't have stylesheet support though,
the sidebar will show up at the bottom of the page... Do we have any
statistics on what percentage of people viewing the mod_perl website have
user agents that don't do stylesheets, or Netscape with JavaScript off?)



modperl.gif
Description: Screenshot from 800x600

body {	font-family: helvetica, verdana, sans-serif; 
	font-size:small;
	color: #000000; 
	background-color: #ffffff;
}

h1 { 
	padding:2px;
	background-color: #828DA6;
	color:#ffffff;
}  

a:link { color:#ffffff;	font-family: helvetica, verdana, sans-serif;}
a:visited {color:#ffffff;	font-family: helvetica, verdana, sans-serif; }
a:active {color:#ffffff;	font-family: helvetica, verdana, sans-serif; }
a:hover {color:#ffffff;	font-family: helvetica, verdana, sans-serif;}

pre { 
 	font-family: courier new, courier, monospace;
  	color: #000000;
}

code { 
 	font-family: courier new, courier, monospace;
}

div.navbar a {text-decoration: none; color:#ffffff;}
div.activenav a {text-decoration: none; color:#ffffff;}
div.navbarglobal a {text-decoration: none; color:#ffffff;}
div.notactivenav a {text-decoration: none;	color:#525D76;}
div.toc a {text-decoration: none;color:#ffffff;}

div.leftcont {
	position:absolute;
	top:5px;
	left:5px;
	width:160px;
}

div.content {
	position:absolute;
	top:5px;
	left:175px;
	margin-right:10px;
	padding:5px;
	border:1px;
	border-style:solid;
	border-color:#525D76;
	background-color: #ffffff;
}

div.logo {
	padding:5px;
	border:1px;
	border-style:solid;
	border-color:#525D76;
	background-color: #ffffff;
	text-align:center;
}

div.navbar {
	padding:2px;
	border:0px;
	border-style:solid;
	border-color:#525D76;
	background-color: #ffffff;
}

div.activenav {
	font-weight:bold;
	padding:2px;
	background-color: #525D76;
}

div.notactivenav {
	padding:2px;
	font-weight:bold;
	border:1px;
	border-style:solid;
	border-color:#525D76;
	margin-top:-1px;
}

div.navbarglobal {
	padding:2px;
	background-color: #525D76;
	color:#ffffff;
}

div.navbarlocal {
	padding:2px;
	text-align:center;
}

div.tail {
	padding-top:10px;
	padding:5px;
	border:1px;
	border-style:solid;
	border-color:#525D76;
	background-color: #ffffff;
}

div.ad {
	border:1px;
	padding:5px;
	border-style:solid;
	border-color:#525D76;
	background-color: #ffffff;
}



Re: Report on mod_accel and mod_deflate

2001-12-20 Thread Philip Mak

On Thu, 20 Dec 2001, Jeremy Howard wrote:

 Note that mod_accel can also be called by utilising the mod_rewrite [P]
 directive, just like with mod_proxy.

If I put [P] in a RewriteRule, how does Apache know whether I want it to
use mod_proxy or mod_accel?

 AccelSet* adds X-* headers to the request to the backend. This is useful to
 know what the original request details were.

In ftp://ftp.lexa.ru/pub/apache-rus/contrib/ (where I have been told to
download mod_accel/mod_deflate from before), I see another file called
mod_realip-1.0.tar.gz just released one week ago. From looking at the
keywords in the documentation, it looks like a module to be installed on
the backend httpd that will parse these X-* headers to retrieve the
original IP address.

 By default only text/html is compressed.

I think it's safe to compress text/plain by default, too; I've never seen
any browser problems with compressed text/plain (only text/js and
text/css).




RFC: Security/Performance Best Practices (long)

2001-11-11 Thread Philip Mak

Recently, I've been using Apache::ASP to program a new version of an
existing website that gets over 5 million page views per month. This
website will have to fit on a RaQ4i (450MHz) server, so I'm pretty
conscious about performance. Security is also important due to the
popularity of the site.

I've read various documentation and combined them together into the
following strategy for security and performance on a mod_perl driven
website. I haven't seen these combined strategies formally written up
anywhere, so I thought I would try to do that and ask you guys for
suggestions. This is a bit unorganized right now, but all the general
concepts should be there. The goal is to produce a document that
explains all the principles, and shows all the configuration
directives required to accomplish this.

This website runs off a MySQL database. Although all the webpages are
generated dynamically, they don't change often (unless the webmaster
explicitly updates them).

I set up a lightweight frontend httpd (port 80) that proxies to a
heavyweight mod_perl backend httpd (port 8001). mod_gzip is installed
on the frontend to deliver compressed HTML pages for faster download
times. mod_proxy_add_forward is also installed so that the backend can
log the true IP address of each request.

In my account, I have these directories:

httpd: apachectl, httpd.conf, logs for the mod_perl httpd
perl: DocumentRoot for backend httpd
web: DocumentRoot for frontend httpd
global: contains *.pm, startup.pl, global.asa (for Apache::ASP)

The proxying is configured in the frontend httpd.conf as follows:

1: RewriteEngine On
2: RewriteRule ^/(.+)\.asp$ http://127.0.0.1:8001/$1.asp [L,P]
3: RewriteRule ^/(.+)\.pl$ http://127.0.0.1:8001/$1.pl [L,P]
4: RewriteCond /home/aw/perl%{REQUEST_URI}index.asp -f
5: RewriteRule ^(.*)/$ http://127.0.0.1:8001$1/ [L,P]

Line 2 passes any URL with a .asp extension to the backend.
Line 3 passes any URL with a .pl extension to the backend.
Lines 4-5 pass any request for a directory to the backend, if there
is an index.asp file in that directory.

Notice that to the outside world, the hostname/port of the website is
exactly the same whether it's being served by the frontend or
backend. I prefer this approach since it lets my <img src> tags refer
to images in the same directory, for example. It also doesn't require
an extra DNS lookup on the client end (which it would if the mod_perl
server and non-mod_perl server were on different hostnames).

I don't have a ProxyPassReverse directive since I haven't thought
about it; I wouldn't need it anyway since I don't do any redirecting
(at least not right now), but I'll probably end up adding it just in
case.
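
(If I do add it, it would presumably just mirror the proxy target above:)

ProxyPassReverse / http://127.0.0.1:8001/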

The following users were created on the system:

aw: I login as this user. Group = aw, httpd
aw_guest: mod_perl httpd runs as this user. Group = aw
httpd: lightweight httpd runs as this user. Group = httpd

aw owns all of the files except httpd/logs.

The web directory is world readable. It only contains images that
everyone can get from the web server anyway.

The httpd and global directories are group readable, so only aw
and aw_guest can read them. perl is world readable, but the files
inside are only group readable (this allows the httpd user to tell
what files exist, but nothing more). This protects my source code
(and the database passwords they contain!) from being browsed by
others.

So that I won't accidentally create world readable files, I have this
line in ~/.profile for aw:

umask 027

This creates files as rw-r----- by default. Files I upload by FTP
still default to mode rw-r--r--, but I only upload image files that
way (I use vi through ssh to edit the code) so that's perfect.

There is a level of isolation here; in case I write an insecure script
that gets hacked, the hacker will only gain access to the aw_guest
account. The aw_guest account can read all my site's files, but it
can't write to any of them. Also, the MySQL username/password used by
the website has read-only access to the database.

Apache::ASP is set so that every page has headers indicating that it
can be cached for up to one hour:

  $Response->AddHeader('Last-Modified', time2str(time));
  $Response->{CacheControl} = 'public';
  $Response->{Expires} = 3600;

I could have set the expiry time higher, but I decided to put it at
3600 so that in case I change content on the website and forget to
manually clear the cache, it won't be out of date by more than 1
hour. In terms of performance issues, 1 hour should be long enough
such that the backend httpd server doesn't have to do too much work.

In my frontend httpd server, I have a basic cache configuration:

ProxyRequests on
CacheRoot /home/httpd/cache
CacheSize 1 # cache size of 10 MB
CacheGcInterval 1 # clean up the cache every hour
CacheMaxExpire 24 # nothing lives in the cache for > 24 hours
CacheDefaultExpire 1 # default expiry time is 1 hour

I can force the frontend httpd server to reload a specific page from
the backend by 

ProxyPass and DirectoryIndex

2001-11-09 Thread Philip Mak

On port 80, I'm running a non-mod_perl httpd.
On port 8001, I'm running a mod_perl httpd.

Port 80 is ProxyPassing to port 8001 like this:
RewriteRule ^/(.+)\.asp$ http://127.0.0.1:8001/$1.asp [p]

The httpds have different DocumentRoots however, so if I visit
http://mysite.com/ it will return a directory index rather than calling
the index.asp file.

My current solution is to touch index.asp in the port 80 DocumentRoot
and have DirectoryIndex index.asp so that it knows to ProxyPass those
requests. I'd have to touch index.asp manually for every directory,
though. Is there a better way around this?




Re: ProxyPass and DirectoryIndex

2001-11-09 Thread Philip Mak

  My current solution is to touch index.asp in the port 80 DocumentRoot
  and have DirectoryIndex index.asp so that it knows to ProxyPass those
  requests. I'd have to touch index.asp manually for every directory,
  though. Is there a better way around this?

 RewriteRule ^/$ http://127.0.0.1:8001/ [p]

That would only pass the main directory; it won't take care of this
problem for the subdirectories.

 Why do you use RewriteRule instead of ProxyPass ?
 ProxyPass / http://127.0.0.1:8081/
 Or do you have static files that you don't want to pass to mod_perl ?

I have static files that I don't want to pass to mod_perl.

 RewriteRule ^(.*)/$   http://127.0.0.1:8001$1/index.asp [p]

That looks like it will ProxyPass every directory to the mod_perl enabled
httpd. It would make index.html not work anymore, though. I think the
optimal solution would be to:

- display index.html if it is present in the non-mod_perl web directory
- else display index.asp if it is present in the mod_perl web directory
- else display a directory listing (if Options +Indexes is on)

I'm thinking that this is not possible, at least not without having to
make some really ugly configuration hack?

 You can try with my mod_accel:
 ftp://ftp.lexa.ru/pub/apache-rus/contrib/mod_accel-1.0.6.tar.gz

 AccelCacheRoot  cache
 AccelNoCache    on
 AccelPass       /  http://127.0.0.1:8081/
 AccelNoPass     ~*\.jpg$   ~*\.gif$

Hmm, so that would pass any URL that doesn't end in .jpg or .gif. While
not semantically equivalent to what I'm doing, I think it would actually
work for my case (I just have to specify what extensions NOT to pass).




Re: ApacheBench says my site is unstable?

2001-10-29 Thread Philip Mak

On Mon, 29 Oct 2001, Joshua Chamas wrote:

  Complete requests:  1000
  Failed requests:22
 (Connect: 0, Length: 22, Exceptions: 0)

 If ApacheBench complains about length problems, it means
 that the length of subsequent requests differs from the
 output length of the first request, so dynamic content usually
 screws up ab's response in this way.

I thought about that, but the test script just prints "Hello world" every
time (static length). The failed requests got 0 bytes, according to the
access_log.

I haven't figured out this problem yet, but I'm hoping it's a problem with
ab (after all, it really shouldn't just crash with "Broken pipe" like
that...).




ApacheBench says my site is unstable?

2001-10-27 Thread Philip Mak

I'm using ApacheBench to perform stress testing on my mod_perl server.
It's not always working, though. Observe the following two runs: (the first
ends in "Broken pipe"; the second has some failed requests)

[pmak@sg1 bin]$ ./ab -n 1000 -c 10 http://65.119.108.120:8080/
This is ApacheBench, Version 1.3c $Revision: 1.45 $ apache-1.3
Copyright (c) 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Copyright (c) 1998-2000 The Apache Group, http://www.apache.org/

Benchmarking 65.119.108.120 (be patient)...Broken pipe

[pmak@sg1 bin]$ ./ab -n 1000 -c 10 http://65.119.108.120:8080/
This is ApacheBench, Version 1.3c $Revision: 1.45 $ apache-1.3
Copyright (c) 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Copyright (c) 1998-2000 The Apache Group, http://www.apache.org/

Server Software:Apache/1.3.22
Server Hostname:65.119.108.120
Server Port:8080

Document Path:  /
Document Length:13 bytes

Concurrency Level:  10
Time taken for tests:   21.109 seconds
Complete requests:  1000
Failed requests:22
   (Connect: 0, Length: 22, Exceptions: 0)
Total transferred:  196578 bytes
HTML transferred:   12714 bytes
Requests per second:47.37
Transfer rate:  9.31 kb/s received

Connnection Times (ms)
  min   avg   max
Connect:   99   101   140
Processing: 5   107   364
Total:104   208   504

Whenever I try to load that URL in my browser, it works so I think it has
something to do with the performance of httpd under load:

[pmak@sg1 bin]$ lynx -dump http://65.119.108.120:8080/

   Hello, world!

Looking in my access_log, I see failed requests like this:

66.33.60.115 - - [25/Oct/2001:22:20:43 -0700] GET / HTTP/1.0 200 0 - 
ApacheBench/1.3d

and successful requests like this:

66.33.60.115 - - [27/Oct/2001:21:31:32 -0700] GET / HTTP/1.0 200 13 - 
ApacheBench/1.3c

Does anyone have an idea what's going on? I can't figure out why some
requests seem to return 0 bytes at random, or why ApacheBench crashes with
"Broken pipe". There is nothing in the VirtualHost or the serverwide error
log other than the MaxClients warning.

MaxClients is set to 50, btw.

server1# uname -a
FreeBSD server1.buildreferrals.com 4.2-RELEASE FreeBSD 4.2-RELEASE #0: Fri Oct 12 
13:36:14 PDT 2001
[EMAIL PROTECTED]:/usr/src/sys/compile/LOCAL  i386

Apache version 1.3.22, mod_perl version 1.26

Is there a quirk of FreeBSD that I have to account for, perhaps? This is
my first time setting up Apache on a FreeBSD system (I've always used Red
Hat Linux or SunOS before).




What hourly rate to charge for programming?

2001-10-02 Thread Philip Mak

I've had about two years of experience with perl, and one year of
experience with mod_perl and MySQL.

I've been doing contract programming jobs for people and charged by the
hour. The rate I currently charge them ($40) was kind of chosen randomly.
I'd like to find out if this figure is too high/too low. Does anyone here
have any experiences to share?




Re: Connection Reset on Mandrake Linux 8.0 / Apache 1.3.20 / ModPerl1.26

2001-08-25 Thread Philip Mak

I tried telneting to your web server to see what's going on. Look at this:

$ telnet www.nonserviam.net 80
Trying 65.34.152.103...
Connected to nonserviam.net.
Escape character is '^]'.
GET /modperl/index.pl HTTP/1.1
Host: www.nonserviam.net

Hello!Connection closed by foreign host.

The HTTP server did not return a proper HTTP header, which is why lynx
doesn't like it.

Looking at your mod_perl configuration...

PerlSendHeader Off

Try doing it with PerlSendHeader On. You also might need to add this line
at the beginning of your script:

print "Content-type: text/plain\n\n";

(or text/html)




Why do RaQ4is run mod_perl so slowly?

2001-08-18 Thread Philip Mak

I have a RaQ4i server (450MHz AMD K-6 processor). If I have 20 mod_perl
httpd processes running concurrently, then the system's load average goes
up over 10.0 and CPU usage is 100%. The machine has RAM to spare, so
swapping is not the problem.

Is that the norm for a 450MHz server, or is there something I can do to
make it work better?





What counts as a real DBMS?

2001-08-01 Thread Philip Mak

On Wed, 1 Aug 2001, Henrik Edlund wrote:

 And while we are discussing not cutting corners, those who still use
 MySQL should switch to a real DBMS before they even think of abstracting
 the SQL away from their Perl code.

 That people still use MySQL really shows how many lusers there are with
 computers that try to develop real software. I said _try_.

What would you consider to be a real DBMS? Sybase and Oracle obviously,
but I actually am the hypothetical programmer with a 233MHz machine with
64 MB RAM (hey, it runs emacs fine :/) on a shoestring budget who is
mostly limited to using freeware tools.

What about PostgreSQL and Interbase? Do those have the features of a
'real' DBMS?




require vs. do in modperl

2001-08-01 Thread Philip Mak

I have a CGI application where I do:

require 'db.pl';

where db.pl defines some functions and variables related to connecting to
the database, and then executes C<$dbh = DBI->connect(...)>.

I tried to convert this application to modperl, but I ran into the problem
that require did not execute db.pl again the second time I called the
script, so that the C<$dbh = DBI->connect(...)> line was not executed.

I can get around this by changing C<require> to C<do>, but is that the
correct way of doing things? It seems a waste to redefine all the
subroutines and variables again. But I do need it to reinitialize $dbh
when C<require 'db.pl';> is called.

What should I do?
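
The best I've come up with so far is to keep the definitions behind
require but move the connect into a sub that every script calls
explicitly; with Apache::DBI loaded, the connect is cached anyway. A
sketch (the DSN and credentials are placeholders):

    # db.pl -- pulled in once per process by require
    use DBI ();
    use vars qw($dbh);

    sub init_db {
        # under Apache::DBI this returns a cached, persistent handle
        $dbh = DBI->connect('dbi:mysql:mydb', 'user', 'pass',
                            { RaiseError => 1 });
    }
    1;

    # in each script:
    require 'db.pl';
    init_db();

Is that the idiomatic approach?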




Why can't Apache::Reload work 100% transparently?

2001-07-31 Thread Philip Mak

On Tue, 31 Jul 2001, Kyle Oppenheim wrote:

 Apache::Reload works by performing a stat on every file in %INC and calling
 require for all the files that changed.  It's quite possible that some of
 the files in %INC are using relative paths (often '.' is in @INC).  So, Perl
 was able to load the file originally because the initial 'use' or 'require'
 was after Apache changed to your directory.  However, when Apache::Reload
 goes to look for the file, it can't find it because the current directory is
 different (most likely the ServerRoot).

I've run into this problem with Apache::Reload a couple of times myself.

Isn't there a way that Apache::Reload can be made to work transparently
(in the spirit of making programs Do The Right Thing (tm))?  Perhaps by
overloading the use and require functions to convert pathnames to be
fully qualified before inserting them in %INC?

(I think this would also help with same-named mod_perl scripts from
different VirtualHosts in the same instance of Apache interfering with
each others' execution?)




Re: Using Apache::Reload in development environment?

2001-07-30 Thread Philip Mak

On Mon, 30 Jul 2001, Stas Bekman wrote:

 no (re-)?read the manpage. it's all there.

 It's possible that Matt wants to add other options to the SYNOPSIS
 section, as not everybody bothers to read the manpage. I think people are
 used to see all of the functionality covered in SYNOPSIS.

In perldoc Apache::Reload, the DESCRIPTION has the following sections:

- StatINC Replacement
- Register Modules Implicitly
- Register Modules Explicitly
- Special Touch File

I just re-read it and realized that "StatINC Replacement" is what I
wanted... although it wasn't obvious from just reading the header, since I
didn't know what StatINC was. Re-reading the first paragraph of
DESCRIPTION, I see that it implies that StatINC always checks the
modification time of the file.

I was also somewhat confused as to what PerlSetVar ReloadAll does since
that perldoc page did not have an explicit definition, only examples of it
being used.




RewriteRule Proxy problems

2001-07-30 Thread Philip Mak

I have a front-end lightweight Apache proxying Apache::ASP scripts to a
backend mod_perl Apache. I am experiencing problems with query strings.

In my lightweight httpd.conf, I have:

RewriteRule ^/(.*)\.asp http://66.33.85.239/$1.asp [p]

If I go to http://www.buildreferrals.com/rotatorstats.asp, it gets proxy'd
correctly.

But if I go to http://www.buildreferrals.com/rotatorstats.asp?login=pmak0
(that's the same URL, but with a query string added), then I get a 404
Not Found error. The error log says:

[Sun Jul 29 08:10:11 2001] [error] [client 206.173.59.73] File does not
exist: proxy:http://66.33.85.239/rotatorstats.asp?login=pmak0

Does anyone know what I'm doing wrong? That error message seems strange
because if I paste the http://66.33.85.239/rotatorstats.asp?login=pmak0
URL in my browser it will load. Looking at the logs for 66.33.85.239, it
never even received a request from the frontend server when I got the 404
Not Found.

I've also tried:
RewriteRule ^/(.*\.asp(\?.*)?$) http://66.33.85.239/$1 [p]

but it gives the same error message.




Re: segfault with mod_perl, Oraperl, XML::Parser

2001-07-30 Thread Philip Mak

On Mon, 30 Jul 2001, Scott Kister wrote:

 uselargefiles=define

Have you tried turning off uselargefiles?

I might be off track here, but recently I tried to install mod_perl on
Solaris 5.8. It kept segfaulting until I turned off uselargefiles and
binary compatibility with 5.00503. You could try recompiling perl with
this configure line, then recompiling mod_perl and see what happens:

sh Configure -des -Dcc=gcc -Ubincompat5005 -Uuselargefiles




Re: RewriteRule Proxy problems

2001-07-30 Thread Philip Mak

On Mon, 30 Jul 2001, Perrin Harkins wrote:

  But if I go to http://www.buildreferrals.com/rotatorstats.asp?login=pmak0
  (that's the same URL, but with a query string added), then I get a 404
  Not Found error.

 Of course you do. Your regex ^/(.*)\.asp doesn't match that URL with
 the query string.

Why not? I did not put a $ at the end of the regexp so it should still
match. I've also tried:

RewriteRule ^/(.*)\.asp(.*) http://66.33.85.239/$1.asp$2 [p]

but got the same 404 Not Found error.




Using Apache::Reload in development environment?

2001-07-29 Thread Philip Mak

I have a site running mod_perl that I'm constantly making changes to.

What do I have to do in order to make it so that when I edit any file
(either a .pl script directly called on the site, or a .pm module that my
perl script uses), then the changes will take effect automatically? I
would rather not have to go into each file manually and put use
Apache::Reload.

Do I just put

PerlInitHandler Apache::Reload

in httpd.conf? Is there anything else that I have to do?
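
(From my reading of the manpage, that one line should be enough:
ReloadAll defaults to on, which makes Apache::Reload stat every module
in %INC on each request. So the full setup would just be:)

PerlInitHandler Apache::Reload
PerlSetVar ReloadAll On   # the default

But I'd appreciate confirmation.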




Re: [ANNOUNCE] Hello World Benchmarks, updated

2001-07-11 Thread Philip Mak

One thing caught my eye; how come mod_perl handler (808.4 hits per
second) performed better than HTML static (768.2 hits per second)?

And sorry for my newbie-ish question, but what is the difference
between mod_perl handler and Apache::Registry mod_perl?





Using mod_perl handlers for max speed?

2001-07-11 Thread Philip Mak

In the recent Hello World 2000 benchmark posted by Joshua Chamas, mod_perl
handler was shown to be even faster than static HTML (at least for running
hello world), and twice as fast as using Apache::Registry to run a perl
script.

Does this mean that if there's a heavily used script on my system that
needs to be VERY fast, then it may be worth making it into a mod_perl
handler? What are the caveats of using mod_perl handlers instead of normal
scripts?

For those who didn't see it, here is the code for the Hello World mod_perl
handler program. It is inserted into httpd.conf directly.

<Files ~ "(hello\.bench)">
<Perl>
   # ModPerl Handler
   package Apache::bench;
   sub handler {
     my $r = shift;
     $r->content_type('text/html');
     $r->send_http_header();
     $r->print('Hello ');
     $r->print('World');
     200;
   }
   1;
</Perl>
SetHandler perl-script
PerlHandler Apache::bench
</Files>






Re: Directory Restrictions

2001-06-27 Thread Philip Mak

On Wed, 27 Jun 2001, will trillich wrote:

 okay -- but if you want some of your site to be indexed by the
 standard mod_autoindex, yet have mod_perl intervene for certain
 subtrees, you'll find that mod_perl never gets a chance at it
 because the mod_autoindex gadjets catch it at an earlier stage.
 i think.

How about using RewriteRule? For example, you can do:

RewriteRule /somedir/ index.pl

and then when people visit http://your-site.com/somedir/, it will call
index.pl. index.pl can use $ENV{REQUEST_URI} to determine which directory
to display.




RE: Make Test problems...

2001-06-17 Thread Philip Mak

On Sun, 17 Jun 2001, Ian (the webguy) wrote:

 I did a killall httpd as root then tried it again, but to no evail.

I don't think the problem was that httpd was already running. The test
script is supposed to pick a port that's NOT in use for the purposes of
the test, isn't it?

 still waiting for server to warm up...not ok
 server failed to start! (please examine t/logs/error_log) at t/TEST line 95.
 make: *** [run_tests] Error 9

Do what it says. Examine the file called t/logs/error_log. It should be
somewhere in the directory where you're compiling this stuff.

If you don't understand the contents of the error log, try posting them
here.

-Philip Mak ([EMAIL PROTECTED])




Is ProxyPass the best you can do?

2001-06-16 Thread Philip Mak

I've been thinking about the ProxyPass technique for coping with
mod_perl's high memory usage (setup a non-mod_perl httpd that handles all
requests, but ProxyPasses the mod_perl calls to a mod_perl enabled
Apache).

I find that the complexity of this method is more than it should have to
be. For one thing, ProxyPass only works on a directory. But if you have
images and scripts in the same directory, this is a problem (and it's
convenient to be able to have them in the same directory, so that your
scripts can say <a href="image.jpg"> instead of <a href="/images/image.jpg">,
especially when you have a lot of images in different directories).

Is there a way to ProxyPass by file extension or something?

-Philip Mak ([EMAIL PROTECTED])




Re: Is ProxyPass the best you can do?

2001-06-16 Thread Philip Mak

On Sun, 17 Jun 2001, Martin Redington wrote:

 Squid is the alternative mentioned in the mod_perl_tuning.pod that comes
 with mod_perl.

Can Squid read Apache configuration files? On a new site I'm making
(www.shoujoai.com), I have directives in httpd.conf like this:

RewriteRule ^/fanfics/([a-zA-Z_0-9\-]+)/$ /fanfics/series.asp?series=$1

so that viewing http://www.shoujoai.com/fanfics/*/ actually calls an
Apache::ASP script. But, only by reading the httpd.conf would one be able
to tell that it's a script instead of a normal directory.

 Alternatively, you could try using mod_rewrite, to direct requests for
 scripts to a different apache instance (e.g. running on a separate port
 or ip). I've never tried this, but it should work.

You can use RewriteRule to make it proxy the request to another Apache? I
thought you can only alias a URL to a file, or make it send an HTTP
redirect. How do you make it proxy?
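
(The [P] flag in mod_rewrite appears to be what does this, provided
mod_proxy is compiled in; e.g.:)

RewriteRule ^/(.+)\.asp$ http://127.0.0.1:8001/$1.asp [L,P]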

-Philip Mak ([EMAIL PROTECTED])




Re: mod_perl DSO leaking on restart?

2001-06-14 Thread Philip Mak

On Thu, 14 Jun 2001, Doug MacEachern wrote:

 repeat
 1.21_01 had two dso fixes, one to close all .so's opened by DynaLoader and
 one to call perl_shutdown(), both of which were large leaks.  with
 1.25_01-dev and Perl 5.6.1 i see a 4k growth on the first kill -USR1 and
 no change after that.  what is your perl -V and mod_perl version?
 /repeat

 if people are seeing leaks on restart using Perl 5.005_03 i am not
 surprised, 5.6.1 plugs a great many leaks.

Perl 5.005_03, mod_perl 1.25.

I have since fixed the memory leak problem by recompiling mod_perl so that
it is statically linked to Apache instead of as a DSO.

-Philip Mak ([EMAIL PROTECTED])




[OT] How to write this perl sub w/o variables?

2001-04-29 Thread Philip Mak

Is it possible to rewrite this perl subroutine without using variables?

sub XMLEncode {
my ($line) = @_;
$line =~ s/&/&amp;/g;
$line =~ s/</&lt;/g;
$line =~ s/>/&gt;/g;
return $line;
}

I was thinking something like

sub XMLEncode {
s/&/&amp;/g;
s/</&lt;/g;
s/>/&gt;/g;
return $_;
}

but I can't get it to work like that.
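
The closest working variant I've found copies the argument into a
localized $_ first (still one explicit mention of $_, but no named
lexical):

    sub XMLEncode {
        local $_ = shift;   # give this sub its own copy of $_
        s/&/&amp;/g;
        s/</&lt;/g;
        s/>/&gt;/g;
        return $_;
    }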

-Philip Mak ([EMAIL PROTECTED])




brochureware perl.apache.org?!

2001-04-28 Thread Philip Mak

On Sat, 28 Apr 2001, Drew Taylor wrote:

 I agree 100%. If I might throw my $.02 in, IMHO a part of this marketing
 should be a more brochureware perl.apache.org.

If you guys do redesign perl.apache.org, please, PLEASE take usability
into account. http://www.useit.com/ is a great resource about how to make
a usable website.

I've seen too many corporate websites that load slowly, and it is hard to
find any *useful* information on them. A bad offender that I can think of
off the top of my head is http://www.advertising.com/.

-Philip Mak ([EMAIL PROTECTED])




Re: Apache::ASP extra newline in script output start - killing IEpdf recognition

2001-04-27 Thread Philip Mak

On Fri, 27 Apr 2001, Joel W. Reed wrote:

 <%@ LANGUAGE=PerlScript %>
 <%
   do neat perl things
 %>

Have you tried this:

<%@ LANGUAGE=PerlScript %><%
do neat perl things
%>

-Philip Mak ([EMAIL PROTECTED])




Failed requests in benchmark

2001-04-27 Thread Philip Mak

$ ./ab -n 100 -c 10 http://www.animelyrics.com/;
This is ApacheBench, Version 1.3a
...
Time taken for tests:   7.189 seconds
Complete requests:  100
Failed requests:11
   (Connect: 0, Length: 11, Exceptions: 0)
Total transferred:  671524 bytes
HTML transferred:   646289 bytes
Requests per second:13.91
Transfer rate:  93.41 kb/s received
...

Why is the server returning so many Failed requests at this low load?
(That webpage is generated by Apache::ASP which connects to MySQL.)

A mod_perl script (no Apache::ASP) that connects to MySQL on the same
server gets 0 failed requests and 18.83 requests per second when I run ab
with -n 1000 -c 50.

-Philip Mak ([EMAIL PROTECTED])




thttpd v.s. boa (Re: ANNOUNCE: mod_perl guide ver. 1.29)

2001-04-27 Thread Philip Mak

On Sat, 28 Apr 2001, Stas Bekman wrote:

 * strategy.pod:

   o added a ref to a light and fast Boa webserver

The strategy guide mentions thttpd, khttpd and Boa. khttpd doesn't look to
be production quality yet (its website says that it can crash the kernel),
so that leaves thttpd and Boa.

Which one would be better to use? Here's what I know so far:

- Someone's reported thttpd using over 100 MB of memory, and suggested
  to switch to Boa instead. (the message is in the thttpd mailing list
  archives somewhere... February 2001 I think)

- thttpd's website shows benchmarks where thttpd handles 720 requests per
  second, while Boa only handles 475.

- thttpd supports chroot and throttling. Boa does not.

-Philip Mak ([EMAIL PROTECTED])




Re: Environment variables in startup.pl

2001-04-27 Thread Philip Mak

On Fri, 27 Apr 2001, Scott Alexander wrote:

 Should this work in a startup.pl file

 my $hostname = $ENV{HOSTNAME} ;

 from the prompt I can write echo $HOSTNAME and get the correct
 hostname of the server.

 But from within startup.pl I don't get it.

The reason echo $HOSTNAME works from the prompt is because /etc/profile
contains the command HOSTNAME=`/bin/hostname`. When you're in a
non-interactive environment, that's not available.

Try this:

my $hostname = `/bin/hostname`;
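
(One caveat: backticks keep the trailing newline, so you probably want
to chomp it:)

chomp(my $hostname = `/bin/hostname`);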

-Philip Mak ([EMAIL PROTECTED])




$dbh->disconnect with Apache::DBI? (was Re: Failed requests in benchmark)

2001-04-27 Thread Philip Mak

On Fri, 27 Apr 2001, Joshua Chamas wrote:

  This is ApacheBench, Version 1.3a
  Failed requests:11
 (Connect: 0, Length: 11, Exceptions: 0)

 My experience with ab is that it needs content to be returned
 of identical length from one request to the next, so if your
 content is dynamic in any way, it may fail.

 If there are any real Apache::ASP errors, they should show up
 in your apache error_log.

Thanks for clearing that up. There are no errors in my
perlhttpd.error_log, and when I changed the test script so that it
displays the exact same content every time, there were no more failed
requests.

I noticed something weird in my database logs, though:

010427 22:36:41  Aborted connection 2544 to db: 'animelyrics' user:
'animel' host: `localhost' (Got an error reading communication packets)
010427 22:37:14  Aborted connection 2546 to db: 'animelyrics' user:
'animel' host: `localhost' (Got an error reading communication packets)
010427 22:54:11  Aborted connection 2601 to db: 'animelyrics' user:
'animel' host: `localhost' (Got an error reading communication packets)

Reading some other mailing list messages suggests that I did not do
$dbh->disconnect() properly. But I'm using Apache::DBI, so should I need
to do that?

-Philip Mak ([EMAIL PROTECTED])





Re: Is this startup.pl ok?

2001-04-26 Thread Philip Mak

On 26 Apr 2001, Dave Hodgkinson wrote:

  As time goes on, these processes' memory usage grows and grows. Right now
  they're 20 MB (uptime 2 days). When I rebooted the machine two days ago,
  they were using 80 MB each (shared memory, though). MaxRequestsPerChild is
  set to 200.

 What operating system?

Apache/1.3.12 (Unix)
mod_perl/1.24
perl 5.005_03
Linux 2.2.14-5.0; Red Hat Linux release 6.2 (Zoot)

 I'd be inclined to stuff a lot more of the generic modules you use
 (CGI, Template Toolkit, URI, Date modules) into startup.pl. The more
 the merrier.

Ok. I didn't think of that.

 If a process starts at 10M and grows to 80M that's 70M per process,
 _unshared_ for sure. Not good.

I thought it was shared, because under top, SHARE was almost as big as
RSS.

-Philip Mak ([EMAIL PROTECTED])




Is this startup.pl ok?

2001-04-25 Thread Philip Mak

My Apache with modperl is acting weird with respect to memory usage.

When it first starts up, each process uses 10 MB of memory.

As time goes on, these processes' memory usage grows and grows. Right now
they're 20 MB (uptime 2 days). When I rebooted the machine two days ago,
they were using 80 MB each (shared memory, though). MaxRequestsPerChild is
set to 200.

Here are my startup.pl files; I was wondering if they were correct? (The
startup.pl in /home/animel is inside a VirtualHost container.)

root@trapezoid [/etc/httpd/conf]# egrep startup.pl *.conf
httpd_modperl.conf:PerlRequire /usr/local/apache/conf/startup.pl
httpd_modperl.conf:PerlRequire /home/animel/www/include/startup.pl
root@trapezoid [/etc/httpd/conf]# cat /usr/local/apache/conf/startup.pl
#!/usr/bin/perl

use strict;

use DBI ();
use DBD::mysql ();

1;
root@trapezoid [/etc/httpd/conf]# cat /home/animel/www/include/startup.pl
#!/usr/bin/perl

# For security reasons, this file is owned by root.

use lib qw(/home/animel/www/include);

1;

-Philip Mak ([EMAIL PROTECTED])




mod_perl DSO leaking on restart?

2001-04-25 Thread Philip Mak

On Thu, 26 Apr 2001, Stas Bekman wrote:

  There is also the strange case of mod_perl leaking memory on graceful
  restarts when compiled as DSO.  But I don't feel like getting into
  this one quite yet.

Hmm. My httpd was using 20 MB. I did apachectl graceful ten times, and
the usage jumped to 24 MB. Then I did apachectl graceful another ten
times, and the usage jumped to 29 MB.

I guess that's the reason (or one of them) that my httpd grows bigger and
bigger as time passes. My mod_perl is a DSO (I run two copies of httpd,
one without mod_perl and one with; I set it up as a DSO since this way I
only need one executable). Should I recompile it statically linked?

-Philip Mak ([EMAIL PROTECTED])




(apache question) Working around MaxClients?

2001-02-22 Thread Philip Mak

Hello,

I have a high traffic website (looks like 200 GB output per month,
something around 10-20 hits per day) hosted on a commercial
service. The service does not limit my bandwidth usage, but they limit the
number of concurrent Apache process that I can have to 41. This causes the
server to delay accepting new connections during peak times.

My account is a "virtual server"; what this means is that I have access to
the Apache httpd.conf files and can restart the Apache daemon, but do not
have the priviledge to bind a program to port 80 (so I can't put thttpd on
port 80).

I was thinking of serving the HTML files from Apache and the JPG files
from thttpd (thttpd uses select() so it always only uses up one process,
no matter how many connections it's handling) on port 8080, but there's
one disadvantage: People who browse my site from behind certain firewalls
can only see port 80.

Does anyone know of a way to configure Apache so that it will pass port 80
traffic onto port 8080 somehow, without having access to modify the
binary? It would have to do this without needing to spawn a child for
every request though. Or is this impossible?

Thanks,

-Philip Mak ([EMAIL PROTECTED])

P.S. Is there a mailing list for general Apache questions somewhere? I
can't seem to find one.




Re: [OT] RE: (apache question) Working around MaxClients?

2001-02-22 Thread Philip Mak

# Doesn't work. Children still get tied up serving requests.
#ProxyPass / http://www.animewallpapers.com:8080/
#ProxyPassReverse / http://www.animewallpapers.com:8080/

That doesn't get me around the limit of 41 Apache processes...

-Philip Mak ([EMAIL PROTECTED])

On Thu, 22 Feb 2001, Stathy Touloumis wrote:

 Why don't you setup apache to do proxying?
 
  I have a high traffic website (looks like 200 GB output per month,
  something around 10-20 hits per day) hosted on a commercial
  service. The service does not limit my bandwidth usage, but they limit the
  number of concurrent Apache process that I can have to 41. This causes the
  server to delay accepting new connections during peak times.
 
  My account is a "virtual server"; what this means is that I have access to
  the Apache httpd.conf files and can restart the Apache daemon, but do not
  have the priviledge to bind a program to port 80 (so I can't put thttpd on
  port 80).
 
  I was thinking of serving the HTML files from Apache and the JPG files
  from thttpd (thttpd uses select() so it always only uses up one process,
  no matter how many connections it's handling) on port 8080, but there's
  one disadvantage: People who browse my site from behind certain firewalls
  can only see port 80.
 
  Does anyone know of a way to configure Apache so that it will pass port 80
  traffic onto port 8080 somehow, without having access to modify the
  binary? It would have to do this without needing to spawn a child for
  every request though. Or is this impossible?




httpd takes 86 MB memory

2001-02-17 Thread Philip Mak

Recently, my machine got an upgrade from 128 MB RAM to 386 MB RAM.

The modperl enabled httpd process used to take up less than 10 MB each.
But now, after the memory upgrade it is suddenly taking up 86 MB. Here is
an excerpt from "top" (sorted by memory usage):

  PID USER PRI  NI  SIZE  RSS SHARE STAT  LIB %CPU %MEM   TIME COMMAND
 3746 tuxedo15   0  139M 128M   964 R   0 95.6 34.1  45:22 wusage
 7257 nobody 0   0  103M  86M 84944 S   0  0.0 22.9   0:00 httpd
 7253 nobody 0   0  102M  86M 85028 S   0  0.0 22.9   0:00 httpd
 7263 nobody 0   0  102M  86M 85032 S   0  0.3 22.9   0:00 httpd

Does anyone have suggestions on how to find out the problem/fix it?

The httpd.conf has not been modified in a month, and the memory upgrade
was done just three days ago. So, AFAIK the only thing on the machine that
has changed is the amount of RAM.

-Philip Mak ([EMAIL PROTECTED])




httpd keeps crashing overnight

2001-01-19 Thread Philip Mak

Hello,

On my machine, I am running two instances of Apache. They both use the
same executable, but different config files; one has "AddModule
mod_perl.c" and the other one doesn't.

I used to only run one instance of Apache with the same executable as I
have now that was mod_perl enabled. Back then, it was stable.

My problem is that the mod_perl httpd is sometimes crashing overnight. In
the last three days, it has mysteriously crashed twice. When I restart it
with "apachectl_modperl start" (apachectl_modperl is just apachectl but
with the config file path set differently), it comes up with no problem,
but I suppose it might crash again in the future.

Examination of the error_log after I restart it only shows:

[Fri Jan 19 04:44:27 2001] [error] [asp] [4941] [WARN] redefinition of
subroutine Apache::ASP::Compiles::_tmp_global_asa::display_footer at
Apache::ASP::Compiles::_tmp_global_asa::_home_sakura_linazel_index_ssixINL,
originally defined at
Apache::ASP::Compiles::_tmp_global_asa::_home_sakura_linazel_reasons_aspxINL
[Fri Jan 19 08:35:08 2001] [warn] pid file
/usr/local/apache/logs/httpd_modperl.pid overwritten -- Unclean shutdown
of previous Apache run?

which doesn't give me any idea why it crashed. I also tried doing a `find
/ -name "core"` but did not find any core files in a directory that seems
to be related to Apache.

Running "uptime" shows that the server has been up all along, and the
non-mod_perl enabled Apache is running fine.

Does anyone know how I can go about tracking the cause of the crash?
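
One avenue, sketched with hedges (the directory is only an example): let
crashing children write core files, then inspect whatever turns up with gdb.

    # httpd.conf: a directory the server user can write core dumps into
    CoreDumpDirectory /usr/local/apache/cores

    # shell, before starting the server: lift the core size limit
    ulimit -c unlimited
    apachectl_modperl start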

Thanks,

-Philip Mak ([EMAIL PROTECTED])




Issuing rollback() for database handle being DESTROY'd

2000-12-28 Thread Philip Mak

I recently noticed some strange error messages in my error_log. This is
from animelyrics.com, which uses Apache::ASP and MySQL.

Issuing rollback() for database handle being DESTROY'd without explicit
disconnect() at (eval 9) line 22.

The above line keeps repeating over and over in the error log. There are
two things that confuse me...

(1) I never use rollback() anywhere in my code... so I'm wondering if
something else could cause a rollback.

(2) The "(eval 9) line 22" message doesn't tell me what file the error is
in. The modperl guide suggested adding the following lines to my perl
startup file so that the messages would show the correct file:

use Carp ();
local $SIG{__WARN__} = \&Carp::cluck;

...but they don't seem to have any effect after I added them and did an
"apachectl restart".

-Philip Mak ([EMAIL PROTECTED])




[OT] Where to download Sablotron for AxKit

2000-12-23 Thread Philip Mak

This is off-topic, but I am having problems downloading Sablotron from its
website (Sablotron is a component that AxKit requires).

On http://www.gingerall.com/charlie-bin/get/webGA/act/download.act the
link for "Sablotron 0.50 - sources" and "Sablotron 0.50 - Linux
binary" redirects to download.gingerall.cz, which is an unknown host.

The Sablotron mailing list also appears to be busted, having an unknown
host problem.

Since several people mentioned AxKit on this list, I thought someone here
might know about Sablotron. Do you know where I can download it from? I
haven't been able to find any mirrors for it.

Thanks,

-Philip Mak ([EMAIL PROTECTED])




Dynamic content that is static

2000-12-22 Thread Philip Mak

Hi everyone,

I have been going over the modperl tuning guide and the suggestions that
people on this list sent me earlier. I've reduced MaxClients to 33 (each
httpd process takes up 3-4% of my memory, so that's how much I can fit
without swapping) so if the web server overloads again, at least it won't
take the machine down with it.

Running a non-modperl apache that proxies to a modperl apache doesn't seem
like it would help much because the vast majority of pages served require
modperl.

I realized something, though: Although the pages on my site are
dynamically generated, they are really static. Their content doesn't
change unless I change the files on the website. (For example,
http://www.animewallpapers.com/wallpapers/ccs.htm depends on header.asp,
footer.asp, series.dat and index.inc. If none of those files change, the
content of ccs.htm remains the same.)

So, it would probably be more efficient if I had a /src directory and a
/html directory. The /src directory could contain my modperl files and a
Makefile that knows the dependencies; when I type "make", it will evaluate
the modperl files and parse them into plain HTML files in the /html
directory.

Does anyone have any suggestions on how to implement this? Is there an
existing tool for doing this? How can I evaluate modperl/Apache::ASP files
from the command line?
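
As a starting point, a hedged sketch (URLs and filenames are made up): a
small script the Makefile can call, which fetches each page through the
running server and saves the rendered HTML into the /html tree.

    #!/usr/bin/perl -w
    use strict;
    use LWP::Simple qw(mirror);

    # map output file -> URL on the live (or staging) server
    my %pages = (
        'html/wallpapers/ccs.htm' => 'http://localhost/wallpapers/ccs.htm',
    );
    while (my ($file, $url) = each %pages) {
        my $status = mirror($url, $file);   # writes the response body to $file
        warn "$url -> $file: $status\n";
    }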

Thanks,

-Philip Mak ([EMAIL PROTECTED])






Re: Dynamic content that is static

2000-12-22 Thread Philip Mak

On Fri, 22 Dec 2000, Edward Moon wrote:

  Running a non-modperl apache that proxies to a modperl apache doesn't seem
  like it would help much because the vast majority of pages served require
  modperl.

 Not necessarily.
 
 You can use mod_proxy to cache the dynamically generated pages on the
 lightweight apache.

I thought about this... but I'm not sure how I would tell the lightweight
Apache to refresh its cache when a file gets changed. I suppose I could
graceful restart it, but the other webmasters of the site do not have root
access. (Or is there another way? Is it possible to teach Apache or Squid 
that ccs.htm depends on header.asp, footer.asp, series.dat and index.inc?)
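
One hedged possibility, rather than restarts: have the page itself emit a
Last-Modified header computed from its dependencies, so a caching proxy can
revalidate. A sketch (the file list is copied from the example above):

    <%
        use HTTP::Date ();
        my @deps = map { "$ENV{DOCUMENT_ROOT}/$_" }
                   qw(header.asp footer.asp series.dat index.inc);
        my ($newest) = sort { $b <=> $a } map { (stat)[9] } @deps;
        $Response->AddHeader('Last-Modified', HTTP::Date::time2str($newest));
    %>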

Also, does this mess up the REMOTE_HOST variable, or is Apache smart
enough to replace that with X-Forwarded-For when the forwarded traffic is
being sent from a local privileged process?

-Philip Mak ([EMAIL PROTECTED])




Re: (Beginner) mod_Perl hosting scarce?

2000-12-18 Thread Philip Mak

On Mon, 18 Dec 2000, Garry Heaton wrote:

 My projects would typically involve small business online databases. If I'm
 going to have trouble finding hosts it might be best to use JSP or ASP.
 What's the current situation on this one?

It is true that fewer hosts offer mod_perl hosting due to the potential
complexity involved. However, they are out there; you just need to find a
good one.

I believe aplushosting.com and cwihosting.com have mod_perl. You can try
e-mailing their tech support to confirm it. If not, try searching
http://www.ispcheck.com/ for other webhosts.

As for selection of scripting language, mod_perl is probably the most
versatile language. But as you know, it is also one of the harder ones to
learn. If you just want to do simple database sites, PHP or ASP might be a
better choice as it would probably take you less time to learn.

-Philip Mak ([EMAIL PROTECTED])




load average: 24.07, 14.76, 9.20

2000-12-16 Thread Philip Mak

Hi all,

I've been having the following problem with my machine (400MHz, 192 MB
RAM, 8.4 GB SCSI disk):

1:27am  up 3 days,  7:33,  8 users,  load average: 24.07, 14.76, 9.20

Every once in a while, the load average gets up to a very high level (at
this point, programs start getting "Out of memory!" errors, etc.).

I don't really know what to do to fix this, other than typing
/sbin/reboot. Looking at "top" doesn't show any very big processes, so I
suspect it might be being caused by a large number of small processes.

This is a web server; one of the sites I have on it, animewallpapers.com,
is much more popular than all the other sites. Since most of the activity
on the server is from httpd, I'm guessing that this site (which runs
Apache::ASP) is responsible. I can't tell for sure, though.

Could someone point me in the right direction as to how to:

(1) find out what is causing my server to become so slow (perhaps there's
some sort of benchmarking tool I can use? see the sketch after this list)

(2) fix it (if animewallpapers.com's ASP scripts are causing it, I would
have to figure out how to recode them more efficiently)
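
Regarding (1), a hedged sketch (GNU/Linux-style ps flags) that sums
resident memory by command name, to check whether many small processes are
adding up:

    ps -eo rss,comm | perl -ane '
        next unless $F[0] =~ /^\d+$/;   # skip the header line
        $t{$F[1]} += $F[0];
        END {
            printf "%10d KB  %s\n", $t{$_}, $_
                for sort { $t{$b} <=> $t{$a} } keys %t;
        }'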

Thanks,

-Philip Mak ([EMAIL PROTECTED])




Re: Apache::ASP #include file

2000-08-26 Thread Philip Mak

On Sat, 26 Aug 2000, Michael Robinton wrote:

 apache_ssl and mod_perl co-exist nicely together, try that instead. I've 
 a couple of these in production environments that work very well.

I don't understand... what does SSL have to do with this?

-Philip Mak ([EMAIL PROTECTED])

   Recently, I reinstalled mod_perl and Apache::ASP on my system in order to
   fix a problem I was having with ASP not using the new version of perl on
   my system. However, I'm having problems with some old code.
   
  
  That functionality was never intended to be supported, and 
  am surprised it ever worked!  How painful would it be for
  you to change your includes to be like <!--#include file="index.inc"-->
  





Should cookies expire?

2000-08-03 Thread Philip Mak

I have a general question about websites that use cookies to store session
information:

Why should they expire at all?

Let me give you an example. Yesterday, I was at Amtrak Rail's website to
purchase train tickets. Now, I multitask a lot, and sometimes I might
leave one browser window idle while I go to do something else.

So I'm browsing the possible rides I can get on, then I do something else
for half an hour. I go back to the browser window with Amtrak, and then
when I click something it tells me that my session has expired and I'll
have to login again!

Gritting my teeth, I login again and start the process over. This time I
finish the reservation and minimize the window.

Later that night, I want to check my reservation again. I maximize that
window and click something ... oops, session expired again!

I realize that in a computer lab environment, automatic session expiration
may be needed for security purposes, but I think in the situation
mentioned above, it was excessive.
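
For Apache::ASP sites at least, the idle window is tunable; a hedged
httpd.conf sketch (SessionTimeout is in minutes, and the default is 20 if
memory serves):

    PerlSetVar SessionTimeout 720   # keep idle sessions alive for 12 hours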

What do people think about this?

-Philip Mak ([EMAIL PROTECTED])




Re: how to check for ssl.

2000-08-03 Thread Philip Mak

On Thu, 3 Aug 2000, Stas Bekman wrote:

  use Apache::URI ();
  $r->parsed_uri->scheme;
  
  returns http or https
 
 Not really, you can spoof both:

Does the user have to spoof it deliberately in order for the wrong one to
be detected?

If spoofing requires the user to do it on purpose, then in this case the
$r->parsed_uri->scheme should be sufficient. The other method (putting
HTTPS on a different port and using mod_rewrite to make it transparent) is
better of course, but in case you can't do it for some reason, I think
this will work too.

They don't gain anything by spoofing http/https deliberately; it just
makes their connection not secure.
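
A hedged alternative, assuming mod_ssl is the SSL module in play (and that
SSLOptions +StdEnvVars is on): it sets the HTTPS environment variable
server-side, which the client can't spoof the way it can the request URI.

    sub is_ssl {
        my $r = shift;
        my $https = $r->subprocess_env->{'HTTPS'} || '';
        return $https eq 'on';
    }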

-Philip Mak ([EMAIL PROTECTED])




Re: Package Lexicals and PerlModule

2000-08-03 Thread Philip Mak

On Sun, 30 Jul 2000, mgraham wrote:

 Normally, I expect that lexical 'my' vars declared at the package
 scope (i.e. at the top of a file), should be visible to subroutines
 declared in the same package, and should maintain their values between
 calls to those subroutines.

If you are running perl v5.6 or later, I think you can use "our" instead
of "my" and it will do what you want it to do.

As for why it acts this way, I'm not sure...perhaps someone else on this
mailing list can shed some light on this issue.
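
A minimal sketch of the 5.6+ version (the package name is made up):

    package My::Counter;
    use strict;
    our $count = 0;   # package global: shared by all subs, survives calls

    sub bump { return ++$count }
    1;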

-Philip Mak ([EMAIL PROTECTED])




How to use warnings in Apache::ASP?

2000-07-29 Thread Philip Mak

Is there a way to make it so that all Apache::ASP scripts on my site have
"use warnings;" on by default (something analogous to PerlSetVar Strict 1
and use strict;)? Or do I just have to put "use warnings;" in every file?

I searched the nodeworks.com/asp site (keyword warning/warnings) as well
as the modperl mailing list archive (keyword "warning ASP" and "warnings
ASP") but could not find anything.

-Philip Mak ([EMAIL PROTECTED])




require bug?

2000-07-29 Thread Philip Mak

I noticed that the following bit of code does not work properly:

require 'test.pl';
chdir '..';
require 'test.pl';

Even though the second require is trying to load a different file, perl
thinks that it is the same file and therefore doesn't require it again (I
had to use the 'do' command instead).

Is this a bug, or is it supposed to work like that?
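
For reference, require records each loaded file in %INC under the literal
name it was given, so the second call is a no-op. A hedged workaround that
keeps require:

    require 'test.pl';
    chdir '..' or die "chdir: $!";
    delete $INC{'test.pl'};   # forget the first load
    require 'test.pl';        # searched again relative to the new cwd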

-Philip Mak ([EMAIL PROTECTED])




Re: Human readable flatfiles

2000-06-01 Thread Philip Mak

On Wed, 31 May 2000, Perrin Harkins wrote:

Thanks for the reply. I have a few problems though:

 You need to read up a little on modules and "require" in Perl5.
 
 The quick and dirty solution is to use "do" instead of require.  That will
 solve your immediate problem, but you'll still be reading the files every
 time which might eventually become a performance hit.

I can't seem to get "do" to work. I did this:

my $series_name;
do "series_$series.i"; # -- note include filename depends on a variable
print "$series_name\n";

but $series_name comes out undefined, even though series_$series.i (in
this case, series_ranma.i) sets $series_name.
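
A hedged guess at the cause, with a sketch: do compiles the file into
package main, so it sets the package global $series_name, never the lexical
declared with "my" in the caller. Reading the fully qualified global works:

    do "series_$series.i"
        or die "couldn't run series_$series.i: ", $@ || $!;
    print "$main::series_name\n";   # do-files assign into package main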

I also tried the package thing:

 What I do when I have things like config files is make actual unique
 packages of them.  For example:
 
 (In My/Module.pm)
 package My::Module;
 # don't use strict here
 $foo = 7;
 
 (In some other program)
 use My::Module;
 print "$My::Module::foo\n";

How would this work if the include filename has to depend on a variable? I
think I have the parsing wrong; I kept getting error messages. e.g.
something like:

use $series;
print "$$series::series_name\n";

 Honestly though, your example makes it look like you'd be better off with
 dbm files and Tie::MLDBM or something.

The database files would have to be human readable though, such that they
can be edited by a text editor. The data that I am making includes an
index to content that is maintained by other webmasters. The other
webmasters don't even know how to use a UNIX shell, so I have to keep it
simple for them. If I used a binary file format I'd have to make them
learn the tools for changing it.

-Philip Mak ([EMAIL PROTECTED])




Human readable flatfiles

2000-05-31 Thread Philip Mak

Hello,

I would like to ask a question about maintaining human readable flatfiles
in general.

I have a perl (non-modperl) program that needs some input data. Currently,
it reads in the data by "require"ing another perl script that has
statements to set the variables (as global variables). I did it this way
so that I can easily edit the include file if I want to change values,
and I can even put code in the include file to calculate values.

But, I am finding that this does not work in modperl under "use strict".
Apparently, code in a file called by require is not in the same variable
scope. (If "use strict" is off, it works sometimes but other times the
variables come out with the values from the previous invocation.)

Here is an example of what I currently have:

$series = $ENV{'QUERY_STRING'};
require "series_$series.i";
(do something with the variables $series_name, $series_prefix)

and inside series_ranma.i, I have:

$series_name = 'Ranma 1/2';
$series_prefix = 'Ranma';
@series_data = (
insert list here
);

but this does not work, since in modperl the required file has a different
variable scope.

Two ideas I am thinking of are:

(1) make series_$series.i into a text file instead, like this:

Ranma 1/2
Ranma
insert list here

and read it in with open(SERIES, "series_$series.i") and <SERIES>. But
then I wouldn't be able to put perl code inside it, and it is less clear.

(2) make series_$series.i into a subroutine that returns its values. I can
require it, and then call the subroutine and get the return values, but
this seems kind of kludgy to me.

Does anyone have a better suggestion for maintaining human readable
flatfiles containing data to be read? Is there a good CPAN module for it
perhaps?
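
A hedged variant of idea (2) that avoids a named subroutine: have each .i
file end with an anonymous hash, which do returns directly. Still plain
text, still allows perl code, and no globals to collide between requests.

    # series_ranma.i (entries below are placeholders)
    {
        name   => 'Ranma 1/2',
        prefix => 'Ranma',
        data   => [ 'placeholder entry' ],
    }

    # caller
    my $series = $ENV{'QUERY_STRING'};
    my $info   = do "series_$series.i"
        or die "can't load series_$series.i: ", $@ || $!;
    print $info->{name}, "\n";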

Thanks,

-Philip Mak ([EMAIL PROTECTED])

P.S. When replying to this message, please make sure my e-mail address is
in the to field. I'm not sure if I'm subscribed to this mailing list or
not. It seems to keep kicking me off or something, because I haven't been
receiving any mail from it.




Apache::ASP doesn't initialize variables?

2000-05-26 Thread Philip Mak

I've noticed something peculiar with Apache::ASP. It does not seem to be
initializing variables to 0. That is, if I load one ASP webpage that sets
a variable to X, then in the next ASP webpage the variable is initialized
to X instead of 0. Is this intended behavior, or is it a bug?
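
A hedged sketch of the usual defense: with Apache::ASP's strict option
(PerlSetVar Strict 1, as mentioned in another thread here) every variable
must be declared with my, and a lexical is created fresh on each request,
so nothing carries over between pages.

    <%
        my $counter = 0;   # lexical: re-initialized on every hit
        $counter++;
    %>
    <p>count this request: <%= $counter %></p>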

Also, I was wondering if there is a way to make Apache::ASP print more
meaningful error messages when a compile error is encountered, instead of
"500 Internal Server Error". Right now I have to look in my HTTP error
log, which gives something like this:

[Fri May 26 05:56:54 2000] [error] [asp] [31714] [error] Bad name after
Arrest' at (eval 31) line 96. -- ,
/usr/lib/perl5/site_perl/5.005/Apache/ASP.pm line 1180
Bareword found where operator expected at (eval 20) line 96, near
"'yuarrest/|You're"
(Missing operator before re?)

This also does not tell me the filename of the script that caused the
problem, so sometimes I have to guess when my script uses #include file.
It would be nice if Apache::ASP would display the error message right
there on the webpage (a la IIS ASP).

-Philip Mak ([EMAIL PROTECTED])




Re: Apache::ASP #include virtual loses variables

2000-05-21 Thread Philip Mak

So, there's no way in Apache::ASP to include a file by specifying a path
relative to DOCUMENT_ROOT, or relative to the directory of the current
file (which is not necessarily equivalent to the request URI, if the
current file is included)?

I managed to get my site to work using <!--#include file--> and specifying
full pathnames and using PerlSetVar IncludesDir, but it would have been
nice if there was a way to include a file with relative path
specifications as in the above paragraph (and still be in the same
namespace). I first learned ASP on IIS, and there, <!--#include virtual-->
(which allows the relative path specifications) can be used for this
purpose.

-Philip Mak ([EMAIL PROTECTED])

On Sat, 20 May 2000, Joshua Chamas wrote:

 Ime Smits wrote:
  
  | Well, I would like to suggest that you consider including <!--#include
  | virtual--> in the Apache::ASP distribution, so that included files use the
  | same namespace. It doesn't make sense logically that include virtual
  | behaves differently from include file (other than the way the
  | filename/pathname is interpreted, of course).
  
  It does make sense to me, though. Consider one having very big (say 50k)
  include files being included from several other (say 100) scripts. Just
  sucking them in each script doing the include would cause *every* script
  growing by at least the size of the include. Now as Apache::ASP caches all
  compiled scripts, this would result in each httpd process growing by 50kB x
  100 scripts = 5 MB, holding 98% redundant data.
  
 
 If DynamicIncludes are turned on, then file includes are 
 compiled as subroutines, and executed as if $Response->Include()
 were called.  Without this setting, includes text are added
 to the including scripts like you are saying.
 
 But this does not solve the virtual includes problem.  A virtual
 include is supposed to be anything executed on the server, 
 not just files, but the output from anything like some C cgi, 
 or another .pl or so, and must therefore be processed as a 
 separate subrequest.  This is what Apache::SSI does.
 
 The problem here is that there is no way for the Apache::ASP 
 script to catch the output from the apache subrequest to 
 even try to compile it into its own script, even if you really
 wanted to do this.  So this is why this is stays a separate
 feature to be handled by Apache::SSI, that it doesn't help
 at all to inline it into Apache::ASP, except some small
 performance benefit by not running the output through 
 Apache::Filter
 
 -- Joshua
 _
 Joshua Chamas Chamas Enterprises Inc.
 NodeWorks  free web link monitoring Huntington Beach, CA  USA 
 http://www.nodeworks.com    1-714-625-4051
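
For reference, the subroutine-compiled behavior Joshua describes is a
config switch; a hedged httpd.conf sketch:

    # compile file includes as subroutines, so 100 scripts share one
    # compiled copy of a big include instead of inlining it 100 times
    PerlSetVar DynamicIncludes 1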
 




Apache::ASP #include file, relative filenames

2000-05-21 Thread Philip Mak

On Sun, 21 May 2000, Joshua Chamas wrote:

  So, there's no way in Apache::ASP to include a file by specifying a path
  relative to DOCUMENT_ROOT, or relative to the directory of the current
  file (which is not necessarily equivalent to the request URI, if the
  current file is included)?

 <!--#include file= --> allows relative file specifications.
 Did it not work for you for some reason?

It does not work in this kind of situation:

/series/slayers/lina/index.inc does <!--#include file="../index.inc"-->
/series/slayers/index.inc does <!--#include file="../index.inc"-->

If I access http://.../series/slayers/lina/index.inc, then it will do the
first include correctly. But in the second include, it resolves the
path name relative to /series/slayers/lina/ instead of /series/slayers/,
so it ends up including /series/slayers/index.inc instead of
/series/index.inc.

I worked around this by doing "PerlSetVar IncludesDir /home/goamembers/www"
(which is my DOCUMENT_ROOT), and then using includes such as:

!--#include file="/series/slayers/index.inc"--
!--#include file="/series/slayers/lina/index.inc"--

-Philip Mak ([EMAIL PROTECTED])




Apache::ASP #include virtual loses variables

2000-05-20 Thread Philip Mak

Hello,

I have stumbled upon an issue with the Apache::ASP <!--#include virtual-->
directive. Included files do not seem to be able to access the same scope
of variables. I am using the following test program:

File 1.inc:

!--#include virtual="2.inc"--
% $test .= '1'; %
p$test = %=$test%/p

File 2.inc:

<% $test = '2'; %>

One would expect the output to be "$test = 21", but it comes out as 
"$test = 1". I have tried the same thing with #include file instead of
#include virtual and the result is correct.

I am using Apache/1.3.9 on Red Hat Linux. My httpd.conf is setup for ASP
as follows:

<Files ~ "(\.inc)">
SetHandler perl-script
PerlSetVar Global /tmp
PerlSetVar Filter On
PerlHandler Apache::ASP Apache::SSI
</Files>

Does anyone know if this is a bug, or a feature, or did I perhaps setup
ASP incorrectly? Is there a good workaround for this? (It is inconvenient
for me to use #include file instead since I need to include files relevant
to DOCUMENT_ROOT, as well as relevant to the location of the current file,
but I could use that as a last resort.)

Thanks,

-Philip Mak ([EMAIL PROTECTED])




Re: Apache::ASP #include virtual loses variables

2000-05-20 Thread Philip Mak

On Sat, 20 May 2000, Joshua Chamas wrote:

 Use file includes.  virtual includes are meant to execute
 anything and include its output, and are handled by Apache::SSI
 outside of Apache::ASP. File includes will be executed as perl 
 asp subroutines in the same perl namespace as the 
 including script.

I see. There are two problems that I have with file includes though:

(1) I cannot specify a file's location relative to $ENV{'DOCUMENT_ROOT'}.

(2) I cannot specify a file's location relative to the directory the
current file is in.

For #1, I want to do something like this in all my pages:

!--#include virtual="/code/header.asp"--
!--#include virtual="/code/footer.asp"--

And for #2, I have an "index.inc" in all my directories. Each index.inc
has to include the one in its parent directory, e.g.:

!--#include file="../index.inc"--

so that directories can pass on properties to their subdirectories. If I
use include file for that, it will include files relative to the pathname
of the first ASP file, and not relative to the pathname of the ASP file
that actually has the include.

One reason I'm coding like this is because I want to give each directory a
title and append it to the page's title. e.g.:

http://www.girlsofanime.com/series/slayers/lina/ has title of
Girls of Anime::Anime Series::The Slayers::Lina Inverse

The way the title is constructed is:

In /index.inc: $title = 'Girls of Anime';
/series/index.inc: $title .= '::Anime Series';
/series/slayers/index.inc: $title .= '::The Slayers';
/series/slayers/lina/index.inc: $title .= '::Lina Inverse';
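
A minimal sketch of one directory's index.inc under this scheme (the
Slayers level, as described above):

    <!--#include file="../index.inc"-->
    <% $title .= '::The Slayers'; %>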

Is there a better way I can do this? Right now I'm thinking of either
trying to hack Apache::ASP to support #include virtual, or using absolute
pathnames or trying to put $ENV{'DOCUMENT_ROOT'} in the file path.

-Philip Mak ([EMAIL PROTECTED])




Re: Apache::ASP #include virtual loses variables

2000-05-20 Thread Philip Mak

On Sat, 20 May 2000, Joshua Chamas wrote:

  <!--#include virtual="/code/header.asp"-->
  <!--#include virtual="/code/footer.asp"-->
 
 For #1, know includes will be picked up from your Global directory,
 so you can use that repository to share includes, instead of some
 DOCUMENT_ROOT location.  You can also use IncludesDir for this if
 it is set.

Thanks! That pretty much lets me do exactly what I want to.

  And for #2, I have an "index.inc" in all my directories. Each index.inc
  has to include the one in its parent directory, e.g.:
  
  <!--#include file="../index.inc"-->
  
  so that directories can pass on properties to their subdirectories. If I
  use include file for that, it will include files relative to the pathname
  of the first ASP file, and not relative to the pathname of the ASP file
  that actually has the include.
  
  ...
  http://www.girlsofanime.com/series/slayers/lina/ has title of
  Girls of Anime::Anime Series::The Slayers::Lina Inverse
  
  The way the title is constructed is:
  
  In /index.inc: $title = 'Girls of Anime';
  /series/index.inc: $title .= '::Anime Series';
  /series/slayers/index.inc: $title .= '::The Slayers';
  /series/slayers/lina/index.inc: $title .= '::Lina Inverse';
  
 
 I would not do it this way, in fact the way I would do this
 would not be with your methods at all, unless you want 
 to have each section to be arbitrarily different and 
 maintained by separate graphics designers.  The way I would
 do this thing is to lose the directory structure completely
 and to have things be database driven with parameters from
 ?query_string like /index.asp?dir=, which you can build
 the title for from the database because you know all the 
 parents for dir=.

There are two reasons why I don't like doing it this way:

(1) The URL is no longer human readable. My site will have a clear
hierarchical structure, so I think it makes sense to mirror that in the
directories. People who want to chop or type URLs (even though I have good
navigation in my web design) can do so, and they can also look at the URL
to get an idea of where they are.

(2) Setting up a database seems to be overkill for this site. The only
meta data that it has (or probably will ever have) is:

- directory name label
- What links are on the sidebar for this directory?
- What advertising banner(s) is displayed on pages in this directory?

My method of having an index.inc for each directory is fairly simple, yet
it would seem to provide all the flexibility I need to implement this and
the flatfiles are easy to maintain. I don't see a good reason to switch to
database-based which seems to be significantly more complicated.

 The point here is that each ASP script is a whole program
 by itself, and I would not recommend having hundreds or
 thousands of them to have to compile for your site.  If you
 have meta data you want to display, you should really stick
 as much of it as possible in a database like MySQL.  In
 the long run, your project will be much more maintainable
 even if in the short run it's easier to derive info from
 unix directories & flat files.

Each ASP script is compiled separately? I thought that <!--#include
file--> and <!--#include virtual--> are supposed to work just like
#include does in C, i.e. it pretends that the text of the included file
was actually pasted directly into the program. Am I thinking about ASP
include in the wrong way?

 If you want to have a nicer /path_info scheme, we'll 
 probably have to add a patch for you to have Apache::ASP
 not be bound to executing real files as it is currently.
 This would be more similar to the way Mason does things.

Well, I would like to suggest that you consider including <!--#include
virtual--> in the Apache::ASP distribution, so that included files use the
same namespace. It doesn't make sense logically that include virtual
behaves differently from include file (other than the way the
filename/pathname is interpreted, of course).

-Philip Mak ([EMAIL PROTECTED])





Re: cgiwrap for Apache::ASP?

2000-04-16 Thread Philip Mak

On Fri, 14 Apr 2000, Ime Smits wrote:

 | I also have ASP installed, and I'd like to be able to transparently suid
 | the .asp scripts too. Do you know how I could go about doing this?
 
 I think this is a general bad idea. The only purpose of running scripts via
 a suexec or setuid mechanism I can think of is to stop different users'
 websites running on the same httpd from digging into and interfering with
 each other's data and files.

This server is used by many unaffiliated people who run their own
websites. Some people want to write their own CGI or ASP scripts that work
with files. The simplest example is a form that can be filled out and
stores the data in a file. If I don't suid their scripts, then they can
mess up each others' data files. They also cannot write data files into
their own directories.

Also, my system has cgiexec (does suid for CGI scripts) installed. The
cgiexec documentation says that once cgiexec is installed, it is a
security risk if people can execute code as "nobody" since that user has
special access to the cgiexec code. Right now, anyone can execute code as
nobody by writing ASP code, so in essence I have a security hole in my
system, and I DO need cgiexec.

So, does anyone have suggestions on how to do suid for ASP scripts?

 If you're not trusting the people making websites and you're looking for a
 virtual hosting solution, I think some postings earlier this week about

That's exactly the case here.

 proxying requests to a user-dedicated apache listening on localhost is the
 best solution.

Wouldn't this require running one web server process for each user? I may
be wrong, but it seems to be simpler to just suid their scripts.

-Philip Mak ([EMAIL PROTECTED])




cgiwrap for Apache::ASP?

2000-04-14 Thread Philip Mak

Hello,

I searched the egroups.com mod_perl archive for "cgiwrap" and didn't find
anything relevant to ASP.

I'm wondering if there's any documentation about how to use cgiwrap with
the ASP extension. Currently I have a modified version of cgiwrap
installed on my system such that all .cgi/.pl files are transparently
(i.e. no need to put cgiwrap in the URL) suid'ed to the script owner
before being executed.

I also have ASP installed, and I'd like to be able to transparently suid
the .asp scripts too. Do you know how I could go about doing this?

Thanks,

-Philip Mak ([EMAIL PROTECTED])