RE: [OT] Re: Vhosts + mod_perl

2001-12-03 Thread Jonathan M. Hollin

:: Every time I make a new document root for a different website, say the
:: subdomain loco on trains.ath.cx, do I need to update the DNS?
::
:: For foo.trains.ath.cx and bar.trains.ath.cx do I need to make new DNS
:: entries for foo and bar subdomains? Surely I don't? Because browsers
:: will ask for it, and get directed to trains.ath.cx, and my apache will
:: take care of the rest from the HTTP 1.1 Host: field?

If you are adding sub-domains then, of course, you need to update your DNS.
Without a corresponding DNS entry, how could foo.trains.ath.cx ever be
resolved?
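
For example (the IP address below is only a placeholder), that usually means
adding records for the new names to the trains.ath.cx zone:

  foo   IN  A   192.0.2.10   ; placeholder address
  bar   IN  A   192.0.2.10   ; placeholder address

or CNAMEs pointing the new names at the existing trains.ath.cx record.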

Jonathan M. Hollin - WYPUG Co-ordinator
West Yorkshire Perl User Group
http://wypug.pm.org/




Re: [ANNOUNCE] OpenFrame 2.05

2001-12-03 Thread Leon Brocard

Jonas Liljegren sent the following bits through the ether:

 I checked it out from CVS.  But there is no INSTALL file.

Right, there is however a lib/OpenFrame/Install.pod which I've just
updated to clear things up. I guess we should copy the text to INSTALL
to make things clear. The Apache stuff still hasn't been fleshed out.
 
 I guess I should do some Apache configuration or run a custom HTTP
 server.  But I can't find anything about it in README or FAQ.

You can run the examples (which use the standalone HTTP server) after
running make install, which should give you a good idea of how things work.
 
 What will happen if I do a make install?

The Perl modules get installed. Nothing else. It's just an application
framework: you have to build the rest. However, it isn't hard. For
example, go into examples/hangman2/ and run ./hangman2 for a full
(fun!) example using sessions and TT2.

 But I still haven't got a general picture of the overall design.

Right. I guess OpenFrame.pm should contain a top-level description of
how it all works. I'm going to turn the examples into walk-throughs as
that'll help too I imagine.
 
 It feels like too much module documentation to read before I have
 decided if I would like to really try it out.

It's not *that* much work:

  % cvs update
  % perldoc lib/OpenFrame/Install.pod
  % make install
  % cd examples/hangman2/
  % ./hangman.pl

Try it out and tell us what you think ;-)

Leon

ps thanks for your comments
--
.sig.at.home



Re: [OT] Re: Vhosts + mod_perl

2001-12-03 Thread Dave Baker



On Sun, Dec 02, 2001 at 04:19:28PM -, Jonathan M. Hollin wrote:
 :: Every time I make a new document root for a different website, say the
 :: subdomain loco on trains.ath.cx, do I need to update the DNS?
 ::
 :: For foo.trains.ath.cx and bar.trains.ath.cx do I need to make new DNS
 :: entries for foo and bar subdomains? Surely I don't? Because browsers
 :: will ask for it, and get directed to trains.ath.cx, and my apache will
 :: take care of the rest from the HTTP 1.1 Host: field?

 If you are adding sub-domains then, of course, you need to update your DNS.
 Without a corresponding DNS entry, how could foo.trains.ath.cx ever be
 resolved?


Actually, it's possible to create a wildcard domain so that
*.trains.ath.cx will resolve identically.

If you choose to do this you no longer have to update DNS for new hosts
but run the risk of people getting inappropriate errors when using a
hostname that shouldn't exist.  Instead of getting 'no such host' they'll
end up with a valid IP and will hit your web server which then has to
decide how to handle it.
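
To give an idea of that last step, here is a minimal sketch of one way to do
it under mod_perl 1.x -- the package name, document roots and the whole
PerlTransHandler approach are just illustrative, not something the setup above
requires.  Known subdomains get their own document root, anything that only
resolved because of the wildcard gets a 404 instead of silently being served
the main site:

  package My::WildcardMap;   # hypothetical name

  use strict;
  use Apache::Constants qw(:common);   # OK, DECLINED, NOT_FOUND, ...

  # known vhosts and their document roots (made up for the example)
  my %docroot = (
      'foo.trains.ath.cx' => '/home/www/foo',
      'bar.trains.ath.cx' => '/home/www/bar',
  );

  sub handler {
      my $r    = shift;
      my $host = lc($r->hostname || '');

      # only interested in hosts under the wildcard zone
      return DECLINED unless $host =~ /\.trains\.ath\.cx$/;

      if (my $root = $docroot{$host}) {
          # map the URI into this host's tree and stop further translation
          $r->filename($root . $r->uri);
          return OK;
      }

      # hostname only resolved because of the wildcard: refuse it
      return NOT_FOUND;
  }

  1;

Wired in with something like "PerlTransHandler My::WildcardMap" in httpd.conf.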

As an example, your zone file can look like the following - I've split www
and the vhosts onto separate IP addresses but this isn't required, as you can
mix named and wildcard entries fairly freely.  If you use tools such as
dnslint you may need to append a comment on the end of your wildcard line
to indicate to dnslint that the wildcard is intentional.

(header fluff)

;; 'real' host
www   IN  A   123.123.123.1
vhost IN  A   123.123.123.2

ftp   CNAME   www.trains.ath.cx.

;; everything else.
* CNAME   vhost.trains.ath.cx.

(footer fluff)


Dave

-- 

- Dave Baker  :  [EMAIL PROTECTED]  :  [EMAIL PROTECTED]  :  http://dsb3.com/ -
GnuPG:  1024D/D7BCA55D / 09CD D148 57DE 711E 6708  B772 0DD4 51D5 D7BC A55D





too many files open

2001-12-03 Thread John Michael



Hi
I use the perl sub below to create thumbs with image magick on my mod perl
server and never really had any problems with it.  I put it on another server
and after a short while it froze up the server with this error:
too many files open.

It is running under a mod-perl environment because in doing so I am also able
to rename images at the same time.
I know that because it is running under a mod perl environment, the lifetime
of the script is until reboot of the server or restart of apache.
The freeze happened on one of the lines in this sub, so I'm thinking I'm not
doing something right and leaving files open.  I don't think this is a result
of a problem with the server.  I would appreciate it if someone would look at
it and tell me if it is OK or not, or give me some suggestions to prevent this
problem in the future so that I can continue to run it in mod-perl.

Perlmagick claims that it handles all of the file handling.
The sub is called like so:

# Call to build the thumbs here
if ($build_thumbs) {
    build_thumbs($folder, $working_dir, $thumb_pattern);
}

#
sub build_thumbs {
    my ($folder, $base_dir, $thumb_pattern) = @_;
    my $current_dir = cwd;
    my $working_dir = join("/", $base_dir, $folder);
    print qq~<p>Building Thumbs in Folder: $folder</p>\n~;
    chdir($working_dir) || die "cannot cd $working_dir ($!)";
    my @image_array = glob("*.jpg");
    my @images = ();
    foreach (@image_array) {
        unless ($_ =~ /$thumb_pattern/i) {
            push(@images, $_);
        }
    }
    my $pic;
    my $num_thumbs = @images;
    foreach $pic (@images) {
        my $image = Image::Magick->new(magick => 'JPEG');
        my $file  = join("/", $working_dir, $pic);
        open(DATA, "$file") || die "$file $!";
        $image->Read(file => DATA);
        close(DATA);

        ## MAKE THUMB HERE ##
        my ($width, $height, $size, $format) = split(/,/, $image->Ping($file));
        my $rotated = 0;
        if ($width < $height) {
            $image->Rotate(degrees => '90');
            $rotated = 1;
        }
        $image->Resize(geometry => '55', filter => 'Gaussian', blur => '.5');
        if ($rotated) {
            $image->Rotate(degrees => '-90');
        }
        ## END MAKE THUMB HERE ##

        my $out_file = join("/", $working_dir, "$thumb_pattern$pic");
        open(DATA, ">$out_file");
        $image->Write(file => DATA, filename => $out_file);
        close(DATA);
        undef @$image;
    }
    undef $image;
    chdir($current_dir);
    print qq~<p>Built <b>$num_thumbs</b> Thumbs in Folder: $folder</p>\n~;
}
###

Thanks in advance
John Michael


[ANNOUNCE] ApacheBench 0.62 released

2001-12-03 Thread Adi Fairbank

In my ongoing effort to bring the ApacheBench Perl module up to date with
the ab distributed with Apache, here is another release.  This one mainly
incorporates

 * support for HTTP Keep-Alive feature,
 * support for HTTP HEAD requests,
 * global and per-run time limits,
 * accurate tallying of sent, good, and failed requests (previously these
were fudged)
 * a few small bug fixes.

Please see the Changes file for complete details.

Happy benchmarking!

-Adi




Multiple Sites

2001-12-03 Thread Purcell, Scott

Hello,
I have the need to create 10 web sites off my Apache web server. I do not
want to use 10 IP addresses. So I am going to cheese out and do a
URL/directory/index.html for each site.


Then I would give each customer a URL of URL/directory and I would like the
index.html or the default.html to come up. But it does not. 
I edited my conf file to this:

<IfModule mod_dir.c>
#DirectoryIndex index.html
DirectoryIndex default.htm
</IfModule>

But it does not work.  If I put in URL/directory with a trailing forward
slash, e.g. http://URL/directory/, then it shows the default.htm page.
But I know my customers, and they will not put in the trailing forward
slash.  How do I get around this issue?

Thanks


Scott Purcell




RE: Multiple Sites

2001-12-03 Thread Jonathan M. Hollin

:: But it does not work.  If I put in URL/directory with a trailing forward
:: slash, e.g. http://URL/directory/, then it shows the default.htm page.
:: But I know my customers, and they will not put in the trailing forward
:: slash.  How do I get around this issue?

Scott,

The best solution is to use sub-domains, if you are able to effect changes
to your DNS.  That way your customers can't fail.

e.g:  http://cust1.yoursite.com/
http://cust2.yoursite.com/

etc.

Jonathan M. Hollin - WYPUG Co-ordinator
West Yorkshire Perl User Group
http://wypug.pm.org/




Re: Multiple Sites

2001-12-03 Thread Ronald Beck

Try setting your directory index like this...

DirectoryIndex default.htm index.htm default.html index.html

this should catch any one of the four files as the index when you enter
http://URL/directory.  
Also, make sure directory is in your http root path.  See the
httpd.conf file if you're not sure what your root directory is.

Ron

Purcell, Scott wrote:
 
 Hello,
 I have the need to create 10 web sites off my Apache web server. I do not
 want to use 10 IP addresses. So I am going to cheese out and do a
 URL/directory/index.html for each site.
 
 Then I would give each customer a URL of URL/directory and I would like the
 index.html or the default.html to come up. But it does not.
 I edited my conf file to this:
 <IfModule mod_dir.c>
 #DirectoryIndex index.html
 DirectoryIndex default.htm
 </IfModule>
 
 But it does not work.  If I put in URL/directory with a trailing forward
 slash, e.g. http://URL/directory/, then it shows the default.htm page.
 But I know my customers, and they will not put in the trailing forward
 slash.  How do I get around this issue?
 
 Thanks
 
 Scott Purcell



Re: Multiple Sites

2001-12-03 Thread Andrew Ho

Hello,

SP> But if I put in URL/directory with a trailing forward slash, e.g.
SP> http://URL/directory/, then it shows the default.htm page. But I know my
SP> customers, and they will not put in the trailing forward slash. How do
SP> I get around this issue?

RedirectMatch permanent ^/directory$ http://URL/directory/

That will do what you want.

Humbly,

Andrew

--
Andrew Ho   http://www.tellme.com/   [EMAIL PROTECTED]
Engineer   [EMAIL PROTECTED]  Voice 650-930-9062
Tellme Networks, Inc.   1-800-555-TELLFax 650-930-9101
--




delayed file uploads...

2001-12-03 Thread El Capitan

i have a simple question.  im not sure if there is a mod_perl directive or
module for this but id like to perform this simple task:

two web pages run in sequence. the first page, id like a user to select
several files from his/her machine for uploading to the server using the
<input type=file> tag.  when the user requests the next page, i DO NOT want
the files to immediately transfer (many reasons for this, just a hard
requirement).  rather, id like to store the names of the files into a cookie
or save it on the server with apache::session.  this part I already have
working, and am storing the file names into a tied hash using
apache::session module.

then the user moves onto the second web page and populates more form fields
with additional information.  upon submitting this second form, the files
from the previous web page (names stored in the hash %session from
apache::session) are then sent to the server.

im not sure whether or not this can be done, anyone have any clues?



Kirk




[OT] mod_gzip configuration

2001-12-03 Thread Jonathan M. Hollin

Hey gang,

I have just installed mod_gzip on my mod_perl server and am very impressed
with its performance.  However, I've noticed that it is only compressing
mod_perl output when parameters are passed to the script in the URL.

E.g:  http://www.mysite.com/test.cgi?test=1 will be compressed, but
http://www.mysite.com/test.cgi will not!  :-(

My mod_gzip config follows:

## BEGIN MOD_GZIP CONFIGURATION

LoadModule gzip_module modules/ApacheModuleGzip.dll
mod_gzip_on Yes
mod_gzip_minimum_file_size  300
mod_gzip_maximum_file_size  0
mod_gzip_maximum_inmem_size 10
mod_gzip_keep_workfiles No
mod_gzip_temp_dir   E:/Apache/temp
mod_gzip_item_include   file \.html$
mod_gzip_item_include   file \.pl$
mod_gzip_item_include   file \.cgi
mod_gzip_item_include   mime ^text/.*
mod_gzip_item_include   mime ^httpd/unix-directory$
mod_gzip_item_include   handler ^perl-script$
mod_gzip_item_include   handler ^server-status$
mod_gzip_item_include   handler ^server-info$
mod_gzip_item_exclude   file \.css$
mod_gzip_item_exclude   file \.js$
mod_gzip_item_exclude   mime ^image/.*
mod_gzip_dechunkYes

## END MOD_GZIP CONFIGURATION

Can anyone advise me as to how to change this configuration so that
compression will work with or without parameters?

Thanks in advance.

Jonathan M. Hollin - WYPUG Co-ordinator
West Yorkshire Perl User Group
http://wypug.pm.org/




Re: delayed file uploads...

2001-12-03 Thread Robert Landrum

At 3:29 PM -0800 12/3/01, El Capitan wrote:
> i have a simple question.  im not sure if there is a mod_perl directive or
> module for this but id like to perform this simple task:
>
> two web pages run in sequence. the first page, id like a user to select
> several files from his/her machine for uploading to the server using the
> <input type=file> tag.  when the user requests the next page, i DO NOT want
> the files to immediately transfer (many reasons for this, just a hard
> requirement).  rather, id like to store the names of the files into a cookie
> or save it on the server with apache::session.  this part I already have
> working, and am storing the file names into a tied hash using
> apache::session module.
>
> then the user moves onto the second web page and populates more form fields
> with additional information.  upon submitting this second form, the files
> from the previous web page (names stored in the hash %session from
> apache::session) are then sent to the server.
>
> im not sure whether or not this can be done, anyone have any clues?


Sorry... Not possible.  Once you've got the file fields, you can't 
force the browser to upload the file on a different page.  That would 
be a huge security hole.  Just imagine a

<input type=hidden_file name=filetoupload value=/etc/passwd>

One suggestion would be to pop up a window with the file fields.  Then 
let the user select the files, click submit on the parent window, 
then on the subsequent page, have the parent window issue a 
child.form.submit or something similar with javascript...

Rob

--
Only two things are infinite: The universe, and human stupidity. And I'm not
sure about the former. --Albert Einstein



Re: delayed file uploads...

2001-12-03 Thread David Young

I'd say no. Uploading the file is a function of the browser and not under
your control.

 From: El Capitan [EMAIL PROTECTED]
 Reply-To: [EMAIL PROTECTED]
 Date: Mon, 3 Dec 2001 15:29:08 -0800
 To: [EMAIL PROTECTED]
 Subject: delayed file uploads...
 
 i have a simple question.  im not sure if there is a mod_perl directive or
 module for this but id like to perform this simple task:
 
 two web pages run in sequence. the first page, id like a user to select
 several files from his/her machine for uploading to the server using the
 <input type=file> tag.  when the user requests the next page, i DO NOT want
 the files to immediately transfer (many reasons for this, just a hard
 requirement).  rather, id like to store the names of the files into a cookie
 or save it on the server with apache::session.  this part I already have
 working, and am storing the file names into a tied hash using
 apache::session module.
 
 then the user moves onto the second web page and populates more form fields
 with additional information.  upon submitting this second form, the files
 from the previous web page (names stored in the hash %session from
 apache::session) are then sent to the server.
 
 im not sure whether or not this can be done, anyone have any clues?
 
 
 
 Kirk
 
 




RE: delayed file uploads...

2001-12-03 Thread Chui G. Tey

Here is a hackish option:

place the <input type=FILE> field in a 
separate frame using IFRAME or FRAME. 

then:
   a. Use Javascript to set the onSubmit action to send the filename.
   b. Use Javascript to set a client-side cookie.


   



Re: too many files open

2001-12-03 Thread ___cliff rayman___

are you sure the new system's OS has a large enough
open files parameter?

on linux this could be modified by the parameters
/proc/sys/fs/file-max
/proc/sys/fs/inode-max
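
for example, to check the current limit and raise it until the next reboot
(the value is only an illustration, and the exact knobs vary by kernel):

  % cat /proc/sys/fs/file-max
  % echo 16384 > /proc/sys/fs/file-max    (as root)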

I am sure it is different on bsd, solaris, etc.

John Michael wrote:

 Hi
 I use the perl sub below to create thumbs with image magick on my mod perl server and
 never really had any problems with it.  I put it on another server and after
 a short while it froze up the server with this error.
 too many files open.

--
___cliff [EMAIL PROTECTED]http://www.genwax.com/





Re: too many files open

2001-12-03 Thread Marius Feraru


Today at 10:14, 'John Michael' wrote:
|JM| Perlmagick claims that it handles all of the file handling.
And it really does. So why don't you let it do its job?!
... meaning, why do you use things like:
|JM| my $image = Image::Magick->new(magick=>'JPEG');
|JM| my $file = join("/", $working_dir, $pic);
|JM| open(DATA, "$file") || die "$file $!";
|JM| $image->Read(file=>DATA);
|JM| close(DATA);
and
|JM| open(DATA, ">$out_file");
|JM| $image->Write(file=>DATA, filename=>$out_file);
|JM| close(DATA);
instead of, e.g.:
my $image = Image::Magick->new;
my $ret = $image->Read($file);
return $ret if $ret;
and
$ret = $image->Write($out_file);
return $ret if $ret;
?!

Yet, there is no obvious error in your code that should drive your system
crazy. Still, here are some suggestions (in addition to the one questioned
above :) that could make your code behave nicer:
- do check the error codes returned by PerlMagick!
- globbing? use it just at home ;-] instead, enhance your program with some
MIME module to detect the file type properly, and watch out for files like
'foo.jpg' that are in fact animated GIFs, where you will get an entire
group of output files instead of one, etc...
- you will surely save lots of CPU time by avoiding that rotation stuff***
- use File::Spec for constructing file paths in a portable way
- you may want to use a thumbnail-only module :)) : Image::GD::Thumbnail,
if that Gaussian blur is not so important ;-]


*** as I see it, you need some thumbnail size normalization, so: how to avoid
image rotation and re-rotation?
example:

my ($w, $h) = calcGeometry($image->Get('width', 'height'), 50, 50);
$ret = $image->Scale(width => $w, height => $h);
warn $ret if $ret;

sub calcGeometry {
    my ($w, $h, $stdW, $stdH) = @_;
    if ($w > $h) {
        $h = $h / ($w / $stdW);
        $w = $stdW;
    } else {
        $w = $w / ($h / $stdH);
        $h = $stdH;
    }
    return int($w), int($h);
}
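
As an aside on the File::Spec point above, the two path joins in the original
sub could be built portably like this (same variable names as in John's code):

use File::Spec;
my $file     = File::Spec->catfile($working_dir, $pic);
my $out_file = File::Spec->catfile($working_dir, $thumb_pattern . $pic);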


-- 
Marius Feraru http://altblue.n0i.net/
it's easy to stop using Perl: I do it after every project









Re: Multiple Sites

2001-12-03 Thread Mithun Bhattacharya

> But it does not work.  If I put in URL/directory with a trailing forward
> slash, e.g. http://URL/directory/, then it shows the default.htm page.
> But I know my customers, and they will not put in the trailing forward
> slash.  How do I get around this issue?


http://httpd.apache.org/docs-2.0/mod/mod_dir.html

<quote>
This module provides for trailing slash redirects and serving 
directory index files.
</quote>

<quote>
A trailing slash redirect is issued when the server receives a request 
for a URL http://servername/foo/dirname where dirname is a directory. 
Directories require a trailing slash, so mod_dir issues a redirect to 
http://servername/foo/dirname/.
</quote>




Re: Multiple Sites

2001-12-03 Thread Medi Montaseri


If you only have one IP and want to have many web sites (ie URLs) for
your customers, then why don't you use VirtualHost. Then your customers
can either have

www.customer1.xyz.com
www.customer2.xyz.com

or 

www.customer1.com
www.customer2.com

solve the problem at the root, not at the leaf...
Looks like you are trying to do what Apache already does...

Unless I missed your point
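
For example, a minimal name-based setup (the IP and document roots here are
just placeholders):

NameVirtualHost 192.0.2.1

<VirtualHost 192.0.2.1>
    # placeholder hostname and path
    ServerName   www.customer1.xyz.com
    DocumentRoot /home/customer1/htdocs
</VirtualHost>

<VirtualHost 192.0.2.1>
    ServerName   www.customer2.xyz.com
    DocumentRoot /home/customer2/htdocs
</VirtualHost>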
 
On Tue, 4 Dec 2001, Mithun Bhattacharya wrote:

 > But it does not work.  If I put in URL/directory with a trailing forward
 > slash, e.g. http://URL/directory/, then it shows the default.htm page.
 > But I know my customers, and they will not put in the trailing forward
 > slash.  How do I get around this issue?
 
 
 http://httpd.apache.org/docs-2.0/mod/mod_dir.html
 
 <quote>
 This module provides for trailing slash redirects and serving 
 directory index files.
 </quote>
 
 <quote>
 A trailing slash redirect is issued when the server receives a request 
 for a URL http://servername/foo/dirname where dirname is a directory. 
 Directories require a trailing slash, so mod_dir issues a redirect to 
 http://servername/foo/dirname/.
 </quote>
 
 

-- 
-
Medi Montaseri   [EMAIL PROTECTED]
Unix Distributed Systems EngineerHTTP://www.CyberShell.com
CyberShell Engineering
-




Re: Multiple Sites

2001-12-03 Thread Mithun Bhattacharya

Medi Montaseri wrote:

 If you only have one IP and want to have many web sites (ie URLs) for
 your customers, then why don't you use VirtualHost. Then your customers
 can either have
 
 www.customer1.xyz.com
 www.customer2.xyz.com
 
 or 
 
 www.customer1.com
 www.customer2.com
 



The originator of this thread didn't exactly say he had control over his 
DNS.




RE: Multiple Sites

2001-12-03 Thread Andy Sharp

From the start o' the thread:

 But if I put in URL/directory and a forward slash/
 eg. http://URL/directory/ then it shows the default.htm page. But I
know my
 customers, and they will not put in the directory forward slash. How
do I
 get around this issue?

This isn't really a mod_perl issue,  it's common to all of apache.

Even though this isn't in the scope of the list, here's your problem and
answer.

When you request a file from the apache web server, say URL/something,
and something doesn't exist AND a directory exists under the same name,
httpd sends the client a redirect to SERVER_NAME/something/, thus
removing the need for people to type the trailing slash.  (The server
figures out whether you need it, and adds it if necessary; magic, eh?)

What you need to do is ensure that the ServerName directive in
httpd.conf is indeed resolvable, because that's what the client is going
to be looking for whenever httpd needs to redirect the client to itself.
Sometimes people use the IP address (ugly imho); typically I use the
domain name without the www, just because I hate typing. 

As others have alluded to, if you're trying to serve multiple domains
(or hostnames) off one IP, you use a system called software virtual
hosting.  HTTP/1.1 supports the Host: field in the HTTP header to
resolve the request to the right site.

Here's the config for the truly lazy  (at least it worked for me)

NameVirtualHost IP.address.goes.here

<VirtualHost IP.address.goes.here>
  ServerAdmin   ...
  DocumentRoot  ...
  ServerName    ...   <-- this is the part that's causing the redirect problem above
  ErrorLog      ...
  CustomLog     ...
  ErrorDocument ...
  ...  Aliases ...
  ...  ProxyPasses ...
  ...  Any other config oddities ...
</VirtualHost>

Of course all of this is in the httpd guide
http://httpd.apache.org/docs/

Search @
http://search.apache.org/docs/


-A


 -Original Message- 
  If you only have one IP and want to have many web sites (ie 
 URLs) for 
  your customers, then why don't you use VirtualHost. Then your 
  customers can either have
  
[snip]
  
  www.customer1.com
  www.customer2.com
  
 The originator of this thread didn't exactly say he had 
 control over his 
 DNS.




Re: Multiple Sites

2001-12-03 Thread Medi Montaseri


DNS hosting is about $3/month... if your DNS admin does not allow that,
simply move on to the next one...

On Tue, 4 Dec 2001, Mithun Bhattacharya wrote:

 Medi Montaseri wrote:
 
  If you only have one IP and want to have many web sites (ie URLs) for
  your customers, then why don't you use VirtualHost. Then your customers
  can either have
  
  www.customer1.xyz.com
  www.customer2.xyz.com
  
  or 
  
  www.customer1.com
  www.customer2.com
  
 
 
 
 The originator of this thread didn't exactly say he had control over his 
 DNS.
 
 

-- 
-
Medi Montaseri   [EMAIL PROTECTED]
Unix Distributed Systems EngineerHTTP://www.CyberShell.com
CyberShell Engineering
-