[ANNOUNCE] HTTP-WebTest 2.04

2003-09-05 Thread Ilya Martynov
The URL

http://martynov.org/tgz/HTTP-WebTest-2.04.tar.gz

has entered CPAN as

  file: $CPAN/authors/id/I/IL/ILYAM/HTTP-WebTest-2.04.tar.gz
  size: 90381 bytes
   md5: 16bfb8e76bf301e788241d774cab7cee


NAME
HTTP::WebTest - Testing static and dynamic web content

DESCRIPTION

This module runs tests on remote URLs containing
Perl/JSP/HTML/JavaScript/etc. and generates a detailed test report. This
module can be used as-is or its functionality can be extended using
plugins. Plugins can define test types and provide additional report
capabilities. This module comes with a set of default plugins but can be
easily extended with third party plugins.
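As an illustration (not part of the announcement), a minimal test run through the module's Perl API might look like the sketch below. It assumes HTTP::WebTest's documented `run_tests` method and the `test_name`/`url`/`text_require` parameters from the default plugins; the target URL and expected text are placeholders:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::WebTest;

# Illustrative sketch: URL and required text are placeholders.
my $webtest = HTTP::WebTest->new;
$webtest->run_tests(
    [ { test_name    => 'Homepage sanity check',
        url          => 'http://www.example.com/',
        text_require => [ 'Welcome' ],
      } ],
);
```

A run prints a report for each test, marking it passed or failed.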


CHANGES SINCE 2.03:

BUG FIXES:

* ReportPlugin.pm had a bug where it sent out an email report even if
the mail parameter was set to 'errors' and all tests passed.  Thanks
to Naoki Shima for a patch.


-- 
Ilya Martynov,  [EMAIL PROTECTED]
CTO IPonWEB (UK) Ltd
Quality Perl Programming and Unix Support
UK managed @ offshore prices - http://www.iponweb.net
Personal website - http://martynov.org



-- 
Reporting bugs: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html



RE: HTTP POST: parameters empty when using ModPerl::Registry (okay when using ModPerl:PerlRun)...

2003-08-14 Thread Steve Bannerman
Perrin,

Thanks for your response...my replies below:
--
   Steve Bannerman
   [EMAIL PROTECTED]
   44.(0)1865.273866


 -Original Message-
 From: Perrin Harkins [mailto:[EMAIL PROTECTED]
 Sent: 06 August 2003 20:40
 To: [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Subject: RE: HTTP POST: parameters empty when using
 ModPerl::Registry (okay when using ModPerl:PerlRun)...

 ...snip...

 I believe I see the source of your troubles in the code that you
 posted.  You are creating a closure by using a lexical variable and then
 accessing it from within a sub.  This is a no-no with any long-running
 system like mod_perl.  You can get away with it in a standard CGI
 environment (or PerlRun) because it just exits after each request
 instead of running the same code again.

 Here is the offending section:

 my $cgi = new CGI;
 saveFile();

 sub saveFile {
   my $inputfile = $cgi->param('file');
 ... etc ...
 }

 Change it to this:

 my $cgi = new CGI;
 saveFile($cgi);

 sub saveFile {
   my $cgi = shift;
   my $inputfile = $cgi->param('file');
 ... etc ...
 }

 I think that will do it.

You're correct...that made it work.

So with respect to your explanation about the long-running perl system, am
I to understand that the old version of the saveFile() subroutine uses a
reference to a different $cgi instance than the $cgi instance in the main
body of the script?

As I said, I'm new to perl but that seems to be an awfully strange behavior
of the language...if true, shouldn't the compilation (of the subroutine)
fail because it references an undeclared variable ($cgi)?

Cheers


 - Perrin



RE: HTTP POST: parameters empty when using ModPerl::Registry (okay when using ModPerl:PerlRun)...

2003-08-14 Thread Steve Bannerman
Perrin,

Thanks...your explanation makes sense.

I was thinking of the subroutine as a method on a class and that the objects
in the class had a cgi instance associated with them.  I was thinking in the
object paradigm rather than in the procedural paradigm.

Cheers
--
   Steve Bannerman
   [EMAIL PROTECTED]
   44.(0)1865.273866


 -Original Message-
 From: Perrin Harkins [mailto:[EMAIL PROTECTED]
 Sent: 07 August 2003 19:10
 To: [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Subject: RE: HTTP POST: parameters empty when
 using ModPerl::Registry (okay when using ModPerl:PerlRun)...


 On Thu, 2003-08-07 at 03:36, Steve Bannerman wrote:
  So with respect to your explanation about the long-running perl system, am
  I to understand that the old version of the saveFile() subroutine uses a
  reference to a different $cgi instance than the $cgi instance in the main
  body of the script?

 It uses a reference to the $cgi variable that was in scope when
 saveFile() was compiled.

  As I said, I'm new to perl but that seems to be an awfully strange behavior
  of the language...if true, shouldn't the compilation (of the subroutine)
  fail because it references an undeclared variable ($cgi)?

 But it doesn't reference an undeclared variable; it references the
 original $cgi that was available when the sub was compiled.

 Closures are a feature of Perl.  You can read about them in general in
 perlfaq7 and the perlref man page:
 http://www.perldoc.com/perl5.8.0/pod/perlfaq7.html#What's-a-closure-
 http://www.perldoc.com/perl5.8.0/pod/perlref.html

 Note that those both talk a lot about anonymous subs, but any sub can be
 a closure if it refers to a lexical variable defined in an enclosing
 scope.

 There is some mod_perl specific stuff on this here:
 http://perl.apache.org/docs/general/perl_reference/perl_reference.html#my___Scoped_Variable_in_Nested_Subroutines

If you had warnings on, you would have received a message about $cgi not
staying shared.

In brief terms, what happens is that your program creates a lexical
called $cgi, then saveFile() refers to it, locking in that variable as
the $cgi that will always be referenced by saveFile().  At the end of
the script $cgi goes out of scope and disappears, but saveFile() keeps
referencing it.

In a CGI program this is not a problem, because Perl exits and the
process quits.  In mod_perl, the code gets run again and saveFile()
still refers to the original $cgi.

There are a number of ways to solve this problem, but I prefer the one I
showed you.  Explicitly passing all arguments to subs is well
established as a best practice in programming.  What you were doing with
$cgi before is basically treating it as a global.  So, I'd suggest you
turn on warnings, turn on strict, and embrace the good practice of
passing variables to your subs.

- Perrin
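The stale-closure effect Perrin describes can be reproduced outside mod_perl. The sketch below is illustrative only (it is not from the thread): a plain string stands in for the CGI object, and the nested named sub mirrors how Registry wraps a script body inside a handler sub:

```perl
#!/usr/bin/perl
use strict;
use warnings;   # compile-time warning: Variable "$cgi" will not stay shared

# Each call stands in for one request handled by a persistent interpreter.
sub handle_request {
    my $cgi = shift;           # fresh lexical on every call
    sub save_file {            # named sub, compiled once; it closes over
        return $cgi;           # the FIRST instance of $cgi
    }
    return save_file();
}

print handle_request('request 1'), "\n";   # request 1
print handle_request('request 2'), "\n";   # still "request 1"
```

The second call still sees the first `$cgi`, which is exactly why the second upload in the original report arrived empty.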



RE: HTTP POST: parameters empty when using ModPerl::Registry (okay when using ModPerl:PerlRun)...

2003-08-14 Thread Perrin Harkins
On Thu, 2003-08-07 at 23:56, Steve Bannerman wrote:
 I was thinking of the subroutine as a method on a class and that the objects
 in the class had a cgi instance associated with them.

That's a good way of doing things, but you would have to structure your
code a little differently.  If that appeals to you, I recommend you take
a look at CGI::Application.  There's an article about it here:
http://www.perl.com/pub/a/2001/06/05/cgi.html

- Perrin



RE: HTTP POST: parameters empty when using ModPerl::Registry (okay when using ModPerl:PerlRun)...

2003-08-14 Thread Steve Bannerman
Stas,

Replies below:
--
   Steve Bannerman
   [EMAIL PROTECTED]
   44.(0)1865.273866


 -Original Message-
 From: Stas Bekman [mailto:[EMAIL PROTECTED]
 Sent: 05 August 2003 18:07
 To: [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Subject: Re: HTTP POST: parameters empty when using ModPerl::Registry
 (okay when using ModPerl:PerlRun)...


  ...snip...
 

 The docs need work; this is just a copy of the mp1 registry docs, which need
 adjustments. However, most things work the same way. The differences between
 Registry and PerlRun are easily summarized with this diff:

 ModPerl-Registry diff -u lib/ModPerl/Registry.pm lib/ModPerl/PerlRun.pm
 --- lib/ModPerl/Registry.pm 2003-03-22 20:52:24.0 -0800
 +++ lib/ModPerl/PerlRun.pm  2003-03-22 20:52:24.0 -0800
 @@ -1,4 +1,4 @@
 -package ModPerl::Registry;
 +package ModPerl::PerlRun;

   use strict;
   use warnings FATAL => 'all';
 @@ -30,11 +30,11 @@
   make_namespace  => 'make_namespace',
   namespace_root  => 'namespace_root',
   namespace_from  => 'namespace_from_filename',
 -is_cached   => 'is_cached',
 -should_compile  => 'should_compile_if_modified',
 -flush_namespace => 'NOP',
 +is_cached   => 'FALSE',
 +should_compile  => 'TRUE',
 +flush_namespace => 'flush_namespace_normal',
   cache_table => 'cache_table_common',
 -cache_it=> 'cache_it',
 +cache_it=> 'NOP',
   read_script => 'read_script',
   rewrite_shebang => 'rewrite_shebang',
   set_script_name => 'set_script_name',
 @@ -53,17 +53,10 @@

 PerlRun doesn't cache the script on each request and it flushes
 the script's
 namespace on each request. You can see the actual functions in
 lib/ModPerl/RegistryCooker.pm.

Thanks, that's helpful...it shows me why PerlRun works.

However, it doesn't really explain why the root problem exists.  The way I
think about it, the creation of a new CGI object should create a new set
of slots for instance data.  Then, each request's parameters would be
stored in a slot of the new CGI instance rather than in the global set of
slots for the class of CGI instances.

Maybe I don't understand the object paradigm in perl correctly; however, I
do understand it very well in general.  Thus, it seems like a defect in
either perl (the language) or CGI.pm.  I'm guessing there's some
justification for it in performance...however, it just doesn't seem right.

Thoughts?

 If you can try to take it from here and see
 what the problem is (your code/registry?), that would be cool. Thanks.


Unfortunately, I don't really know how to take it from here.  I'm pretty
new to perl and very new to mod_perl.  Thus I'm reaching out to you guys
to find out if anybody has solved this problem...unfortunately,
Christopher's suggestion didn't work (unless I implemented it incorrectly).

 Also make sure you are using the latest CGI.pm (2.93 or higher is good).

I'm using CGI.pm-2.98.

Cheers


 __
 Stas Bekman            JAm_pH -- Just Another mod_perl Hacker
 http://stason.org/ mod_perl Guide --- http://perl.apache.org
 mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
 http://modperlbook.org http://apache.org   http://ticketmaster.com




HTTP POST: parameters empty when using ModPerl::Registry (okay when using ModPerl:PerlRun)...

2003-08-14 Thread Steve Bannerman
All,

I apologize if this has already been covered...I looked at the archives
since May but couldn't see anything covering this (there were related items
but their solutions didn't solve this problem).

Here is an explanation of the problem:

We want to post experiment results to an upload server which is running
Apache HTTP Server (2.0.46) and mod_perl (1.99_09).  When we post a sequence
of files to the server, some of them are written to the local disk and some
are not.  That is, the test fails when using ModPerl::Registry but it
succeeds when using ModPerl::PerlRun.

In analyzing which ones work and which ones do not, I wrote a quick test to
see why the transfer is not working.  From the looks of the results, it
appears that the first request handled by a particular Apache process/thread
works and that any subsequent requests handled by that thread fail.
"Works" means that the file in the test gets saved to disk and "fail" means that
a file of size 0 gets written to disk.

Below are the httpd.conf segments (working and failing), the test client
(test_client.pl) and the test server (test_server.pl which is accessible
from the /cpdn/cgi-bin location).

Any suggestions?  Thanks in advance...

Also, does it matter if I use ModPerl::PerlRun instead of ModPerl::Registry
(I have read some about this at
http://apache.perl.org/docs/2.0/api/ModPerl/Registry.html but the
documentation there is a little light).

--
Working httpd.conf
--
<Location /cpdn/cgi-bin>
  AllowOverride None
  SetHandler perl-script
  PerlResponseHandler ModPerl::PerlRun
  PerlOptions +ParseHeaders
  Options +ExecCGI
  Allow from All
</Location>

--
Failing httpd.conf
--
<Location /cpdn/cgi-bin>
  AllowOverride None
  SetHandler perl-script
  PerlResponseHandler ModPerl::Registry
  PerlOptions +ParseHeaders
  Options +ExecCGI
  Allow from All
</Location>

--
test_client.pl
--
#!/usr/bin/perl
use strict;

use LWP::UserAgent;
use HTTP::Request::Common;

my $postUrl = $ARGV[0];
my $file = $ARGV[1];

my $postType = 'form-data';

my $ua = new LWP::UserAgent;
my $req = POST($postUrl,
   Content_Type => $postType,
   Content => [ file => [$file] ]);

my $res = $ua->request($req);
if ($res->is_success()) {
  print "POST test successful\n";
  print $res->content();
} else {
  print STDERR "POST test failed";
  print STDERR "code: " . $res->code() . "\n";
  print STDERR "message: " . $res->message() . "\n";
}

--
test_server.pl
--
use strict;
use CGI qw(:standard);

my $cgi = new CGI;
saveFile();

sub saveFile {
  my $inputfile = $cgi->param('file');
  my $outputfile = "/tmp/file-" . $$ . "-" . time();
  open(FILE, ">$outputfile") || printError();
  my $buffer;
  while (read($inputfile,$buffer,2096)) {
    print FILE $buffer;
  }
  close(FILE);
  undef $buffer;
}

sub printError {
  print header();
  print "Content-type: text/plain\n";
  print "Status: 500\n";
  print "Message: Internal Error\n";
  exit;
}

Cheers
--
   Steve Bannerman
   [EMAIL PROTECTED]
   44.(0)1865.273866



RE: HTTP POST: parameters empty when using ModPerl::Registry (okay when using ModPerl:PerlRun)...

2003-08-14 Thread Perrin Harkins
On Thu, 2003-08-07 at 03:36, Steve Bannerman wrote:
 So with respect to your explanation about the long-running perl system, am
 I to understand that the old version of the saveFile() subroutine uses a
 reference to a different $cgi instance than the $cgi instance in the main
 body of the script?

It uses a reference to the $cgi variable that was in scope when
saveFile() was compiled.

 As I said, I'm new to perl but that seems to be an awfully strange behavior
 of the language...if true, shouldn't the compilation (of the subroutine)
 fail because it references an undeclared variable ($cgi)?

But it doesn't reference an undeclared variable; it references the
original $cgi that was available when the sub was compiled.

Closures are a feature of Perl.  You can read about them in general in
perlfaq7 and the perlref man page:
http://www.perldoc.com/perl5.8.0/pod/perlfaq7.html#What's-a-closure-
http://www.perldoc.com/perl5.8.0/pod/perlref.html

Note that those both talk a lot about anonymous subs, but any sub can be
a closure if it refers to a lexical variable defined in an enclosing
scope.

There is some mod_perl specific stuff on this here:
http://perl.apache.org/docs/general/perl_reference/perl_reference.html#my___Scoped_Variable_in_Nested_Subroutines

If you had warnings on, you would have received a message about $cgi not
staying shared.

In brief terms, what happens is that your program creates a lexical
called $cgi, then saveFile() refers to it, locking in that variable as
the $cgi that will always be referenced by saveFile().  At the end of
the script $cgi goes out of scope and disappears, but saveFile() keeps
referencing it.

In a CGI program this is not a problem, because Perl exits and the
process quits.  In mod_perl, the code gets run again and saveFile()
still refers to the original $cgi.

There are a number of ways to solve this problem, but I prefer the one I
showed you.  Explicitly passing all arguments to subs is well
established as a best practice in programming.  What you were doing with
$cgi before is basically treating it as a global.  So, I'd suggest you
turn on warnings, turn on strict, and embrace the good practice of
passing variables to your subs.

- Perrin


a new page: http://perl.apache.org/bugs

2003-08-14 Thread Stas Bekman
I've just added a new page to aid finding the way to the bug reporting info.
The cool thing is that it's now easy to remember how to type that URL, and it
provides fewer excuses for not supplying the required info in bug reports.
It also now has a prominent place in the menu of perl.apache.org.

  http://perl.apache.org/bugs/

__
Stas Bekman            JAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com


Re: HTTP POST: parameters empty when using ModPerl::Registry (okay when using ModPerl:PerlRun)...

2003-08-14 Thread Stas Bekman
Steve Bannerman wrote:
All,

I apologize if this has already been covered...I looked at the archives
since May but couldn't see anything covering this (there were related items
but their solutions didn't solve this problem).
Here is an explanation of the problem:

We want to post experiment results to an upload server which is running
Apache HTTP Server (2.0.46) and mod_perl (1.99_09).  When we post a sequence
of files to the server, some of them are written to the local disk and some
are not.  That is, the test fails when using ModPerl::Registry but it
succeeds when using ModPerl::PerlRun.
In analyzing which ones work and which ones do not, I wrote a quick test to
see why the transfer is not working.  From the looks of the results, it
appears that the first request handled by a particular Apache process/thread
works and that any subsequent requests handled by that thread fail.
Works means that the file in the test gets saved to disk and fail means that
a file of size 0 gets written to disk.
Below are the httpd.conf segments (working and failing), the test client
(test_client.pl) and the test server (test_server.pl which is accessible
from the /cpdn/cgi-bin location).
Any suggestions?  Thanks in advance...

Also, does it matter if I use ModPerl::PerlRun instead of ModPerl::Registry
(I have read some about this at
http://apache.perl.org/docs/2.0/api/ModPerl/Registry.html but the
documentation there is a little light).
The docs need work; this is just a copy of the mp1 registry docs, which need
adjustments. However, most things work the same way. The differences between
Registry and PerlRun are easily summarized with this diff:

ModPerl-Registry diff -u lib/ModPerl/Registry.pm lib/ModPerl/PerlRun.pm
--- lib/ModPerl/Registry.pm 2003-03-22 20:52:24.0 -0800
+++ lib/ModPerl/PerlRun.pm  2003-03-22 20:52:24.0 -0800
@@ -1,4 +1,4 @@
-package ModPerl::Registry;
+package ModPerl::PerlRun;
 use strict;
 use warnings FATAL => 'all';
@@ -30,11 +30,11 @@
 make_namespace  => 'make_namespace',
 namespace_root  => 'namespace_root',
 namespace_from  => 'namespace_from_filename',
-is_cached   => 'is_cached',
-should_compile  => 'should_compile_if_modified',
-flush_namespace => 'NOP',
+is_cached   => 'FALSE',
+should_compile  => 'TRUE',
+flush_namespace => 'flush_namespace_normal',
 cache_table => 'cache_table_common',
-cache_it=> 'cache_it',
+cache_it=> 'NOP',
 read_script => 'read_script',
 rewrite_shebang => 'rewrite_shebang',
 set_script_name => 'set_script_name',
@@ -53,17 +53,10 @@
PerlRun doesn't cache the script on each request and it flushes the script's 
namespace on each request. You can see the actual functions in 
lib/ModPerl/RegistryCooker.pm. If you can try to take it from here and see 
what the problem is (your code/registry?), that would be cool. Thanks.

Also make sure you are using the latest CGI.pm (2.93 or higher is good).

__
Stas Bekman            JAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com


Re: http responses piped to STDERR

2003-08-07 Thread Stas Bekman
Michael Pohl wrote:
I'm (very) occasionally seeing the output of Apache::Registry scripts sent
to STDERR instead of STDOUT.  That is, the entire http response (headers
included) appears in my error log, while nothing at all is displayed to
the client.
Could someone kick me towards what I should look into here?
Do you have this stub in all your files?

use strict;
use warnings;

or if perl < 5.6:

PerlWarn On
  in httpd.conf and
use strict;
  in all files.

This should reduce the number of inconsistent mysteries to zero.

__
Stas Bekman            JAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com


RE: HTTP POST: parameters empty when using ModPerl::Registry (okay when using ModPerl:PerlRun)...

2003-08-06 Thread Christopher Knight
try
CGI->initialize_globals();
at the beginning of the script but before you use params

if you are depending on the 'use CGI' statement to initialize your params (like a
command line script), it will cause problems in Registry.  That's because it is
initialized once on the initial 'use CGI' and it stays in memory for the life
of the webserver.  So each time you use a script, you have to initialize the CGI
params to your current request.
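For reference, the workaround Christopher describes would sit at the top of the Registry script, something like the sketch below (illustrative only; it borrows the param name from Steve's script and relies on CGI.pm's initialize_globals() resetting the module's package-level state):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

# Reset CGI.pm's package-level globals left over from the previous
# request served by this persistent interpreter.
CGI->initialize_globals();

my $cgi = CGI->new;
my $inputfile = $cgi->param('file');
```

As the follow-up below shows, this alone did not fix the reported problem, because the real culprit was the closure over $cgi rather than CGI.pm's globals.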

-Original Message-
From: Stas Bekman [mailto:[EMAIL PROTECTED]
Sent: Tuesday, August 05, 2003 12:07 PM
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: Re: HTTP POST: parameters empty when using ModPerl::Registry
(okay when using ModPerl:PerlRun)...


Steve Bannerman wrote:
 All,

 I apologize if this has already been covered...I looked at the archives
 since May but couldn't see anything covering this (there were related items
 but their solutions didn't solve this problem).

 Here is an explanation of the problem:

 We want to post experiment results to an upload server which is running
 Apache HTTP Server (2.0.46) and mod_perl (1.99_09).  When we post a sequence
 of files to the server, some of them are written to the local disk and some
 are not.  That is, the test fails when using ModPerl::Registry but it
 succeeds when using ModPerl::PerlRun.

 In analyzing which ones work and which ones do not, I wrote a quick test to
 see why the transfer is not working.  From the looks of the results, it
 appears that the first request handled by a particular Apache process/thread
 works and that any subsequent requests handled by that thread fail.
 Works means that the file in the test gets saved to disk and fail means that
 a file of size 0 gets written to disk.

 Below are the httpd.conf segments (working and failing), the test client
 (test_client.pl) and the test server (test_server.pl which is accessible
 from the /cpdn/cgi-bin location).

 Any suggestions?  Thanks in advance...

 Also, does it matter if I use ModPerl::PerlRun instead of ModPerl::Registry
 (I have read some about this at
 http://apache.perl.org/docs/2.0/api/ModPerl/Registry.html but the
 documentation there is a little light).

The docs need work; this is just a copy of the mp1 registry docs, which need
adjustments. However, most things work the same way. The differences between
Registry and PerlRun are easily summarized with this diff:

ModPerl-Registry diff -u lib/ModPerl/Registry.pm lib/ModPerl/PerlRun.pm
--- lib/ModPerl/Registry.pm 2003-03-22 20:52:24.0 -0800
+++ lib/ModPerl/PerlRun.pm  2003-03-22 20:52:24.0 -0800
@@ -1,4 +1,4 @@
-package ModPerl::Registry;
+package ModPerl::PerlRun;

  use strict;
  use warnings FATAL => 'all';
@@ -30,11 +30,11 @@
  make_namespace  => 'make_namespace',
  namespace_root  => 'namespace_root',
  namespace_from  => 'namespace_from_filename',
-is_cached   => 'is_cached',
-should_compile  => 'should_compile_if_modified',
-flush_namespace => 'NOP',
+is_cached   => 'FALSE',
+should_compile  => 'TRUE',
+flush_namespace => 'flush_namespace_normal',
 cache_table => 'cache_table_common',
-cache_it=> 'cache_it',
+cache_it=> 'NOP',
 read_script => 'read_script',
 rewrite_shebang => 'rewrite_shebang',
 set_script_name => 'set_script_name',
@@ -53,17 +53,10 @@

PerlRun doesn't cache the script on each request and it flushes the script's
namespace on each request. You can see the actual functions in
lib/ModPerl/RegistryCooker.pm. If you can try to take it from here and see
what the problem is (your code/registry?), that would be cool. Thanks.

Also make sure you are using the latest CGI.pm (2.93 or higher is good).

__
Stas Bekman            JAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com




RE: HTTP POST: parameters empty when using ModPerl::Registry (okay when using ModPerl:PerlRun)...

2003-08-06 Thread Steve Bannerman
Christopher,

Thanks for the suggestion; unfortunately, it doesn't work.  I made the
change you suggested (inserting CGI->initialize_globals(); just before
creating an instance of CGI) and restarted apache/httpd.  The same
result...the first time the script executes it saves the file
properly...after that, a file is created with 0 size.

Besides, as you (and others prescribing the use of initialize_globals())
described it, shouldn't subsequent executions of the script write the same
file as the first execution?  That is, if the parameters of the CGI
instances are actually global, wouldn't the same array of bytes still be in
the global 'file' parameter?

Cheers
--
   Steve Bannerman
   [EMAIL PROTECTED]
   44.(0)1865.273866


 -Original Message-
 From: Christopher Knight [mailto:[EMAIL PROTECTED]
 Sent: 05 August 2003 18:20
 To: Stas Bekman; [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Subject: RE: HTTP POST: parameters empty when using ModPerl::Registry
 (okay when using ModPerl:PerlRun)...


 try
 CGI->initialize_globals();
 at the beginning of the script but before you use params

 if you are depending on the 'use CGI' statement to initialize
 your params (like a command line script), it will cause
 problems in Registry.  That's because it is initialized once on
 the initial 'use CGI' and it stays in memory for the life
 of the webserver.  So each time you use a script, you have to
 initialize the CGI params to your current request.

 -Original Message-
 From: Stas Bekman [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, August 05, 2003 12:07 PM
 To: [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Subject: Re: HTTP POST: parameters empty when using ModPerl::Registry
 (okay when using ModPerl:PerlRun)...


 Steve Bannerman wrote:
  All,
 
  I apologize if this has already been covered...I looked at the archives
  since May but couldn't see anything covering this (there were
 related items
  but their solutions didn't solve this problem).
 
  Here is an explanation of the problem:
 
  We want to post experiment results to an upload server which
 is running
  Apache HTTP Server (2.0.46) and mod_perl (1.99_09).  When we
 post a sequence
  of files to the server, some of them are written to the local
 disk and some
  are not.  That is, the test fails when using ModPerl::Registry but it
  succeeds when using ModPerl::PerlRun.
 
  In analyzing which ones work and which ones do not, I wrote a
 quick test to
  see why the transfer is not working.  From the looks of the results, it
  appears that the first request handled by a particular Apache
 process/thread
  works and that any subsequent requests handled by that thread fail.
  Works means that the file in the test gets saved to disk and
 fail means that
  a file of size 0 gets written to disk.
 
  Below are the httpd.conf segments (working and failing), the test client
  (test_client.pl) and the test server (test_server.pl which is accessible
  from the /cpdn/cgi-bin location).
 
  Any suggestions?  Thanks in advance...
 
  Also, does it matter if I use ModPerl::PerlRun instead of
 ModPerl::Registry
  (I have read some about this at
  http://apache.perl.org/docs/2.0/api/ModPerl/Registry.html but the
  documentation there is a little light).

 The docs need work; this is just a copy of the mp1 registry docs, which need
 adjustments. However, most things work the same way. The differences between
 Registry and PerlRun are easily summarized with this diff:

 ModPerl-Registry diff -u lib/ModPerl/Registry.pm lib/ModPerl/PerlRun.pm
 --- lib/ModPerl/Registry.pm 2003-03-22 20:52:24.0 -0800
 +++ lib/ModPerl/PerlRun.pm  2003-03-22 20:52:24.0 -0800
 @@ -1,4 +1,4 @@
 -package ModPerl::Registry;
 +package ModPerl::PerlRun;

   use strict;
   use warnings FATAL => 'all';
 @@ -30,11 +30,11 @@
   make_namespace  => 'make_namespace',
   namespace_root  => 'namespace_root',
   namespace_from  => 'namespace_from_filename',
 -is_cached   => 'is_cached',
 -should_compile  => 'should_compile_if_modified',
 -flush_namespace => 'NOP',
 +is_cached   => 'FALSE',
 +should_compile  => 'TRUE',
 +flush_namespace => 'flush_namespace_normal',
   cache_table => 'cache_table_common',
 -cache_it=> 'cache_it',
 +cache_it=> 'NOP',
   read_script => 'read_script',
   rewrite_shebang => 'rewrite_shebang',
   set_script_name => 'set_script_name',
 @@ -53,17 +53,10 @@

 PerlRun doesn't cache the script on each request and it flushes
 the script's
 namespace on each request. You can see the actual functions in
 lib/ModPerl/RegistryCooker.pm. If you can try to take it from here and see
 what the problem is (your code/registry?), that would be cool. Thanks.

 Also make sure you are using the latest CGI.pm (2.93 or higher is good).

 __
 Stas Bekman            JAm_pH -- Just Another mod_perl Hacker
 http://stason.org/ mod_perl Guide --- http

RE: HTTP POST: parameters empty when using ModPerl::Registry (okay when using ModPerl:PerlRun)...

2003-08-06 Thread Perrin Harkins
On Wed, 2003-08-06 at 04:50, Steve Bannerman wrote:
 However, it doesn't really explain why the root problem exists.  The way I
 think about it, the creation of a new CGI object should create a new set
 of slots for instance data.

That would make sense, but very little about CGI.pm actually works in
the way you would expect.  It's a very bizarre module because of the
dual functional and object interface, and it uses lots of globals even
if you are only calling the OO interface.  If possible, I would suggest
you consider using CGI::Simple instead, which is a drop-in replacement.
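A CGI::Simple version of the handler would look roughly like the sketch below. This is illustrative only; it assumes CGI::Simple's CGI.pm-compatible OO interface (`new`, `param`) and keeps the explicit argument passing recommended later in the thread:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI::Simple;

my $q = CGI::Simple->new;
save_file($q);                      # pass the object explicitly -- no closure

sub save_file {
    my $q = shift;
    my $inputfile = $q->param('file');
    # ... write $inputfile to disk as before ...
}
```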

 Maybe I don't understand the object paradigm in perl correctly; however, I
 do understand it very well in general.  Thus, it seems like a defect in
 either perl (the language) or CGI.pm.

It's a problem with CGI.pm, not with your understanding of Perl OO.

I believe I see the source of your troubles in the code that you
posted.  You are creating a closure by using a lexical variable and then
accessing it from within a sub.  This is a no-no with any long-running
system like mod_perl.  You can get away with it in a standard CGI
environment (or PerlRun) because it just exits after each request
instead of running the same code again.

Here is the offending section:

my $cgi = new CGI;
saveFile();

sub saveFile {
  my $inputfile = $cgi->param('file');
... etc ...
}

Change it to this:

my $cgi = new CGI;
saveFile($cgi);

sub saveFile {
  my $cgi = shift;
  my $inputfile = $cgi->param('file');
... etc ...
}

I think that will do it.

- Perrin
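Perrin's fix works because the sub no longer closes over anything. A standalone sketch of the pattern (illustrative; plain strings stand in for the CGI object):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Each call stands in for one request in a persistent interpreter.
sub handle_request {
    my $cgi = shift;
    return save_file($cgi);    # pass the current object explicitly
}

sub save_file {
    my $cgi = shift;           # no closure: always the caller's $cgi
    return $cgi;
}

print handle_request('request 1'), "\n";   # request 1
print handle_request('request 2'), "\n";   # request 2
```

Every request now sees its own object, which is why the rewritten script saved each uploaded file correctly.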


http responses piped to STDERR

2003-07-29 Thread Michael Pohl
I'm (very) occasionally seeing the output of Apache::Registry scripts sent
to STDERR instead of STDOUT.  That is, the entire http response (headers
included) appears in my error log, while nothing at all is displayed to
the client.

Could someone kick me towards what I should look into here?

thanks,

michael



authentication realms in http and https

2003-07-18 Thread Jason Fong
We recently upgraded our webserver from 
Apache 1.3.6 / modperl 1.19
to
Apache 1.3.27 / modperl 1.27

We use a .htaccess file in a directory to have a modperl script do
authentication for directory access (for downloading files, etc.).  It
also redirects the user from http to https if he does not come in
through https.

On our old server, the user would only see the browser's login box once
when he came in through http and was redirected to https.  On the new
server, however, the user has to login twice.  But if the user comes in
through https on the new server, there is only one login.

So my guess is that the new server is not treating an authentication
realm in http as the same as one in https.

So, my question is... Is this different treatment of the http/https
authentication realms something that changed in the newer version of
modperl (or possibly apache)?  Or is this something that can be changed
through configuration options?  (and also... is my analysis even
correct? :) )  Thanks!


-Jason Fong



[ANNOUNCE] HTTP-WebTest 2.03

2003-07-14 Thread Ilya Martynov

The URL

http://martynov.org/tgz/HTTP-WebTest-2.03.tar.gz

has entered CPAN as

  file: $CPAN/authors/id/I/IL/ILYAM/HTTP-WebTest-2.03.tar.gz
  size: 90135 bytes
   md5: cc49ade2d6955fa20dd30b1f88862943



NAME
HTTP::WebTest - Testing static and dynamic web content

DESCRIPTION

This module runs tests on remote URLs containing
Perl/JSP/HTML/JavaScript/etc. and generates a detailed test report. This
module can be used as-is or its functionality can be extended using
plugins. Plugins can define test types and provide additional report
capabilities. This module comes with a set of default plugins but can be
easily extended with third party plugins.



CHANGES SINCE LAST STABLE VERSION 2.02:

ENHANCEMENTS:

* New test parameters 'mail_success_subject' and
'mail_failure_subject' to redefine the default value of the Subject
field in test report emails.  Based on a patch by Amit Kaul.

BUG FIXES:

* HTTP::WebTest used to mangle test URLs like
'http://website.com?http://website2.com?var=val' by URL-escaping
everything after the first question mark. Now it does not modify the
test URL unless it is doing a GET request and the test parameter
'params' is specified in the test specification.  Thanks to Brian Webb
for the bug report.
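The mangling described above can be reproduced with a few lines of plain Perl; the escaping function below is an illustrative stand-in for what the old behavior amounted to, not HTTP::WebTest's actual implementation:

```perl
use strict;
use warnings;

# Naively URI-escape everything after the first '?' -- this corrupts
# test URLs that embed a second URL in the query part.
sub naive_escape_query {
    my ($url) = @_;
    my ($base, $query) = split /\?/, $url, 2;
    return $base unless defined $query;
    $query =~ s/([^A-Za-z0-9_.~-])/sprintf('%%%02X', ord $1)/ge;
    return "$base?$query";
}

print naive_escape_query('http://website.com?http://website2.com?var=val'), "\n";
# http://website.com?http%3A%2F%2Fwebsite2.com%3Fvar%3Dval
```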


-- 
Ilya Martynov,  [EMAIL PROTECTED]
CTO IPonWEB (UK) Ltd
Quality Perl Programming and Unix Support
UK managed @ offshore prices - http://www.iponweb.net
Personal website - http://martynov.org



RE: Convert Cookies--HTTP Request Headers?

2003-04-06 Thread Kruse, Matt





From: Brian Reichert
Ok, I'm confused: the cookies are already in the request header,
and you want to 'convert' them into a request header?


Well, yes. Two reasons:
1) In the real production environment, the cookie is encrypted and validated against a database with each request. My app knows nothing about the cookie. All it ever sees is the request headers.

2) I wanted to use a cookie simply because it's the easiest way to dynamically control the contents of the headers to be sent, and the easiest way I could think of that would work with a login page.

I assumed people would think it was an odd request, but it does make sense :)


From: Juha-Mikko Ahonen
Why rename NAME to HTTP_NAME? Or do you want the cookie contents to appear
in the subprocess environment (which has a similar naming convention), like
other server variables?


Actually, this was an oversight, I'm used to CGI!!


 2. Writing some sample code :)
package Your::SSOHandler;


Thank you! This is exactly the kind of example I needed. Will test ASAP, and adjust to fit my specific needs. I'm quite familiar with Perl, it's mainly the API's that I'm clueless about. Your code makes sense and at least points me in exactly the right direction.

For testing you could make the handler module stat and evaluate the contents
of an external Perl file. Put your code in the file to be evaluated,
and avoid restarts.


True, that would work also, but it would still require modifying a file each time.
With this cookie solution, I can create a fake login page which will set the appropriate cookies in JavaScript and also allow for simulating logout by clearing the cookie.

Matt





Re: Convert Cookies--HTTP Request Headers?

2003-04-05 Thread Juha-Mikko Ahonen
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

On Saturday 05 April 2003 00:10, Kruse, Matt wrote:
 For every request to Apache:
   1. Parse the cookie coming in via the request header
   2. Pull out each value (ex: NAME=bob;TITLE=boss)
   3. Convert them to HTTP Request Headers
   4. Pass the request on to the requested resource (a script of some
 sort)

You'd need to write a PerlHeaderParserHandler for that.

 So, if I have a cookie like: NAME=bob;TITLE=boss
 My program would then see the following headers in the request:
   HTTP_NAME=bob
   HTTP_TITLE=boss

Why rename NAME to HTTP_NAME? Or do you want the cookie contents to appear 
in the subprocess environment (which has a similar naming convention), like 
other server variables?

 This will help me simulate a Single-Sign-On situation where the
 authentication handler passes all authenticated user information to
 the resource via headers.

 Can anyone help me by either:
   1. Giving an outline of what handlers I would want to use, and how
 I can write request headers with them
 or

The header parse phase would be ideal, since you're parsing headers. 
PerlInitHandler is an alias for PerlHeaderParserHandler in .htaccess files.

   2. Writing some sample code :)

package Your::SSOHandler;

use strict;
use Apache::Constants qw(:common);
use Apache::Cookie;

sub handler {
    my $r = shift;
    my $in = $r->headers_in;
    return DECLINED unless $in->{'Cookie'};
    my $cookies = Apache::Cookie->parse($in->{'Cookie'});
    return DECLINED unless $cookies->{'YourAuthenticationCookie'};

    my %values = $cookies->{'YourAuthenticationCookie'}->value;
    my $env = $r->subprocess_env;

    while (my ($key, $value) = each %values) {
        my $h_key = 'HTTP_' . uc($key);
        $in->{$h_key} = $value;
        $env->{$h_key} = $value;
    }

    return OK;
}

1;

In httpd.conf (or .htaccess), put the following lines where appropriate:

PerlModule Your::SSOHandler
PerlHeaderParserHandler Your::SSOHandler

Or something like that. Cutting and pasting may cause parse errors on 
incompatible windowing environments :)

 NOTES:
   1. I'm running Apache 2.0 and mod_perl 2 right now, but I can bump
 it down if required

I don't know much about the differences in mod_perl 1 vs 2. These 
handlers work at least for Apache/mod_perl 1.

   2. I've already used mod_headers to simulate this, but
 unfortunately that isn't dynamic enough for testing, ie, I need to
 change httpd.conf and re-start the server to test different header
 scenarios.

For testing you could make the handler module stat and evaluate the contents 
of an external Perl file. Put your code in the file to be evaluated, 
and avoid restarts.
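The stat-and-evaluate trick boils down to Perl's `do FILE`, which recompiles the file's contents on every call. A self-contained sketch (a temp file stands in for the real handler code):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Write some "handler code" to a file, then evaluate it with do().
# Editing the file changes behavior with no server restart needed.
my ($fh, $file) = tempfile(SUFFIX => '.pl', UNLINK => 1);
print {$fh} "21;\n";     # last expression is the file's return value
close $fh;

my $result = do $file;   # compiles and runs the file afresh
die "do failed: $@ $!" unless defined $result;
print $result * 2, "\n";
```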

Or simply sending SIGUSR1 to the Apache parent process should be enough 
for it to restart child processes and reread configuration.
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.0.6 (GNU/Linux)
Comment: For info see http://www.gnupg.org

iD8DBQE+jp7eWD8Ca88cV68RAuBAAJ9u0KWd2bAsHrYes/DXtareCYi00gCgkIEC
o8OTRNmghIHRUhJZAqX+gbs=
=YCIq
-END PGP SIGNATURE-



Re: Convert Cookies--HTTP Request Headers?

2003-04-05 Thread Michael Robinton
On Fri, Apr 04, 2003 at 04:10:03PM -0500, Kruse, Matt wrote:
 I have a unique need purely for testing purposes. I'm not very familiar
 (yet) with mod_perl handlers in Apache, so I've had a rough time
getting
 anything going.
 Here is my goal:

 For every request to Apache:
   1. Parse the cookie coming in via the request header
   2. Pull out each value (ex: NAME=bob;TITLE=boss)
   3. Convert them to HTTP Request Headers

Ok, I'm confused: the cookies are already in the request header,
and you want to 'convert' them into a request header?

   4. Pass the request on to the requested resource (a script of some
sort)

 So, if I have a cookie like: NAME=bob;TITLE=boss
 My program would then see the following headers in the request:
   HTTP_NAME=bob
   HTTP_TITLE=boss

If you're using an Apache handler, see Apache::Cookie for unpeeling
cookies.

If you're running a classic CGI program, see CGI::Cookie for unpeeling
cookies.

 This will help me simulate a Single-Sign-On situation where the
 authentication handler passes all authenticated user information to the
 resource via headers.

When you say 'HTTP request headers', did you really mean to say 'CGI
parameters', as the CGI module uses the term?

 Thanks!

 Matt Kruse

Also see:   Apache::FakeCookie on CPAN

for testing cookies without having to load httpd. It replaces the httpd
server for generating cookie responses during development and testing of
Apache Perl modules.

Michael



Convert Cookies--HTTP Request Headers?

2003-04-04 Thread Kruse, Matt





I have a unique need purely for testing purposes. I'm not very familiar (yet) with mod_perl handlers in Apache, so I've had a rough time getting anything going.

Here is my goal:


For every request to Apache:
 1. Parse the cookie coming in via the request header
 2. Pull out each value (ex: NAME=bob;TITLE=boss)
 3. Convert them to HTTP Request Headers
 4. Pass the request on to the requested resource (a script of some sort)


So, if I have a cookie like: NAME=bob;TITLE=boss
My program would then see the following headers in the request:
 HTTP_NAME=bob
 HTTP_TITLE=boss


This will help me simulate a Single-Sign-On situation where the authentication handler passes all authenticated user information to the resource via headers.
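The NAME=bob;TITLE=boss mapping in steps 1-3 is just string munging, and can be sketched in plain Perl (no Apache involved; the function name is illustrative):

```perl
use strict;
use warnings;

# Split a cookie string into pairs and prefix each name with HTTP_,
# mirroring the header names the backend script expects to see.
sub cookie_to_headers {
    my ($cookie) = @_;
    my %headers;
    for my $pair (split /;\s*/, $cookie) {
        my ($name, $value) = split /=/, $pair, 2;
        $headers{ 'HTTP_' . uc($name) } = $value;
    }
    return %headers;
}

my %h = cookie_to_headers('NAME=bob;TITLE=boss');
print "$_=$h{$_}\n" for sort keys %h;
# HTTP_NAME=bob
# HTTP_TITLE=boss
```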

Can anyone help me by either:
 1. Giving an outline of what handlers I would want to use, and how I can write request headers with them
or
 2. Writing some sample code :)


NOTES:
 1. I'm running Apache 2.0 and mod_perl 2 right now, but I can bump it down if required
 2. I've already used mod_headers to simulate this, but unfortunately that isn't dynamic enough for testing, ie, I need to change httpd.conf and re-start the server to test different header scenarios.

Thanks!


Matt Kruse





Re: Convert Cookies--HTTP Request Headers?

2003-04-04 Thread Brian Reichert
On Fri, Apr 04, 2003 at 04:10:03PM -0500, Kruse, Matt wrote:
 I have a unique need purely for testing purposes. I'm not very familiar
 (yet) with mod_perl handlers in Apache, so I've had a rough time getting
 anything going.
 Here is my goal:
 
 For every request to Apache:
   1. Parse the cookie coming in via the request header
   2. Pull out each value (ex: NAME=bob;TITLE=boss)
   3. Convert them to HTTP Request Headers

Ok, I'm confused: the cookies are already in the request header,
and you want to 'convert' them into a request header?

   4. Pass the request on to the requested resource (a script of some sort)
 
 So, if I have a cookie like: NAME=bob;TITLE=boss
 My program would then see the following headers in the request:
   HTTP_NAME=bob
   HTTP_TITLE=boss

If you're using an Apache handler, see Apache::Cookie for unpeeling
cookies.

If you're running a classic CGI program, see CGI::Cookie for unpeeling
cookies.

 This will help me simulate a Single-Sign-On situation where the
 authentication handler passes all authenticated user information to the
 resource via headers.

When you say 'HTTP request headers', did you really mean to say 'CGI
parameters', as the CGI module uses the term?

 Thanks!
 
 Matt Kruse

-- 
Brian 'you Bastard' Reichert[EMAIL PROTECTED]
37 Crystal Ave. #303Daytime number: (603) 434-6842
Derry NH 03038-1713 USA BSD admin/developer at large


Re: [ANNOUNCE] HTTP-WebTest 2.02

2003-03-24 Thread Ilya Martynov

The URL

http://martynov.org/tgz/HTTP-WebTest-2.02.tar.gz

has entered CPAN as

  file: $CPAN/authors/id/I/IL/ILYAM/HTTP-WebTest-2.02.tar.gz
  size: 87462 bytes
   md5: 20478775a4bafb6c5cad2ca1fcd4e9ea


NAME
HTTP::WebTest - Testing static and dynamic web content

DESCRIPTION

This module runs tests on remote URLs containing
Perl/JSP/HTML/JavaScript/etc. and generates a detailed test report. This
module can be used as-is or its functionality can be extended using
plugins. Plugins can define test types and provide additional report
capabilities. This module comes with a set of default plugins but can be
easily extended with third party plugins.



CHANGES SINCE LAST STABLE VERSION 2.01:

BUG FIXES:

* Fixed a bug where test reports were not sent when multiple email
addresses were specified as test report recipients (test parameter
'mail_addresses').  Thanks to Amit Kaul for the bug report and a patch.

* New versions of LWP add a Client-Peer header to all responses, which
breaks one of the tests in the self-testing suite.  This test was
supposed to be fixed by the previous release but apparently it wasn't.


-- 
Ilya Martynov,  [EMAIL PROTECTED]
CTO IPonWEB (UK) Ltd
Quality Perl Programming and Unix Support
UK managed @ offshore prices - http://www.iponweb.net
Personal website - http://martynov.org


[Http-webtest-general] [ANNOUNCE] HTTP-WebTest-Plugin-TagAttTest-1.00

2003-03-14 Thread Ed Fancher



The uploaded file HTTP-WebTest-Plugin-TagAttTest-1.00.tar.gz has entered CPAN as

  file: $CPAN/authors/id/E/EF/EFANCHE/HTTP-WebTest-Plugin-TagAttTest-1.00.tar.gz
  size: 5312 bytes
   md5: 940013aada679fdc09757f119d70686e


NAME
HTTP::WebTest::Plugin::TagAttTest - WebTest plugin providing a higher-level
tag and attribute search interface.

DESCRIPTION
See also http://search.cpan.org/search?query=HTTP%3A%3AWebTest&mode=all

This module is a plugin extending the functionality of the WebTest module to
allow tests of the form:
my $webpage = 'http://www.ethercube.net';
my @result;

@result = (@result, { test_name  => "title junk",
                      url        => $webpage,
                      tag_forbid => [{ tag => "title", tag_text => "junk" }] });
@result = (@result, { test_name   => "title test page",
                      url         => $webpage,
                      tag_require => [{ tag => "title", text => "test page" }] });
@result = (@result, { test_name  => "type att with xml in value",
                      url        => $webpage,
                      tag_forbid => [{ attr => "type", attr_text => "xml" }] });
@result = (@result, { test_name   => "type class with body in value",
                      url         => $webpage,
                      tag_require => [{ attr => "class", attr_text => "body" }] });
@result = (@result, { test_name   => "class att",
                      url         => $webpage,
                      tag_require => [{ attr => "class" }] });
@result = (@result, { test_name  => "script tag",
                      url        => $webpage,
                      tag_forbid => [{ tag => "script" }] });
@result = (@result, { test_name  => "script tag with attribute language=javascript",
                      url        => $webpage,
                      tag_forbid => [{ tag => "script", attr => "language",
                                       attr_text => "javascript" }] });

my $tests = \@result;

my $params = {
    plugins => ["::FileRequest", "HTTP::WebTest::Plugin::TagAttTest"]
};
my $webtest = HTTP::WebTest->new;
check_webtest(webtest => $webtest, tests => $tests, opts => $params,
              check_file => 't/test.out/1.out');
#$webtest->run_tests($tests, $params);

Ed Fancher
Ethercube Solutions
http://www.ethercube.net
PHP, Perl, MySQL, JavaScript solutions.


[mp2] changing http:// to https: in TransHandler

2003-03-08 Thread beau
Hi -

I'm not much of a mod_perl scripter (yet), but having been
totally defeated by mod_rewrite, I am trying to use mod_perl
to push clients into using https when accessing a particular
server (I am using name-based virtual hosting).

I want to do something like this (the real one will be
more complicated - but this is a baby test):

-in httpd.conf-

PerlTransHandler +MyApache::ForceSecure

-handler-

package MyApache::ForceSecure;
use strict;
use warnings;
use Apache::RequestRec ();
use Apache::Const -compile => qw(DECLINED);

sub handler 
{
  my $r = shift;
  my $url = $r->url;
  if ($url =~ m{^http://bcbk}i) {
    $url =~ s/^http:/https:/i;
    $r->url($url);
  }
  return Apache::DECLINED;
}
1;

Which is great, but there is *no* $r->url. I know there is a $r->uri, but
how can I get to the whole ball of wax, starting from http://...? I can't find
it in the docs.

Aloha = Beau;





Re: [mp2] changing http:// to https: in TransHandler

2003-03-08 Thread Nick Tonkin
On Sat, 8 Mar 2003 [EMAIL PROTECTED] wrote:

 Hi -

 I'm not much of a mod_perl scripter (yet), but having been
 totally defeated my mod_rewrite, I am trying to use mod_perl
 to push clients into using https when accessing a particular
 server (I am using named-based virtual hosting).

 I want to do something like this (the real one will be
 more complicated - but this is a baby test):

 -in httpd.conf-

 PerlTransHandler +MyApache::ForceSecure

 -handler-

 package MyApache::ForceSecure;
 use strict;
 use warnings;
 use Apache::RequestRec ();
 use Apache::Const -compile => qw(DECLINED);

 sub handler
 {
   my $r = shift;
   my $url = $r->url;
   if ($url =~ m{^http://bcbk}i) {
     $url =~ s/^http:/https:/i;
     $r->url($url);
   }
   return Apache::DECLINED;
 }
 1;

 Which is great, but there is *no* $r->url. I know there is a $r->uri, but
 how can I get to the whole ball of wax, starting from http://...? I can't find
 it in the docs.

 Aloha = Beau;

Beau:

I _just_ went through this on my system. You would probably want to use
the following to change the URI as you wish:

my $uri = APR::URI->parse($r->pool, $r->construct_url);
$uri->scheme('https');
my $new_uri = $uri->unparse;

However, the overall strategy is probably not what you want, due to the
way SSL works. When a browser requests a secure connection, the SSL
connection (to the secure port) is established _before_ even the HTTP
connection. Thus it is impossible to change the scheme (http vs https)
once you have arrived at your server. The only way to do this with a Perl
handler is to generate a 302 external redirect.

mod_rewrite can be complicated, sure, but I do think it's the way to
go in this situation. You need:

- two sub-domains in DNS, let's say www.my_domain.com and secure.my_domain.com
- a sub-directory /secure in your webdocs root (or something else able to be
matched with a regex)
- the following in your httpd.conf:

Listen 80
Listen 443
NameVirtualHost 12.34.56.789:80
NameVirtualHost 12.34.56.789:443

<VirtualHost 12.34.56.789:80>

ServerName   www.my_domain.com
RewriteEngine   on
RewriteCond  %{REQUEST_URI}  /secure/
RewriteRule  ^/(.*)$   https://secure.my_domain.com/$1 [R,L]

</VirtualHost>

<VirtualHost 12.34.56.789:443>

ServerName   secure.my_domain.com
RewriteEngine   on
RewriteCond  %{REQUEST_URI}  !/secure
RewriteRule  ^/(.*)$   http://www.my_domain.com/$1 [R,L]

</VirtualHost>

This allows you to have relative links on all your pages. All links on
www.my_domain.com will point to http://www. on port 80, and all links on
secure.my_domain.com will point to https://secure. on port 443. The server
will simply rewrite and redirect all links that do not match either
/secure/ or !/secure.
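The two rewrite rules can be sanity-checked outside Apache with a plain-Perl model of the mapping (same example hostnames; this mirrors the RewriteCond/RewriteRule pairs, it is not what mod_rewrite itself executes):

```perl
use strict;
use warnings;

# Model of the two rules: force /secure/ onto the SSL host, and bounce
# everything else off the SSL host back to the plain one.
sub rewrite {
    my ($host, $path) = @_;
    if ($host eq 'www.my_domain.com' && $path =~ m{^/secure/}) {
        return "https://secure.my_domain.com$path";
    }
    if ($host eq 'secure.my_domain.com' && $path !~ m{^/secure}) {
        return "http://www.my_domain.com$path";
    }
    return;   # no redirect; serve as-is
}

print rewrite('www.my_domain.com', '/secure/login.html'), "\n";
# https://secure.my_domain.com/secure/login.html
```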

Hope this helps,

- nick

PS If you have more than one domain needing to use https, you can put it
on an arbitrary port so long as you configure the server (not apache) to
listen on it, and then hard-code the port number in the mod_rewrite rule.

-- 


Nick Tonkin   {|8^)



Re: [mp2] changing http:// to https: in TransHandler

2003-03-08 Thread beau
On 8 Mar 2003 at 6:45, Nick Tonkin wrote:

 On Sat, 8 Mar 2003 [EMAIL PROTECTED] wrote:
 
  Hi -
 
  I'm not much of a mod_perl scripter (yet), but having been
  totally defeated my mod_rewrite, I am trying to use mod_perl
  to push clients into using https when accessing a particular
  server (I am using named-based virtual hosting).
  [...]
 
 I _just_ went through this on my system. You would probably want to use
 the following to change the URI as you wish:
 
 my $uri = APR::URI->parse($r->pool, $r->construct_url);
 $uri->scheme('https');
 my $new_uri = $uri->unparse;
 
 However, the overall strategy is probably not what you want, due to the
 way SSL works. When a browser requests a secure connection, the SSL
 connection (to the secure port) is established _before_ even the HTTP
 connection. Thus it is impossible to change the scheme (http vs https)
 once you have arrived at your server. The only way to do this with a Perl
 handler is to generate a 302 external redirect.
 
 mod_rewrite can be complicated, sure, but I do think it's the way to
 go in this situation. You need:
 [...] 
 
 Nick Tonkin   {|8^)
 

Thank you Nick for your detailed and informative reply! Back to mod_rewrite ;)
I'll see if I can get thru the virtual host/mod_rewrite maze...and
let you know.

Thanks and Aloha = Beau;





Re: [mp2] changing http:// to https: in TransHandler

2003-03-08 Thread Jason Galea
sorry if OT..

Hi Nick,

please tell me I'm wrong (I'll be a happy camper), but I thought that you 
couldn't use name-based virtual hosts for SSL.

Name-based virtual hosting requires HTTP/1.1, which supplies a Host header so 
the server can tell which virtual server you want. With SSL this header is 
encrypted, so Apache can't read it to know which virtual server it's for.

Or does it work this way by defaulting to the first virtual server listening 
on port 443?

Or is Apache2 doing something funky to make this work?

..again, I really would like to be wrong about this. I host from home on ADSL 
and thought I'd have to pay for more IP's if I wanted to secure a section of 
my site.

J

Nick Tonkin wrote:
On Sat, 8 Mar 2003 [EMAIL PROTECTED] wrote:


Hi -

I'm not much of a mod_perl scripter (yet), but having been
totally defeated my mod_rewrite, I am trying to use mod_perl
to push clients into using https when accessing a particular
server (I am using named-based virtual hosting).
I want to do something like this (the real one will be
more complicated - but this is a baby test):
-in httpd.conf-

PerlTransHandler +MyApache::ForceSecure

-handler-

package MyApache::ForceSecure;
use strict;
use warnings;
use Apache::RequestRec ();
use Apache::Const -compile => qw(DECLINED);
sub handler
{
 my $r = shift;
 my $url = $r->url;
 if ($url =~ m{^http://bcbk}i) {
   $url =~ s/^http:/https:/i;
   $r->url($url);
 }
 return Apache::DECLINED;
}
1;
Which is great, but there is *no* $r->url. I know there is a $r->uri, but
how can I get to the whole ball of wax, starting from http://...? I can't find
it in the docs.
Aloha = Beau;


Beau:

I _just_ went through this on my system. You would probably want to use
the following to change the URI as you wish:
my $uri = APR::URI->parse($r->pool, $r->construct_url);
$uri->scheme('https');
my $new_uri = $uri->unparse;
However, the overall strategy is probably not what you want, due to the
way SSL works. When a browser requests a secure connection, the SSL
connection (to the secure port) is established _before_ even the HTTP
connection. Thus it is impossible to change the scheme (http vs https)
once you have arrived at your server. The only way to do this with a Perl
handler is to generate a 302 external redirect.
mod_rewrite can be complicated, sure, but I do think it's the way to
go in this situation. You need:
- two sub-domains in DNS, let's say www.my_domain.com and secure.my_domain.com
- a sub-directory /secure in your webdocs root (or something else able to matched with 
a regex)
- the following in your httpd.conf:
Listen 80
Listen 443
NameVirtualHost 12.34.56.789:80
NameVirtualHost 12.34.56.789:443
<VirtualHost 12.34.56.789:80>

ServerName   www.my_domain.com
RewriteEngine   on
RewriteCond  %{REQUEST_URI}  /secure/
RewriteRule  ^/(.*)$   https://secure.my_domain.com/$1 [R,L]
</VirtualHost>

<VirtualHost 12.34.56.789:443>

ServerName   secure.my_domain.com
RewriteEngine   on
RewriteCond  %{REQUEST_URI}  !/secure
RewriteRule  ^/(.*)$   http://www.my_domain.com/$1 [R,L]
</VirtualHost>

This allows you to have relative links on all your pages. All links on
www.my_domain.com will point to http://www. on port 80, and all links on
secure.my_domain.com will point to https://secure. on port 443. The server
will simply rewrite and redirect all links that do not match either
/secure/ or !/secure.
Hope this helps,

- nick

PS If you have more than one domain needing to use https, you can put it
on an arbitrary port so long as you configure the server (not apache) to
listen on it, and then hard-code the port number in the mod_rewrite rule.



Re: [mp2] changing http:// to https: in TransHandler

2003-03-08 Thread beau
On 9 Mar 2003 at 10:53, Jason Galea wrote:

 sorry if OT..
 
 Hi Nick,
 
 please tell me I'm wrong (I'll be a happy camper), but I thought that you 
 couldn't use name virtual server for SSL.
 
 Name server requires HTTP/1.1 which supplies a Host header so the server can 
 tell which virtual server you want. With SSL this header is encrypted so 
 apache can't read it to know which virtual server it's for.
 
 Or does it work this way by defaulting to the first virtual server listening 
 on port 443?
 
 Or is Apache2 doing something funky to make this work?
 
 ..again, I really would like to be wrong about this. I host from home on ADSL 
 and thought I'd have to pay for more IP's if I wanted to secure a section of 
 my site.
 
 J
 
 
 Nick Tonkin wrote:
  [...]
  
  Beau:
  
  [...]
  
  mod_rewrite can be complicated, sure, but I do think it's the way to
  go in this situation. You need:
  
  - two sub-domains in DNS, let's say www.my_domain.com and secure.my_domain.com
  - a sub-directory /secure in your webdocs root (or something else able to be
  matched with a regex)
  - the following in your httpd.conf:
  
  Listen 80
  Listen 443
  NameVirtualHost 12.34.56.789:80
  NameVirtualHost 12.34.56.789:443
  
  <VirtualHost 12.34.56.789:80>
  
  ServerName   www.my_domain.com
  RewriteEngine   on
  RewriteCond  %{REQUEST_URI}  /secure/
  RewriteRule  ^/(.*)$   https://secure.my_domain.com/$1 [R,L]
  
  </VirtualHost>
  
  <VirtualHost 12.34.56.789:443>
  
  ServerName   secure.my_domain.com
  RewriteEngine   on
  RewriteCond  %{REQUEST_URI}  !/secure
  RewriteRule  ^/(.*)$   http://www.my_domain.com/$1 [R,L]
  
  </VirtualHost>
  
  This allows you to have relative links on all your pages. All links on
  www.my_domain.com will point to http://www. on port 80, and all links on
  secure.my_domain.com will point to https://secure. on port 443. The server
  will simply rewrite and redirect all links that do not match either
  /secure/ or !/secure.
  
  Hope this helps,
  
  - nick
  
  PS If you have more than one domain needing to use https, you can put it
  on an arbitrary port so long as you configure the server (not apache) to
  listen on it, and then hard-code the port number in the mod_rewrite rule.
  
 

I'm not Nick and you're wrong! :)

Just follow Nick's cookbook above, and it will work.
I put all of my non-global SSL directives within the
secure vhost block.

You may have to tweak it for your particular needs,
but, hey, that's fun anyway...

Aloha = Beau;



Re: [mp2] changing http:// to https: in TransHandler

2003-03-08 Thread Nick Tonkin
On Sun, 9 Mar 2003, Jason Galea wrote:

 sorry if OT..

Yes, it's OT. Please take SSL questions to an ssl-related list. Or, since
the previous post contained cut-n-paste instructions, you could have tried
it! :)

 please tell me I'm wrong (I'll be a happy camper), but I thought that you
 couldn't use name virtual server for SSL.

The basic answer to your question is that you only need unique IP-port
combinations to run multiple SSL virtual hosts using NameVirtualHost.
However, requests to any SSL virtual host other than the one running on
port 443 (the standard https port) will have to specify the port in the
request.

I suggest spending some time with the docs for mod_ssl, if that's what
you're using.


- nick

-- 


Nick Tonkin   {|8^)



[ANNOUNCE] HTTP-WebTest 2.01

2003-03-02 Thread Ilya Martynov
The URL

http://martynov.org/tgz/HTTP-WebTest-2.01.tar.gz

has entered CPAN as

  file: $CPAN/authors/id/I/IL/ILYAM/HTTP-WebTest-2.01.tar.gz
  size: 87180 bytes
   md5: d48ea08bd9bb7e4dca52d266f632672f


NAME
HTTP::WebTest - Testing static and dynamic web content

DESCRIPTION

This module runs tests on remote URLs containing
Perl/JSP/HTML/JavaScript/etc. and generates a detailed test report. This
module can be used as-is or its functionality can be extended using
plugins. Plugins can define test types and provide additional report
capabilities. This module comes with a set of default plugins but can be
easily extended with third party plugins.



CHANGES SINCE LAST STABLE VERSION 2.00:

ENHANCEMENTS:

* Port self-testing suite from Test to Test::More.

* HTTP::WebTest allows specifying a non-default parser for wtscript
files.

* Now HTTP::WebTest::Parser can generate wtscript files from test
parameters.

DEPRECATIONS:

* Subroutines 'start_webserver' and 'stop_webserver' were moved from
HTTP::WebTest::SelfTest to HTTP::WebTest::Utils.  They can still be
exported from HTTP::WebTest::SelfTest but their usage from this module
is deprecated.

BUG FIXES:

* New versions of LWP add a Client-Peer header to all responses, which
breaks one of the tests in the self-testing suite. This test is fixed so
it should pass with both new and old versions of LWP.  Thanks to
Johannes la Poutre for the bug report.

* A test in the self-testing suite for the 'timeout' parameter was buggy
and could fail on some machines.  Thanks to Johannes la Poutre for the
bug report.

* HTTP::WebTest::Plugin::HarnessReport produced report output on
STDERR, which added noise to 'make test' output when used in
Test::Harness-style test suites.


-- 
Ilya Martynov,  [EMAIL PROTECTED]
CTO IPonWEB (UK) Ltd
Quality Perl Programming and Unix Support
UK managed @ offshore prices - http://www.iponweb.net
Personal website - http://martynov.org



subrequest-run() doesn't send HTTP headers

2003-02-26 Thread rm
Hello list,


I'm trying to run a subrequest from within
a mod_perl content handler. The subrequest
is built from the request's 'lookup_file()'
method. Unfortunately, when I invoke the
'run()' method of the subrequest, no HTTP
headers are sent (even though the documentation
from 'Writing Apache Modules' claims that 'run()'

  ... will do
  everything a response handler is supposed to,
  including sending the HTTP headers and the
  document body.

  ... you must not send the HTTP header and
  document body yourself ...
Here's a short test case:

sub handler {
  my $req = shift;

  my $filename = "/tmp/sample.html";

  my $sub = $req->lookup_file($filename);
  my $status = $sub->status;
  my $ret;
  if ($status == 200)
    {
      $ret = $sub->run;
    }
  return $ret;
}

Furthermore, if the filename given to 'lookup_uri()'
points to a directory, $sub->status will return '200'
but '$sub->run()' will return '301' (which is o.k. since
the filename should end with a '/' ...).

Any ideas?

  Raf Mattes


Re: subrequest-run() doesn't send HTTP headers

2003-02-26 Thread Geoffrey Young


[EMAIL PROTECTED] wrote:
Hello list,

I'm trying to run a subrequest from within
a mod_perl content handler. The subrequest
is built from the request's 'lookup_file()'
method. Unfortunately, when I invoke the
'run()' method of the subrequest, no HTTP
headers are sent (even though the documentation
from 'Writing Apache Modules' claims that 'run()' 
  
  ... will do
  everything a response handler is supposed to, 
  including sending the HTTP headers and the 
  document body. 

  ... you must not send the HTTP header and 
  document body yourself ...
well, the Eagle book is a little out of date here

http://marc.theaimsgroup.com/?l=apache-modperlm=96687764724849w=2

subrequests do not include headers, so if you use $sub->run() to send the 
subrequest to the client you are required to send the headers yourself.

to alter this behavior, use $sub->run(1).

see Recipe 3.16 in the Cookbook, which is available for free from Sams:

http://www.samspublishing.com/catalog/article.asp?product_id={B95F1178-BE9D-43A8-8061-6E351400EF7F}

HTH

--Geoff

Here's a short test case:

sub handler {
  my $req = shift;
  my $filename = "/tmp/sample.html";

  my $sub = $req->lookup_file($filename);
  my $status = $sub->status;
  my $ret;
  if ($status == 200)
    {
      $ret = $sub->run;
    }
  return $ret;
}
Furthermore, if the filename given to 'lookup_uri()'
points to a directory, $sub->status will return '200'
but '$sub->run()' will return '301' (which is o.k. since
the filename should end with a '/' ...).
Any ideas?

  Raf Mattes




Re: subrequest-run() doesn't send HTTP headers

2003-02-26 Thread Geoffrey Young

to alter this behavior, use $sub->run(1).

see Recipe 3.16 in the Cookbook
whoops, that was supposed to be 3.15.

--Geoff



Re: subrequest-run() doesn't send HTTP headers

2003-02-26 Thread rm
On Wed, Feb 26, 2003 at 10:23:53AM -0500, Geoffrey Young wrote:
 
 
 
 well, the Eagle book is a little out of date here
 
 http://marc.theaimsgroup.com/?l=apache-modperlm=96687764724849w=2
 
 subrequests do not include headers, so if you use $sub->run() to send the 
 subrequest to the client you are required to send headers yourself.

which i can't, since i have no idea about the mime-type etc. of the
file ;-/

 to alter this behavior, use $sub->run(1).

Ah, thanks a _lot_, that did it. Now, the only question is: why isn't
that documented?

 see Recipe 3.16 in the Cookbook, which is available for free from Sams:
 
 http://www.samspublishing.com/catalog/article.asp?product_id={B95F1178-BE9D-43A8-8061-6E351400EF7F}

Nice doc-pointer.

 thanks a lot  

  Ralf Mattes



Re: subrequest-run() doesn't send HTTP headers

2003-02-26 Thread Geoffrey Young

subrequests do not include headers, so if you use $sub->run() to send the 
subrequest to the client you are required to send headers yourself.
which i can't, since i have no idea about the mime-type etc. of the
file ;-/
yes you do - the subrequest found out for you :)

$r->send_http_header($sub->content_type);

to alter this behavior, use $sub->run(1).
Ah, thanks a _lot_, that did it. Now, the only question is: why isn't
that documented?
um, it is documented... if you have the Cookbook :)

--Geoff
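
Putting the thread's advice together, a minimal sketch of a handler that serves a file via a subrequest (mod_perl 1.x API; the file path and error handling here are illustrative assumptions, not from the thread, and the code only runs inside Apache):

```perl
# Sketch only: assumes Apache 1.3 + mod_perl 1.x; not runnable standalone.
package My::SubreqHandler;
use strict;
use Apache::Constants qw(NOT_FOUND);

sub handler {
    my $r = shift;
    my $sub = $r->lookup_file('/tmp/sample.html');  # hypothetical path
    return NOT_FOUND unless $sub->status == 200;

    # Either let the subrequest send its own headers ...
    return $sub->run(1);

    # ... or send them yourself, using what the subrequest discovered:
    # $r->send_http_header($sub->content_type);
    # return $sub->run;
}
1;
```

Either branch avoids the headerless output the original poster saw.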



Re: subrequest-run() doesn't send HTTP headers

2003-02-26 Thread Stas Bekman

 to alter this behavior, use $sub->run(1).


Ah, thanks a _lot_, that did it. Now, the only question is: why isn't
that documented?
Please submit a documentation patch for this. It should be somewhere in 
http://perl.apache.org/docs/1.0/api/



__
Stas BekmanJAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com


501 - Protocol scheme 'http' is not supported

2003-02-12 Thread Db-Doc SP
Hi All,

We get the following error

501 Protocol scheme 'http' is not supported

when the perl script is executed from a PerlTransHandler

my $ua = LWP::UserAgent->new;
$ua->agent("MyApp/0.1");
my $req = HTTP::Request->new(GET => 'http://www.msn.com');
 $req->content_type('text/plain');
 $req->content('match=www&errors=0');
 my $res = $ua->request($req);

 if ($res->is_success)
 {
  print "Success.\n";
 }
 else
{
  print "Failed: ", $res->status_line, "\n";
}

please help

OS Sun solaris 8.
Apache 1.3.26
mod_perl 1.27
perl 5.005_03

Regards
  Harsha Yale





--

This e-mail may contain confidential and/or privileged information. If you are not the 
intended recipient (or have received this e-mail in error) please notify the sender 
immediately and destroy this e-mail. Any unauthorized copying, disclosure or 
distribution of the material in this e-mail is strictly forbidden.





reading cookies from mod_perl HTTP request handlers

2003-01-15 Thread Vishal Verma
Hi,

I'm unable to access/read cookies from incoming HTTP requests using
mod_perl HTTP request handlers.Here's what my relevant apache config
section looks like

<Location />
PerlHeaderParserHandler MyModule::header_parse_handler
</Location>


My browser already has a cookie named 'foo' with value 'bar' with path and
expire times set appropriately. Here's what my browser GET request looks
like:


GET /cgi-bin/login HTTP/1.1
Host: xx
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.2.1)
Gecko/20021130
Accept:
text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,video/x-mng,image/png,image/jpeg,image/gif;q=0.2,text/css,*/*;q=0.1
Accept-Language: en-us, en;q=0.50
Accept-Encoding: gzip, deflate, compress;q=0.9
Accept-Charset: ISO-8859-1, utf-8;q=0.66, *;q=0.66
Keep-Alive: 300
Connection: keep-alive
Cookie: foo=bar

The last line confirms that the browser is sending the cookie.

But, I'm not able to see that cookie when I print $ENV{'HTTP_COOKIE'}
within header_parse_handler. The mod_perl docs say that you can
examine request headers in the PerlHeaderParserHandler. Am I missing
something? Am I using the wrong handler? Please help.

thanks
-vish




Re: reading cookies from mod_perl HTTP request handlers

2003-01-15 Thread Geoffrey Young



But, I'm not able to see that cookie when I print $ENV{'HTTP_COOKIE'}
within header_parse_handler. 

%ENV is setup during the fixup phase, so it hasn't been populated yet.


mod_perl docs say that that you can
examine request headers in the PerlHeaderParserHandler.


yes, using something like $r->headers_in.  you're actually better off using 
Apache::Cookie, which is in the libapreq distribution on CPAN.

if you want to force %ENV to be setup earlier, try calling

$r->subprocess_env;

in a void context before checking %ENV - it used to segfault for me, but the 
docs say it should work.

HTH

--Geoff
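
A sketch of the two approaches Geoff mentions, reading the raw header and parsing it with Apache::Cookie (mod_perl 1.x API; the package name and logging are illustrative assumptions, and the code only runs inside Apache):

```perl
# Sketch only: assumes Apache 1.3 + mod_perl 1.x with libapreq installed.
package My::CookieHandler;
use strict;
use Apache::Constants qw(DECLINED);
use Apache::Cookie ();   # from the libapreq distribution on CPAN

sub handler {
    my $r = shift;

    # Raw header, available in any phase without touching %ENV:
    my $raw = $r->header_in('Cookie');

    # Parsed form via Apache::Cookie (returns a hash ref of cookie objects):
    my $cookies = Apache::Cookie->fetch;
    my $foo = $cookies->{foo} ? $cookies->{foo}->value : undef;

    $r->log_error("cookie foo=" . (defined $foo ? $foo : '<none>'));
    return DECLINED;   # let the request continue normally
}
1;
```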



Re: reading cookies from mod_perl HTTP request handlers

2003-01-15 Thread Vishal Verma
On Wed, 2003-01-15 at 14:24, Geoffrey Young wrote:

 if you want to force %ENV to be setup earlier, try calling
 
 $r->subprocess_env;
 
 in a void context before checking %ENV - it used to segfault for me, but the 
 docs say it should work.
 

This worked for me! Thanks a million!

-vish




HTTP

2003-01-11 Thread Beau E. Cox
Hi -

I am learning Apache/mod_perl with the 'Eagle' book,
and am writing a mini HTTP server in perl as a learning
aid. Everything was going peachy until I got to Authorization.

Thru my browser (I've used MS IE and Netscape on W2K and various Linux
browsers) I request a page served by my server; the request looks like
(from my log):

-  GET / HTTP/1.1
-  Accept: */*
-  Accept-Language: en-us
-  Accept-Encoding: gzip, deflate
-  User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0; .NET CLR
1.0.3705)
-  Host: 127.0.0.1:8223
-  Connection: Keep-Alive

I serve the following:

-  HTTP/1.1 401 Authorization Required
-  Date: Sat Jan 11 02:54:52 2003
-  Server: BC_HTTP_PlayServer/0.0.1 (MSWin32)
-  WWW-Authenticate: Basic realm="WallyWorld"
-  Content-Length: 167
-  Content-Type: text/html
-
-  <html>
-  <head>
-  <title>Authorization Required</title>
-  </head>
-  <body>
-  <h1>Jeff, the instructional HTTP Server</h1>
-  <hr width="100%"></hr>
-  <p>
-  Wrong!
-  </p>
-  </body>
-  </html>

All browsers seem to ignore the

-  WWW-Authenticate: Basic realm="WallyWorld"

header line and just display the page instead of popping
up the Authorization window and sending back the Authorize
header line. If I put Authorization
into a page served by Apache it works (of course).

I've read the RFCs again and again and I can't see what I'm missing.
Can any HTTP experts help?

Aloha = Beau.

PS: Sorry if this seems OT, but it is in my mod_perl learning path!





[modperl 1.27] Problem regarding Perl*Handler and the HTTP request line

2002-12-24 Thread corn



Hi all

Apologise if I have posted to the wrong list.
I am quite new to the Perl*Handlers. I am wondering if it is 
possible to write a handler which parses the very first header line, 
say..

 telnet localhost http
Trying 127.0.0.1
Connected to localhost.
Escape character is '^]'

[C] THIS_IS_NOT_A_USUAL_HTTP_HEADER\r\n
[S] YOUR_REQUEST_IS_UNDERSTOOD,_SEND_MORE_DETAIL\r\n
[C] Other: non-usual-http-headers-here\r\n
[S] balbalbal..closing connection\r\n
Connection closed by remote host.

where the line followed by [C] stands for the line sent by the client, and 
the line followed by [S] is the line sent by some perl module.

I would like to have the other GET/POST requests being processed like any 
other server does (i.e. if the request line is a usual GET/POST line, the 
handler bypasses it, otherwise the handler will make apache 'sends' the request 
to another perl module that knows how to handle this request).

I would like to know if this is possible? Thanks in advance.


Michael


[ANNOUNCE] HTTP-WebTest 2.00

2002-12-14 Thread Ilya Martynov
The URL

http://martynov.org/tgz/HTTP-WebTest-2.00.tar.gz

has entered CPAN as

  file: $CPAN/authors/id/I/IL/ILYAM/HTTP-WebTest-2.00.tar.gz
  size: 85858 bytes
   md5: e93464263f7cd321c8b43fa7c73604e0

NAME
HTTP::WebTest - Testing static and dynamic web content

DESCRIPTION

This module runs tests on remote URLs containing
Perl/JSP/HTML/JavaScript/etc. and generates a detailed test report. This
module can be used as-is or its functionality can be extended using
plugins. Plugins can define test types and provide additional report
capabilities. This module comes with a set of default plugins but can be
easily extended with third party plugins.



MAJOR CHANGES SINCE LAST STABLE VERSION 1.07:

* This is a full rewrite which features a modular, easily extensible
  architecture: new test types can be added with plugin modules.

* Support for Test::Harness style testing has been added.

* Many new standard test types have been added.

* Support for local file test mode has been removed from this version
  and will be implemented in a plugin which will be released separately.

* Licensing terms have been changed from Artistic only; HTTP-WebTest is
  now dual licensed under Artistic/GPL.

* Many other changes I forgot to mention :)



CHANGES SINCE LAST BETA VERSION 1.99_09:

ENHANCEMENTS:

* Allow plugins to insert tests into the test queue during test
sequence runtime.  Inspired by Paul Hamingson's patch.

* New core test parameter 'relative_urls' which enables HTTP-WebTest
to use relative URLs in tests.

* New core test parameter 'timeout' which allows control of the user agent
timeout settings while running a test sequence.

* Moved self-test suite support code into module
HTTP::WebTest::SelfTest to allow reusing it in self-test suites for
plugins maintained outside of HTTP-WebTest.

INCOMPATIBILITIES:

* The HTTP::WebTest::Plugin::Apache plugin has been removed from
HTTP::WebTest and will be released as an independent CPAN module.  It
will no longer be loaded by default even if it is available.

* Renamed all last_xxx methods to current_xxx since the new naming
scheme is less confusing.

* HTTP::WebTest::Plugin::HarnessReport has been rewritten using
Test::Builder. As a side effect you can now freely intermix
HTTP::WebTest based tests with tests written using other testing
libraries like Test::More or Test::Differences. Unfortunately this
change breaks existing test scripts which were using
HTTP::WebTest::Plugin::HarnessReport because the number of tests in
test scripts should now be declared explicitly with 'use Test::More plan =>
NN' or 'use Test::More qw(no_plan)'.

BUG FIXES:

* Fixed some minor documentation bugs.  Thanks to William McKee.

* Allow use of $webtest->last_xxx method calls after running a test
sequence with $webtest->run_tests.  Thanks to Kevin Baker for the patch.


-- 
Ilya Martynov,  [EMAIL PROTECTED]
CTO IPonWEB (UK) Ltd
Quality Perl Programming and Unix Support
UK managed @ offshore prices - http://www.iponweb.net
Personal website - http://martynov.org




Outdated link at http://perl.apache.org/products/products.html

2002-11-26 Thread Philip Mak
I couldn't find a contact address on the modperl website, so I'm
posting this here...

On http://perl.apache.org/products/products.html there is an outdated
link to mwForum. The new URL is: http://www.mwforum.org/



Re: Outdated link at http://perl.apache.org/products/products.html

2002-11-26 Thread Per Einar Ellefsen
At 11:35 26.11.2002, Philip Mak wrote:

I couldn't find a contact address on the modperl website, so I'm
posting this here...

On http://perl.apache.org/products/products.html there is an outdated
link to mwForum. The new URL is: http://www.mwforum.org/


Thank you Philip, it has been corrected.


--
Per Einar Ellefsen
[EMAIL PROTECTED]





Re: use http-equiv to refresh the page

2002-11-12 Thread Juha-Mikko Ahonen
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

On Wednesday 06 November 2002 15:19, Eric L. Brine wrote:
 HTML 4.01 also has a section on META and http-equiv. However, the
 only reference to refresh is: Note. Some user agents support the
 use of META to refresh the current page after a specified number of
 seconds, with the option of replacing it by a different URI. Authors
 should __not__ use this technique to forward users to different
 pages, as this makes the page inaccessible to some users. Instead,
 automatic page forwarding should be done using server-side
 redirects.

All the HTTP-EQUIV meta tags are relics from static HTML pages. The HTTP 
server (eg. Apache) reads meta tags from HTML pages and appends all 
meta tags with HTTP-EQUIV to outgoing HTTP headers. This feature is not 
available for dynamic content; dynamic pages must set their own 
headers.

 I'm guessing this is because http-equiv is designed to hold an HTTP
 header, but there is no such thing as an Refresh header in HTTP.

There is a Refresh header in HTTP. Its syntax is the same as for the 
HTTP-EQUIV meta tag.
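
For reference, the raw HTTP form of that header (the URL here is a placeholder) looks like:

```
Refresh: 0; url=http://www.example.com/next.html
```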

 So http-equiv=refresh is no longer standard. Of course, this is all
 theoretical. In practice, too many people are not easily swayed by a
 measily thing such as a standard.

UAs are not required to deal with meta tags which contain HTTP-EQUIV 
content. Both Internet Explorer (at least 6; v6+ may also support them 
in quirks mode) and Netscape/Mozilla should interpret them if the server 
does not include them in the real HTTP headers. Some UAs don't do this, 
e.g. Konqueror, and support for them might go away any time without 
notice, so it should not be relied upon.

The CGI module enables the programmer to add arbitrary HTTP headers with 
the syntax

header(
    -type    => 'text/html',
    -refresh => '0; url=http://www.address.com/'
);

See 'perldoc CGI' for more information.
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.0.6 (GNU/Linux)
Comment: For info see http://www.gnupg.org

iD8DBQE90OTjnksV4Ys/z5gRAhTxAJ9tv49KSvNt0JRbzf2Uws+wiOIM4ACfdzDX
zhojTxLkGITTQT8MkAVACIg=
=fQe4
-END PGP SIGNATURE-




Re: use http-equiv to refresh the page

2002-11-07 Thread mike808
 No, that's server push you're thinking of.  NPH (non-parsed header) 
 scripts are CGI scripts that talk directly to the client without the 
 server parsing headers and adding others (like the one that says it's 
 Apache).

My bad. It was. But I think one needs to use NPH scripts to generate
server push documents, IIRC. Which is why I was thinking about it.

 Normally, mod_cgi adds the response line and certain other 
 headers, so it parses your output.  This is the same as using mod_perl 
 with the PerlSendHeader option on.  NPH scripts behave like mod_perl with 
 PerlSendHeader off.

Trust me, you want to leave all the VMS, EBCDIC, and MSIE weirdness
related to sending headers in the right order, checking values, including
extras for broken browsers, and the all-important CRLF-CRLF header
separator, et al. to CGI.pm.

Mike808/

-
http://www.valuenet.net





Re: [OT] use http-equiv to refresh the page

2002-11-06 Thread fliptop
On Tue, 5 Nov 2002 at 22:52, Chris Shiflett opined:

[snip]
CS:The W3C's stance on refresh is the same for the header as well as the 
CS:meta tag: they did not originally intend for it to be used to specify a 
CS:*different* URL as a rudimentary method of redirection. They meant it to 
CS:be used to refresh the current resource only. However, this rogue 
CS:practice of redirection is quite common with both the header and the 
CS:meta tag and is very well supported by browsers. In fact, I am not aware 
CS:of any Web client that supports refresh but also limits the URL to the 
CS:current resource only.

i was bitten by this assumption recently.  case in point:

i needed to develop a way to display several images as a slideshow using
plain html files.  i would glob the images, and in each html file i
inserted a meta refresh that would load the next image in the series after
a 7 second delay.  since the html files were eventually going to be burned
to a cd, i had to point to each new file as such:

<meta http-equiv="refresh" content="7;file02.html">

because i cannot always assume to know the user's cd-rom drive
designation.  this worked fine in netscape and mozilla, but did not work
at all in internet explorer versions previous to 5.5.  in older versions
of ie, it simply refreshed the current page after the 7 second delay, no
matter what was put after the semicolon in the content attribute.  so i
had to include instructions for the users that if they used internet
explorer, they must upgrade to at least version 5.5 for the slideshow to
work.  of course, i had tested the app on ie 5.5, so i didn't discover
this myself until a user contacted me and complained the slideshow wasn't
working.

and you'd be surprised how many old versions of ie are being used out 
there.




RE: use http-equiv to refresh the page

2002-11-06 Thread Eric L. Brine

 I just wanted to mention that the meta tag as well as its http-equiv
 attribute are both official parts of the HTML standard and have been for
 quite some time.

Yes and no.

HTML 4.0 has a section on META and http-equiv. In it, it mentions that
"Some user agents support the use of META to refresh the current page after
a specified number of seconds, with the option of replacing it by a
different URI." and proceeds with an example. That sounds more advisory than
part of the standard. But for the sake of argument, let's say it's part of
the standard, and check what HTML 4.01 has to say.

HTML 4.01 also has a section on META and http-equiv. However, the only
reference to "refresh" is: "Note. Some user agents support the use of META
to refresh the current page after a specified number of seconds, with the
option of replacing it by a different URI. Authors should __not__ use this
technique to forward users to different pages, as this makes the page
inaccessible to some users. Instead, automatic page forwarding should be
done using server-side redirects."

I'm guessing this is because http-equiv is designed to hold an HTTP header,
but there is no such thing as a "Refresh" header in HTTP.

So http-equiv="refresh" is no longer standard. Of course, this is all
theoretical. In practice, too many people are not easily swayed by a measly
thing such as a standard.

--
Eric L. Brine   | ICQ: 4629314
[EMAIL PROTECTED] | MSN: [EMAIL PROTECTED]
http://www.adaelis.com/ | AIM: ikegamiii




Re: use http-equiv to refresh the page

2002-11-06 Thread Mithun Bhattacharya

--- Perrin Harkins [EMAIL PROTECTED] wrote:

 I might be overzealous about this, but I dislike seeing HTTP-EQUIV
 meta 
 tags used when actual HTTP headers are available to do the same
 thing. 
  It's fine if there's a reason for it, but usually people do it
 because 
 they don't realize they can just send a real header instead..


So what is the recommended way of doing wait pages? Sending a 302
definitely won't show the user anything other than all that text
changing in the status bar.



Mithun

__
Do you Yahoo!?
HotJobs - Search new jobs daily now
http://hotjobs.yahoo.com/



RE: [OT] use http-equiv to refresh the page

2002-11-06 Thread Alessandro Forghieri
Greetings.

[...]
 [snip]
 CS:The W3C's stance on refresh is the same for the header as 
 well as the 
 CS:meta tag: they did not originally intend for it to be used 
 to specify a 
 CS:*different* URL as a rudimentary method of redirection. 
[...]
 i was bitten by this assumption recently.  case in point:
 
 i needed to develop a way to display several images as a 
 slideshow using
 plain html files.
[..horror story clipped]

But, in fact, redirects - either implicit or explicit - have many ways of
biting the unwary (and curiously, or perhaps not, IE always a key role).

Consider MS KB Article Q160013:

If a CGI application returns a MIME Type that is unknown or not associated
to Internet Explorer internally, Internet Explorer makes two POST requests
to the server.

What this means is that (for instance), sending a PDF file as the result of
a POST request
may cause the following sequence of events:

1) the file is downloaded
2) it is removed from the disk cache as the second POST request goes out
3) Acroread is launched and then says No such file.

this bug is active on many, many versions of IE. It happens if you either
send the file directly OR issue a redirect. 

The only workaround I could find was a meta-http-refresh. And then I found
out that using '0' as a refresh time won't work on Mozilla (who tries to
refressh the *current* page every 0 seconds and gets stuck in a loop- not
nice).

So what's a poor programmer to do, caught between standards and arguably
buggy browsers?

Cheers,
alf

P.S. Anybody knows of a better solution to Q160013, I'd like very much to
hear about it...TIA.



RE: use http-equiv to refresh the page

2002-11-06 Thread Chris Shiflett

  I just wanted to mention that the meta tag as well as its http-equiv
  attribute are both official parts of the HTML standard and have been
  for quite some time.

 Yes and no.

Well, I disagree with the no. I will explain it again below.

 HTML 4.0 has a section on META and http-equiv. In it, it mentions that
 "Some user agents support the use of META to refresh the current page
 after a specified number of seconds, with the option of replacing it by a
 different URI." and proceeds with an example. That sounds more advisory
 than part of the standard. But for the sake of argument, let's say it's part
 of the standard, and check what HTML 4.01 has to say.

 HTML 4.01 also has a section on META and http-equiv. However, the only
 reference to "refresh" is: "Note. Some user agents support the use of
 META to refresh the current page after a specified number of seconds, with
 the option of replacing it by a different URI. Authors should __not__ use
 this technique to forward users to different pages, as this makes the page
 inaccessible to some users. Instead, automatic page forwarding should
 be done using server-side redirects."

 I'm guessing this is because http-equiv is designed to hold an HTTP
 header, but there is no such thing as a "Refresh" header in HTTP.

No, there is an HTTP header called Refresh, and it is standard. The meta tag and the 
http-equiv attribute of the meta tag are also standard. However, some people seem to 
be confusing HTTP and HTML here for some reason. Refresh is an HTTP standard, while 
the meta tag is HTML. The http-equiv attribute of the meta tag allows some HTTP 
headers to be specified in the HTML. While this feature offers little to mod_perl 
developers who can manipulate the headers themselves anyway, it was historically very 
helpful to developers for providing accurate HTTP headers such as Expires when they 
could not otherwise do this.

The reason for that warning in the HTML specification is due to what the W3C likely 
considers a rampant abuse of the Refresh header, which was not intended for redirection 
but only for refreshing the current resource. They are not warning against Refresh 
alone but rather what they consider a misuse of Refresh. The key phrase is, "with the 
option of replacing it by a different URI." This is what is frowned upon, not the 
meta HTML tag nor the Refresh HTTP header.

 So http-equiv="refresh" is no longer standard. Of course, this is all
 theoretical. In practice, too many people are not easily swayed by a
 measly thing such as a standard.

Right, and this was my second point in an earlier message. Support for this rogue 
feature is pretty widespread, though it should not be completely trusted. As one of 
the other posters pointed out, there are Web clients that do not support the use of a 
meta tag for redirection, but many (possibly most) do. It is quite common to see the 
use of a meta tag for redirection accompanied by instructions on the screen and a link 
for users that are not automatically redirected. By accommodating the users who are not 
automatically redirected, you can eliminate the possibility of a dead-end.

Of course, I hope that mod_perl developers always choose manipulating the real HTTP 
headers over the use of the http-equiv attribute of the meta tag. Also, it seems 
possible that there might be much wider support for redirection with the real Refresh 
HTTP header than for the meta tag equivalent. I know of at least one attempt to test 
and document support for this specific use:

http://www.hixie.ch/tests/evil/mixed/refresh1.http.html

Perhaps the results of this test can help a developer determine whether this misuse of 
the Refresh header is appropriate for a certain situation.

Chris



Re: use http-equiv to refresh the page

2002-11-06 Thread mike808
On the use of META REFRESH tags, Chris wrote:
 It is also the only option for the pause, then redirect behavior the 
 original poster desired that I can think of.

I also seem to recall reading in the HTTP spec (and in Lincoln's CGI.pm code)
that the use of a Redirect header in response to a POST request was
specifically verboten. But, as was noted, everyone does it anyway and it works.

Weiqi really needs to look at his apache logs, try running his CGI from
the command line to see the exact output the browser sees. Of course,
he'll have to manually perform the redirect :=)

Use the lwp-request tools (GET and POST) to get live interaction with the
webserver in question if running the CGI is not possible.

Also, NPH is only implemented in the NS browsers, and was a way for a webserver
to send multiple documents inline down to a browser, and was an ancient way
to write status pages and such that automagically refreshed themselves.
It was originally used as a primitive way to animate images. IE will display
these pages as a single never-ending document separated by the headers.

If you're running a long-running CGI and need the browser to keep
the connection alive, you need to periodically (2 minutes for IE, but is 
browser specific) send something to the browser - like a waiting  page
that dribbles out whitespace until the document is ready. There are more
complicated ways to do this as well, and that technique is dated to modern
web users.

In any case, I don't think the issue is a mod_perl one, but rather a
CGI.pm one.

BTW, Weiqi - there is a stlouis.pm perlmongers list.

Mike808/

-
http://www.valuenet.net





Re: use http-equiv to refresh the page

2002-11-06 Thread Perrin Harkins
[EMAIL PROTECTED] wrote:


Also, NPH is only implemented in the NS browsers, and was a way for a webserver
to send multiple documents inline down to a browser, and was an ancient way
to write status pages and such that automagically refreshed themselves.



No, that's server push you're thinking of.  NPH (non-parsed header) 
scripts are CGI scripts that talk directly to the client without the 
server parsing headers and adding others (like the one that says it's 
Apache). Normally, mod_cgi adds the response line and certain other 
headers, so it parses your output.  This is the same as using mod_perl 
with the PerlSendHeader option on.  NPH scripts behave like mod_perl with 
PerlSendHeader off.

- Perrin
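
A minimal NPH-style CGI script, to make Perrin's distinction concrete (the nph- filename prefix is the conventional way to tell the server not to parse the output; the body text is illustrative):

```perl
#!/usr/bin/perl
# nph-demo.cgi - an NPH script emits the complete HTTP response itself,
# including the status line, because the server won't add one for it.
use strict;

print "HTTP/1.0 200 OK\r\n";
print "Content-Type: text/plain\r\n";
print "\r\n";
print "Hello from an NPH script\n";
```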



use http-equiv to refresh the page

2002-11-05 Thread Wei Gao



Hi, this is not a mod_perl question, but rather a 
question to see if I have to use mod_perl to achieve this.

In my perl program executing in Apache web server, I 
have the following code:

use CGI ;

$query = new CGI ;
$url = "http://www.mycite.com";  # The URL to refresh.

print $query->header(-status => '200 Ok', -type => 'text/html');
print "<html><head><meta http-equiv=\"Refresh\" content=\"0;URL=$url\" /></head></html>";

However, when I tried to display it in Internet 
Explorer, I got an empty page, instead of being redirected to the URL I 
specified. The same code works fine with ActiveState in IIS. Is this an issue 
with Apache? How can I make this work? By using mod_perl? I am currently using 
Perl 5.6.1.

Thanks for any comments.

Wei Gao




Re: use http-equiv to refresh the page

2002-11-05 Thread Perrin Harkins
Wei Gao wrote:


In my perl program executing in Apache web server, I have the 
following code:
 
use CGI ;
 
$query = new CGI ;
$url = "http://www.mycite.com";  # The url to refresh.
 
 print $query->header(-status => '200 Ok', -type => 'text/html');
 print "<html><head><meta http-equiv=\"Refresh\" 
content=\"0;URL=$url\" /></head></html>";


Uh, that's not a redirect; that's an ugly proprietary hack.  You should 
be using standard HTTP redirects.  See 
http://search.cpan.org/author/JHI/perl-5.8.0/lib/CGI.pm#GENERATING_A_REDIRECTION_HEADER 
for more.

- Perrin
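
The standard redirect Perrin points to is a one-liner with CGI.pm (the target URL is a placeholder):

```perl
use strict;
use CGI;

my $q = CGI->new;
# Emits a "Status: 302 Found" response with a Location header,
# instead of the meta-refresh hack.
print $q->redirect('http://www.mycite.com/');
```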



Re: use http-equiv to refresh the page

2002-11-05 Thread Wei Gao



Thanks.

I have tried "print $query->redirect('http://somewhere.else/in/movie/land');" 
before, which works fine as to redirect the user to the web page. However, 
if the user then tries to refresh this page, the CGI script is called again 
without any params, which results in "Internal Server Error". So, the goal I 
want to achieve is that the user can refresh the page I returned without 
getting an error. It should refresh the web page, not call my CGI script again.

I also tried "print $query->redirect(-uri => 'http://somewhere.else/in/movie/land', 
-nph => 1);" moments ago, which generated an "Internal Server Error" in the IE 
window. Is using a meta tag a "bad" 
approach? I thought this is a way to solve my situation here.

Wei

- Original Message - 
  From: Perrin Harkins
  To: Wei Gao
  Cc: [EMAIL PROTECTED]
  Sent: Tuesday, November 05, 2002 2:50 PM
  Subject: Re: use http-equiv to refresh the page

  Wei Gao wrote:
   In my perl program executing in Apache web server, I have the
   following code:
  
   use CGI ;
  
   $query = new CGI ;
   $url = "http://www.mycite.com";  # The url to refresh.
  
    print $query->header(-status => '200 Ok', -type => 'text/html');
    print "<html><head><meta http-equiv=\"Refresh\"
   content=\"0;URL=$url\" /></head></html>";

  Uh, that's not a redirect; that's an ugly proprietary hack. You should
  be using standard HTTP redirects. See
  http://search.cpan.org/author/JHI/perl-5.8.0/lib/CGI.pm#GENERATING_A_REDIRECTION_HEADER
  for more.

  - Perrin


Re: use http-equiv to refresh the page

2002-11-05 Thread wsheldah

Any time you see an Internal Server Error, you should be looking in your
apache server's error_log file to see what exactly the error was. That will
help you (and the list) figure out what's going wrong.

Wes




Wei Gao [EMAIL PROTECTED] on 11/05/2002 06:10:34 PM

To:Perrin Harkins [EMAIL PROTECTED]
cc:[EMAIL PROTECTED]
Subject:Re: use http-equiv to refresh the page


Thanks.

I have tried print $query->redirect('http://somewhere.else/in/movie/land')
; before, which works fine as to redirect the user to the web page.
However, if the user then tries to refresh this page, the CGI script is
called again without any params, which results in Internal Server Error.
So, the goal I want to achieve is that the user can refresh the page I
returned without getting an error. It should refresh the web page, not
call my CGI script again.

I also tried print $query->redirect(-uri
=> 'http://somewhere.else/in/movie/land', -nph => 1); moments ago, which
generated an Internal Server Error in the IE window.

Is using meta tag a bad approach? I thought this is a way to solve my
situation here.

Wei

- Original Message -
  From: Perrin Harkins
  To: Wei Gao
  Cc: [EMAIL PROTECTED]
  Sent: Tuesday, November 05, 2002 2:50 PM
  Subject: Re: use http-equiv to refresh the page


  Wei Gao wrote:

   In my perl program executing in Apache web server, I have the
   following code:
  
   use CGI ;
  
   $query = new CGI ;
   $url = "http://www.mycite.com";  # The url to refresh.
  
    print $query->header(-status => '200 Ok', -type => 'text/html');
    print "<html><head><meta http-equiv=\"Refresh\"
   content=\"0;URL=$url\" /></head></html>";


  Uh, that's not a redirect; that's an ugly proprietary hack.  You should
  be using standard HTTP redirects.  See
  
http://search.cpan.org/author/JHI/perl-5.8.0/lib/CGI.pm#GENERATING_A_REDIRECTION_HEADER

  for more.

  - Perrin


(See attached file: C.htm)









Re: use http-equiv to refresh the page

2002-11-05 Thread Perrin Harkins
Wei Gao wrote:


I have tried print 
$query->redirect('http://somewhere.else/in/movie/land') ; before, 
which works fine as to redirect the user to the web page. However, if 
the user then tries to refresh this page, the CGI script is called 
again without any params, which result in Internal Server Error.


You lost me.  If you redirect the user to http://mycite.com/, and then 
the user reloads, he should be reloading http://mycite.com/.  I don't 
see any reason why that wouldn't work.  Are you saying that reload in IE 
goes back to the URL that issued the redirect and reloads that?  Even if 
it does, it should still be submitting the query string or POST data, 
although the user may get a pop-up asking if he wants to submit POST 
data again.

Is using meta tag a bad approach?



Yes.  It's something that Netscape added to their browser, which others 
may or may not add to their browsers.  It's not part of any HTTP spec 
and isn't guaranteed to work, even on totally correct web browsers.

- Perrin




Re: use http-equiv to refresh the page

2002-11-05 Thread Wei Gao



Thanks for the reminder. I think the reason that "print 
$query->redirect(-uri=>'http://www.mysite.com', -nph=>1);" is not 
working, is because my program doesn't seem to know how to handle "nph". I am 
using Apache 1.3.26 and Perl 5.6.1. I have 
"use CGI qw(:standard -nph) ;" at the beginning of the script. However, 
when I tried to use nph, the server complains about "Bad Header". 

Is there any known issues that the versions I use don't 
support nph? Am I missing something?

Thanks.

Wei 

  - Original Message - 
  From: [EMAIL PROTECTED]
  To: Wei Gao 
  Cc: [EMAIL PROTECTED] 
  Sent: Tuesday, November 05, 2002 3:19 PM
  Subject: Re: use http-equiv to refresh the page

  Any time you see an Internal Server Error, you should be looking in your
  apache server's error_log file to see what exactly the error was. That will
  help you (and the list) figure out what's going wrong.

  Wes


Re: use http-equiv to refresh the page

2002-11-05 Thread Chris Shiflett

  Is using meta tag a "bad" approach?

 Yes.  It's something that Netscape added to their browser, which others 
 may or may not add to their browsers.  It's not part of any HTTP spec 
 and isn't guaranteed to work, even on totally correct web browsers.

A meta tag is not something unique to Netscape nor the least bit uncommon. It is 
supported by all major Web browsers and has been for quite some time. While its use 
may be discouraged over a protocol-level redirect at times, it is appropriate for some 
situations and appears to be appropriate for what the original poster is trying to 
accomplish.

As with any other HTML tag, the meta tag does not need to be part of an HTTP 
specification in order to be valid. Also, it is guaranteed to work on any compliant 
Web browser. HTML has its own specification, and the latest version describes the meta 
tag here:

http://www.w3.org/TR/html4/struct/global.html#h-7.4.4.2

Sorry for disagreeing like this, but I am always afraid to see people being given 
incorrect information that might cause them difficulty. I hope this helps clarify.

Chris



Re: use http-equiv to refresh the page

2002-11-05 Thread Perrin Harkins
Chris Shiflett wrote:


A meta tag is not something unique to Netscape



I said it was added by Netscape, and I'm pretty sure it was, back in 1.1 
or 2.0.

As with any other HTML tag, the meta tag does not need to be part of an 
HTTP specification in order to be valid. Also, it is guaranteed to work 
on any compliant Web browser. HTML has its own specification, and the 
latest version describes the meta tag here:



http://www.w3.org/TR/html4/struct/global.html#h-7.4.4.2



Look a little further down that page:

*Note.* Some user agents support the use of META 
(http://www.w3.org/TR/html4/struct/global.html#edef-META) to refresh the 
current page after a specified number of seconds, with the option of 
replacing it by a different URI. Authors should *not* use this technique 
to forward users to different pages, as this makes the page inaccessible 
to some users. Instead, automatic page forwarding should be done using 
server-side redirects.

I might be overzealous about this, but I dislike seeing HTTP-EQUIV meta 
tags used when actual HTTP headers are available to do the same thing. 
It's fine if there's a reason for it, but usually people do it because 
they don't realize they can just send a real header instead.

- Perrin



Re: use http-equiv to refresh the page

2002-11-05 Thread Chris Shiflett
Perrin Harkins wrote:


Chris Shiflett wrote:


http://www.w3.org/TR/html4/struct/global.html#h-7.4.4.2


Look a little further down that page:

*Note.* Some user agents support the use of META 
(http://www.w3.org/TR/html4/struct/global.html#edef-META) to refresh 
the current page after a specified number of seconds, with the option 
of replacing it by a different URI. Authors should *not* use this 
technique to forward users to different pages, as this makes the page 
inaccessible to some users. Instead, automatic page forwarding should 
be done using server-side redirects.

I might be overzealous about this, but I dislike seeing HTTP-EQUIV 
meta tags used when actual HTTP headers are available to do the same 
thing. It's fine if there's a reason for it, but usually people do it 
because they don't realize they can just send a real header instead..


No, I actually agree with you completely on that last bit and am of the 
opinion that using the http-equiv attribute is a leftover habit from the 
early days of Web development when manipulating HTTP headers was not as 
convenient as it is now.

I just wanted to mention that the meta tag as well as its http-equiv 
attribute are both official parts of the HTML standard and have been for 
quite some time. Netscape also introduced things like cookies and SSL, 
but that should in no way discredit the technology.

The W3C's stance on refresh is the same for the header as well as the 
meta tag: they did not originally intend for it to be used to specify a 
*different* URL as a rudimentary method of redirection. They meant it to 
be used to refresh the current resource only. However, this rogue 
practice of redirection is quite common with both the header and the 
meta tag and is very well supported by browsers. In fact, I am not aware 
of any Web client that supports refresh but also limits the URL to the 
current resource only.

It is also the only option for the pause, then redirect behavior the 
original poster desired that I can think of.

Chris
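The "pause, then redirect" behaviour Chris describes can be sketched with both the (non-standard but widely supported) Refresh header and the equivalent meta tag in the body; the delay and URL below are placeholders:

```perl
use strict;
use warnings;

# Build a response that shows a page for $seconds seconds, then sends
# the browser to $url via the non-standard Refresh mechanism discussed
# in this thread. Delay and URL are placeholders.
sub refresh_response {
    my ($seconds, $url) = @_;
    my $head = "Content-Type: text/html\r\nRefresh: $seconds; url=$url\r\n\r\n";
    my $body = qq{<html><head><meta http-equiv="Refresh" }
             . qq{content="$seconds;URL=$url"></head>}
             . qq{<body>Redirecting...</body></html>};
    return $head . $body;
}

print refresh_response(5, 'http://www.mycite.com/');
```

A zero-second delay (as in the original post) makes this an immediate redirect, which is exactly the case where a real 302 is preferable.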



Re: use http-equiv to refresh the page

2002-11-05 Thread Perrin Harkins
Wei Gao wrote:


Thanks for the reminder.  I think the reason that "print 
$query->redirect(-uri=>'http://www.mysite.com', -nph=>1);" is not 
working, is because my program doesn't seem to know how to handle 
"nph". I am using Apache 1.3.26 and Perl 5.6.1. I have
"use CGI qw(:standard -nph) ;" at the beginning of the script. 
However, when I tried to use nph, the server complains about "Bad 
Header".
 
Is there any known issues that the versions I use don't support nph? 
Am I missing something?


I don't think NPH is related to the problem you're having, but Apache 
determines if a script is NPH by looking at the prefix of the file.  Try 
naming your script nph-something.cgi and it should support NPH.  This 
is not very well documented, unfortunately.
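To illustrate the difference: with NPH, Apache passes the script's output through untouched, so the script must print the raw HTTP status line itself instead of a "Status:" header. A rough sketch (the body is a placeholder):

```perl
use strict;
use warnings;

# What an NPH ("non-parsed headers") script must emit itself: a full
# HTTP status line plus headers, since Apache does no header processing.
# A normal CGI script would print "Status: 200 OK" instead.
sub nph_response {
    my ($body) = @_;
    return "HTTP/1.0 200 OK\r\n"
         . "Content-Type: text/html\r\n"
         . "Content-Length: " . length($body) . "\r\n\r\n"
         . $body;
}

print nph_response("<html><body>ok</body></html>");
```

If a script prints a raw status line like this *without* being named `nph-*`, Apache treats it as a malformed header, which is one common source of "Bad Header" errors.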

You don't need to use an NPH script to make redirection work.  It's also 
still not clear to me what isn't working for you.  When I get redirected 
in IE and then reload the page, it reloads the page I was redirected to.

Since this is all getting pretty far off topic for the mod_perl list, 
you might want to try asking on a CGI-specific list or on 
http://perlmonks.org/.

- Perrin



Re: use http-equiv to refresh the page

2002-11-05 Thread Perrin Harkins
Chris Shiflett wrote:


I just wanted to mention that the meta tag as well as its http-equiv 
attribute are both official parts of the HTML standard and have been 
for quite some time. Netscape also introduced things like cookies and 
SSL, but that should in no way discredit the technology.


I'm just bitter about Netscape because I worked at a company that made 
me use frames and JavaScript when the 2.0 version came out.

It is also the only option for the pause, then redirect behavior the 
original poster desired that I can think of.


It is the only way I know to do that, but I didn't think that's what he 
was trying to do.  He had a wait time of 0 in his example.

- Perrin



[newbie] How do I send a custom HTTP::Response?

2002-10-23 Thread Chris Pizzo

The documentation tells me how to create a new response object but how do I
reply to a request using my custom response?

Thanks,
Chris




Re: [newbie] How do I send a custom HTTP::Response?

2002-10-23 Thread Perrin Harkins
Chris Pizzo wrote:

The documentation tells me how to create a new response object but how do I
reply to a request using my custom response?


HTTP::Response?  That's an LWP thing, not a mod_perl thing.  Maybe 
you're a little confused here?  Tell us what you're trying to do.

- Perrin



Re: [newbie] How do I send a custom HTTP::Response?

2002-10-23 Thread Chris Pizzo
OK,  I am getting a request from a server and I need to respond with an XML
doc.


- Original Message -
From: Perrin Harkins [EMAIL PROTECTED]
To: Chris Pizzo [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Wednesday, October 23, 2002 12:54 PM
Subject: Re: [newbie] How do I send a custom HTTP::Response?


 Chris Pizzo wrote:
  The documentation tells me how to create a new response object but how
do I
  reply to a request using my custom response?

 HTTP::Response?  That's an LWP thing, not a mod_perl thing.  Maybe
 you're a little confused here?  Tell us what you're trying to do.

 - Perrin






Re: [newbie] How do I send a custom HTTP::Response?

2002-10-23 Thread Perrin Harkins
Chris Pizzo wrote:

OK,  I am getting a request from a server and I need to respond with an XML
doc.


So your mod_perl handler is getting an HTTP request and you want to send 
back an XML document?  No problem, just send it.  Set the content-type 
if you need to.  There's no trick to it.  See the mailing list archives 
for more on this.

- Perrin
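A rough sketch of what such a mod_perl 1.x content handler might look like — the package name and XML payload here are invented for illustration only:

```perl
# Hypothetical mod_perl 1.x content handler that replies with an XML
# document. Package name and payload are made up for illustration.
package My::XMLResponse;
use strict;

sub xml_body {
    return qq{<?xml version="1.0"?>\n<response status="ok"/>\n};
}

sub handler {
    my $r = shift;                 # the Apache request object
    $r->content_type('text/xml');  # set the content-type, as Perrin says
    $r->send_http_header;
    $r->print(xml_body());
    return 0;                      # Apache::Constants::OK
}

1;
```

As Perrin notes, there is no trick to it: set the content-type, send the headers, print the document.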



Terminating an HTTP process

2002-07-23 Thread Mark Ridgway

Hi All,

We're running Mason 1.04 (Apache/1.3.22 (Solaris) mod_perl/1.26
mod_ssl/2.8.5 OpenSSL/0.9.6b) which connects to a number of Oracle
databases, so we are using Apache::DBI for connection pooling.

As we understand it, each process has its own connection pool that
lasts for the life of that process (i.e. each child collects its own
set of DB handles until it dies).

Whilst this is normally not an issue, when we experience network
problems, the number of available connections on some crucial databases
can quickly run out, with many open connections sitting idle in the
pool of various clients using other DBs, etc.

What we'd like to do is send some kind of quit signal to the process
so that it finishes that transaction and dies, instead of waiting until
MaxRequestsPerChild.  This will ensure that DB handles for this
particular DB will not be idly pooled, but instead constantly in use.

Is there a command to do this? (e.g. like 'abort', but one that
completes the transaction successfully, and kills the child process).
(e.g. $r->die() :-)

Also, does anyone know how to get the current RequestsPerChild
counter (that MaxRequestsPerChild evaluates against)?


Thanks!

Mark





Re: Terminating an HTTP process

2002-07-23 Thread Enrico Sorcinelli

On Tue, 23 Jul 2002 18:58:14 +1000
Mark Ridgway [EMAIL PROTECTED] wrote:

 Hi All,
 
 We're running Mason 1.04 (Apache/1.3.22 (Solaris) mod_perl/1.26
 mod_ssl/2.8.5 OpenSSL/0.9.6b) which connects to a number of Oracle
 databases, so we are using Apache::DBI for connection pooling.
 
 As we understand it, each process has its own connection pool that
 lasts for the life of that process (i.e. each child collects its own
 set of DB handles until it dies).
 
 Whilst this is normally not an issue, when we experience network
 problems, the number of available connections on some crucial databases
 can quickly run out, which many open connections sitting idle in the
 pool of various clients using other DBs, etc.
 
 What we'd like to do is send some kind of quit signal to the process
 so that it finishes that transaction and dies, instead of waiting until
 MaxRequestsPerChild.  This will ensure that DB handles for this
 particular DB will not be idly pooled, but instead constantly in use.
 
 Is there a command to do this? (e.g. like 'abort', but one that
 completes the transaction successfully, and kills the child process).
 (e.g. $r->die() :-)
 
 Also, does anyone know how to get the current RequestsPerChild
 counter (that MaxRequestsPerChild evaluates against)?
 

 


Hi Mark,
if you don't need persistent connections for all Oracle DBs
you can try to use some nonpersistent connections by setting
'dbi_connect_method' property to 'connect' in DBI connect
hash options:

  my $dbh = DBI->connect('dbi:...', ... , {'dbi_connect_method' => 'connect'});

However I think that the right solution is a connection pooling server 
like SQLRelay (it works very well with Oracle, MySQL, PostgreSQL, DB2, etc.)

Bye

Enrico

=
Enrico Sorcinelli - Gruppo E-Comm - Italia On Line S.p.a.
E-Mail: [EMAIL PROTECTED] - [EMAIL PROTECTED]
=



Re: Optional HTTP Authentication ?

2002-07-01 Thread Jean-Michel Hiver

 However, if the structure were
 
 http://bigmegamarket.com/index.pl/56765454151/grocery/fruits/bananas,
 say, with the number being the session ID, the URL then is hackable
 within that (good) definition.

Yes, however there are quite a number of issues with bookmarks and
search engines... But that's for sure another interesting and less-ugly
option.


 I'm a big fan of cookies myself, for the thing they were made for,
 namely session tracking.  I share your frustration :-(.

Yep. It's a shame that cookies, which were a good idea at first, get such
a bad name because all these moronic marketing companies, which dream
of knowing you inside out to send you more spam, abuse them. But I'm
off topic here :-)

Cheers,
-- 
IT'S TIME FOR A DIFFERENT KIND OF WEB

  Jean-Michel Hiver - Software Director
  [EMAIL PROTECTED]
  +44 (0)114 255 8097

  VISIT HTTP://WWW.MKDOC.COM



Re: Optional HTTP Authentication ?

2002-07-01 Thread Jean-Michel Hiver

  browser sent the credentials, or leave $ENV{REMOTE_USER} undef
  otherwise, without sending a 401 back.
 
 I didn't think a browser would send authentication unless the server
 requested it for an authentication domain.  How are you going to 
 get some people to send the credentials and some not unless you
 use different URLs so the server knows when to request them?

The idea is that on a location which requires authentication I'll
redirect the user to a /login.html, or maybe a /?login=1 which will do
the following:

IF user is authenticated = redirect to location it came from
ELSE send 401 authorization required

This way users should get a login box strictly when necessary. Almost
all the requests go thru an Apache::Registry friendly CGI script:

Alias /.static /opt/chico/static
Alias / /opt/mkd/cgi/mkdoc.cgi/

Everything is treated using $ENV{PATH_INFO} in the script, and the
script knows when something needs authentication or not.
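The IF/ELSE flow above can be sketched as a small helper that either bounces an authenticated user back where they came from or challenges for credentials — realm name and URLs below are placeholders, not the actual MKDoc code:

```perl
use strict;
use warnings;

# Sketch of the /login.html logic described above: redirect back to the
# originating location if authenticated, otherwise send a 401 so the
# browser pops up a login box. Realm and paths are placeholders.
sub login_response {
    my ($authenticated, $from) = @_;
    if ($authenticated) {
        return "Status: 302 Found\r\nLocation: $from\r\n\r\n";
    }
    return "Status: 401 Authorization Required\r\n"
         . "WWW-Authenticate: Basic realm=\"MKDoc Login\"\r\n\r\n";
}
```

Once the browser has answered the 401 on /login.html, it will keep sending the same credentials on subsequent requests, which is what makes the optional-authentication trick work.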


 Note that you don't have to embed session info here, just add
 some element to the URL that serves as the point where you
 request credentials and omit it for people that don't log in.  Or
 redirect to a different vhost that always requires authentication but
 serves the same data.

Oh but I have that already. I know that I need to password protect

/properties.html
/content.html
/move.html
/foo/properties.html
/foo/content.html
/foo/move.html
etc...

Is it possible to password-protect a class of URIs using regexes? That
would be another good option.

Cheers,
-- 
IT'S TIME FOR A DIFFERENT KIND OF WEB

  Jean-Michel Hiver - Software Director
  [EMAIL PROTECTED]
  +44 (0)114 255 8097

  VISIT HTTP://WWW.MKDOC.COM



Re: Optional HTTP Authentication ?

2002-07-01 Thread Les Mikesell

From: Jean-Michel Hiver [EMAIL PROTECTED]

 Oh but I have that already. I know that I need to password protect
 
 /properties.html
 /content.html
 /move.html
 /foo/properties.html
 /foo/content.html
 /foo/move.html
 etc...
 
 Is it possible to password-protect a class of URIs using regexes? That
 would be another good option.

I thought you meant that you wanted the same location to be
accessed by different people with/without passwords.  You
should be able to put the authentication directives in a 
<LocationMatch> container in this case.   Another approach
would be to use mod_rewrite to map the request to a directory
containing a symlink to the script and an appropriate .htaccess file.
This is kind of brute-force but it lets you do anything you want with
a request including proxying to an otherwise unreachable port or
server for certain content. Unfortunately I think the symlink approach
appears as a different script to mod_perl so it will cache a separate
copy in memory.

   Les Mikesell
[EMAIL PROTECTED]



Re: Optional HTTP Authentication ?

2002-07-01 Thread Robert Landrum

On Mon, Jul 01, 2002 at 10:30:36AM +0100, Jean-Michel Hiver wrote:
   browser sent the credentials, or leave $ENV{REMOTE_USER} undef
   otherwise, without sending a 401 back.
  
  I didn't think a browser would send authentication unless the server
  requested it for an authentication domain.  How are you going to 
  get some people to send the credentials and some not unless you
  use different URLs so the server knows when to request them?
 
 The idea is that on a location which requires authentication I'll
 redirect the user to a /login.html, or maybe a /?login=1 which will do
 the following:

Umm... Perhaps I don't understand the significance of the login.html.  Under
HTTP auth, if a page is protected via .htaccess then auth is immediately 
requested, and no redirect is possible.

More important is the fact that if a page does not require authentication,
the user's login and password will not be sent.  So a page like index.html that
is not normally authenticated will not receive the username, and no
<a href="/admin">Admin this page</a> will be possible.

I'm not 100% sure this is possible without the use of cookies.  I'm pretty sure
you could write some custom handler to handle the auth, but without a cookie
to note which users have authenticated, you might be out of luck.

Good luck,

Rob



Re: Optional HTTP Authentication ?

2002-07-01 Thread Jean-Michel Hiver

Thanks to the list and two days of hard work, I have my optional HTTP
authentication thingie working :-)

Basically here is how it looks in my apache config file:

# This method handler ensures that users must enter their credentials
# for any URI which looks like /foo/bar/login.html
<LocationMatch ".*/login.html$">
  AuthName "MKDoc Login"
  AuthType Basic
  PerlAuthenHandler MKDoc::Auth::SQL_HTTP->handler
  require valid-user
</LocationMatch>

# This method handler affects the whole site, it sets the
# $ENV{REMOTE_USER} variable if the credentials have been sent, or
# leaves it undef otherwise.
<Location />
  PerlFixupHandler MKDoc::Auth::SQL_HTTP->handler_opt
</Location>

# if the user successfully logged in when hitting a /foo/bar/login.html
# location, then we want to redirect him where he came from
<LocationMatch ".*/login.html$">
  SetHandler perl-script
  PerlHandler MKDoc::Auth::SQL_HTTP->handler_redirect
  require valid-user
</LocationMatch>

more perl handlers here


* Now if you go to /properties.html BEFORE sending the credentials,
* You're redirected to /login.html?from=/properties.html where you login,
* Which redirects you to /properties.html... but this time your browser
sends the credentials!

This is interesting because it's up to the handlers to decide whether
they need authentication or not and does not depend on the location.


 More important is the fact that if a page does not require authentication,
 the users login and password will not be sent.  So a page like index.html that
 is not normally authenticated will not receive the username, and no
 a href=/adminAdmin this page/a will be possible.

This is not true, once you've entered the credentials on /login.html the
browsers send them everywhere. Tested under Opera (Linux), Mozilla
(Linux) and IE from version 3 to version 6 (Windows), IE 3 (Mac),
Netscape 4 (Mac).

One exception: links :-(. But the browser support seems to be there...

In the future I plan to have some kind of hybrid handler which would
accept either HTTP credentials OR a cookie... that would be cool :-)


 I'm not 100% sure this is possible without the use of cookies.  I'm pretty sure
 you could write some custom handler to handle the auth, but without a cookie
 to note which users have authenticated, you might be out of luck.

Well I seem to have done it, so it must be possible thanks to you guys
;-). I will send the code to anyone who's interested but I don't want
to post it to the list because I suspect that most people aren't.


Thank you everyone,
Cheers,
-- 
IT'S TIME FOR A DIFFERENT KIND OF WEB

  Jean-Michel Hiver - Software Director
  [EMAIL PROTECTED]
  +44 (0)114 255 8097

  VISIT HTTP://WWW.MKDOC.COM



Re: Optional HTTP Authentication ?

2002-07-01 Thread Ged Haywood

Hi there,

On 30 Jun 2002, Randal L. Schwartz wrote:

 What?  The EU is going to make cookies *illegal*?  I highly doubt this.

There is already EU legislation which might make the use of cookies suspect.
It concerns, for example, the monitoring of individual keyboard operators
to measure their performance.  That's been illegal in the EU for some time.
You only have to start counting your cookies to be treading on shaky ground.

73,
Ged.

BTW it's modperlatperldotapachedotorg




Re: Optional HTTP Authentication ?

2002-07-01 Thread David Dyer-Bennet

Jean-Michel Hiver [EMAIL PROTECTED] writes:

  However, if the structure were
  
  http://bigmegamarket.com/index.pl/56765454151/grocery/fruits/bananas,
  say, with the number being the session ID, the URL then is hackable
  within that (good) definition.
 
 Yes, however there are quite a number of issues with bookmarks and
 search engines... But that's for sure another interesting and less-ugly
 option.

Very true.  I was solving only the stated problem, and didn't think
much about the other problems that would then appear. 

  I'm a big fan of cookies myself, for the thing they were made for,
  namely session tracking.  I share your frustration :-(.
 
 Yep. It's a shame that cookies which were a good idea at first get such
 a bad name because of all these moronic marketing companies which dream
 of knowing you inside out to send you more shit spam abuse them. But I'm
 off topic here :-)

And that's all it is; a bad *name*.  With the option to refuse to
deliver cookies to a domain different from the domain of the top-level
page, they have no actual problems.  And they solve the problem
they're supposed to solve nearly perfectly. 

Obviously for individual projects one has to do what the people with
the checkbook say, but we shouldn't be just rolling over on cookies;
we should be arguing the point strenuously.
-- 
David Dyer-Bennet, [EMAIL PROTECTED]  /  New TMDA anti-spam in test
 John Dyer-Bennet 1915-2002 Memorial Site http://john.dyer-bennet.net
Book log: http://www.dd-b.net/dd-b/Ouroboros/booknotes/
 New Dragaera mailing lists, see http://dragaera.info



Re: [OT] Optional HTTP Authentication ?

2002-06-30 Thread Jean-Michel Hiver

 This seems a little off topic.  I think this is an architecture question, not
 a mod perl question.

Well, a bit of both I guess.

 Basically, you want all you protected files to be located in /protected or
 some other directory...

No that is not possible. I am running a web application, there are no
such things as 'files' (everything is done using PATH_INFO), only
locations.

Users can create as many locations as they want (i.e. /foo/bar/) and
administrate them using URIs such as /foo/bar/properties.html,
/foo/bar/contents.html, etc.

There are some locations which do not need to be protected, i.e.

/foo/bar/
/foo/bar/print.html
/foo/bar/dc.xml
/foo/bar/rss100.rdf


But some others need to, like:

/foo/bar/properties.html
/foo/bar/contents.html
/foo/bar/move.html
etc.


I want to use HTTP authentication for that, but of course I cannot
password protect the whole site, because public users would not be so
happy!

Any ideas?
-- 
IT'S TIME FOR A DIFFERENT KIND OF WEB

  Jean-Michel Hiver - Software Director
  [EMAIL PROTECTED]
  +44 (0)114 255 8097

  VISIT HTTP://WWW.MKDOC.COM



Re: [OT] Optional HTTP Authentication ?

2002-06-30 Thread Jean-Michel Hiver

 Oh, I don't know, I think the poster was asking about how to produce this
 effect with mod_perl.  He wants to know *whether* a login was provided, even
 on a *non-protected* page.  That would let you say (while serving any old
 page):
 
 if( $ENV{REMOTE_USER} eq 'admin' ) {
   $r->print('Yo, you can do <a href="/admin/">extra kewl stuff</a> here.');
 }

Yes, that is quite the case.


 In one of the earlier stages of processing - maybe a FixupHandler or ? a
 AuthenHandler might be appropriate - you can do something like this:
 
 my $a = $r->header_in('Authorization');
 $a =~ s/^Basic (.*)/$1/;
 my( $user, $pass ) = split(':', decode_base64( $a ) );
 
 if( check the username/password as you wish ) {
   $ENV{REMOTE_USER} = $user;
 }
 
 So, now you can tell later during the request with a username/password was
 offered (and you know it was a valid login/pass combo).

That's very interesting! I don't think I can use an auth handler because
then I would have to password protect the whole site (which I don't want
to).

I want to have just ONE page which is password protected (i.e.
/login.html). The page would just be a redirect, but once the user
entered his credentials then the browser should send them on the whole
site and then I could do the following:

/foo/properties.html

  IF authenticated
 IF authorized = trigger /foo/properties.html
 ELSE  = send custom error page
  ELSE
 redirect to /login.html?from=uri


Anyway I'm going to try that fixup handler thingie and I'll tell you how
it goes :-)

Cheers,
-- 
IT'S TIME FOR A DIFFERENT KIND OF WEB

  Jean-Michel Hiver - Software Director
  [EMAIL PROTECTED]
  +44 (0)114 255 8097

  VISIT HTTP://WWW.MKDOC.COM
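The Authorization-header parsing quoted in this thread can be written as a small standalone helper using the core MIME::Base64 module; the function name below is mine, for illustration:

```perl
use strict;
use warnings;
use MIME::Base64 qw(decode_base64);

# Split a "Basic ..." Authorization header value into user and password,
# as in the fixup-handler snippet quoted above. Returns an empty list
# for anything that isn't a Basic credential.
sub parse_basic_auth {
    my ($header) = @_;
    return unless defined $header && $header =~ /^Basic +(\S+)/;
    my ($user, $pass) = split /:/, decode_base64($1), 2;
    return ($user, $pass);
}
```

The 2-argument limit on `split` matters: passwords may themselves contain colons, and only the first colon separates user from password.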



Re: Optional HTTP Authentication ?

2002-06-30 Thread Jean-Michel Hiver

 It seems that Apache::AuthCookie allows a way to make areas
 to which one can authenticate if s/he wants. I suppose that 
 then in those areas you can tell if the user is logged in 
 and affect the pages if so.

Indeed the best option would be to be using one of the Apache::Session
module and use the provided hash to store the login information. I have
read the whole portion of the eagle book dedicated to authentication /
authorization before posting my crack-smoked question to the list ;-)

Unfortunately:

* For political reasons and compliance with future european legislation
  I cannot use cookies,

* For usability reasons encoding session IDs on URIs would be really
  bad... users needs to be able to 'hack' the URIs without f***ing their
  sessions!

Therefore I have to use HTTP authentication...
Cheers,
-- 
IT'S TIME FOR A DIFFERENT KIND OF WEB

  Jean-Michel Hiver - Software Director
  [EMAIL PROTECTED]
  +44 (0)114 255 8097

  VISIT HTTP://WWW.MKDOC.COM



Re: Optional HTTP Authentication ?

2002-06-30 Thread Randal L. Schwartz

 Jean-Michel == Jean-Michel Hiver [EMAIL PROTECTED] writes:

Jean-Michel * For political reasons and compliance with future european legislation
Jean-Michel   I cannot use cookies,

What?  The EU is going to make cookies *illegal*?  I highly doubt
this.

Jean-Michel * For usability reasons encoding session IDs on URIs would be really
Jean-Michel   bad... users needs to be able to 'hack' the URIs without f***ing their
Jean-Michel   sessions!

Why is a user hacking their URLs?

Jean-Michel Therefore I have to use HTTP authentication...

Even though the user/password is transmitted *in the clear* on
*every single hit*, because you can't just use a session identifier?
This is so very wrong from a security perspective.


-- 
Randal L. Schwartz - Stonehenge Consulting Services, Inc. - +1 503 777 0095
[EMAIL PROTECTED] URL:http://www.stonehenge.com/merlyn/
Perl/Unix/security consulting, Technical writing, Comedy, etc. etc.
See PerlTraining.Stonehenge.com for onsite and open-enrollment Perl training!



Re: Optional HTTP Authentication ?

2002-06-30 Thread Jean-Michel Hiver

 What?  The EU is going to make cookies *illegal*?  I highly doubt
 this.

Sorry, I am neither the lawyer nor the client, so I can't tell you...
I know it's really stupid, but I am going to have to deal without
cookies.

 Jean-Michel * For usability reasons encoding session IDs on URIs would be really
 Jean-Michel   bad... users needs to be able to 'hack' the URIs without f***ing their
 Jean-Michel   sessions!
 
 Why is a user hacking their URLs?

I can answer that.  http://www.useit.com/alertbox/990321.html

cite
  * a domain name that is easy to remember and easy to spell
  * short URLs
  * easy-to-type URLs
  * URLs that visualize the site structure
  * URLs that are hackable to allow users to move to higher levels of
the information architecture by hacking off the end of the URL
  * persistent URLs that don't change 
/cite

i.e. http://bigmegamarket.com/grocery/fruits/bananas/ is cool,
http://bigmegamarket.com/index.pl?id=231223412sid=56765454151 is not.

Again it doesn't always make implementation easy :-/ 

 Jean-Michel Therefore I have to use HTTP authentication...
 
 Even though the user/password is transmitted *in the clear* on
 *every single hit*, because you can't just use a session identifier?
 This is so very wrong from a security perspective.

I have to agree with you on that. Cookies are probably far better than
HTTP authentication. But I cannot use cookies. Period. I wish I could,
because this was what I did in the first place and it was working fine!

Cheers,
-- 
IT'S TIME FOR A DIFFERENT KIND OF WEB

  Jean-Michel Hiver - Software Director
  [EMAIL PROTECTED]
  +44 (0)114 255 8097

  VISIT HTTP://WWW.MKDOC.COM



Re: Optional HTTP Authentication ?

2002-06-30 Thread Peter Bi

Note that the idea of this kind of authentication is to encrypt the
ticket, rather than use a plain session ID.  If cookies are not available,
having it on the URI is a good idea. (Then one needs to make all links
relative; see the Cookbook.) A cookie by itself does not make a secure
session ID or a secure ticket. It is the encryption that does.
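The signed-ticket idea can be sketched in a few lines of Perl (a minimal
illustration only: the secret, the ticket layout, and the function names are
invented here, and real code would prefer a proper HMAC such as
Digest::HMAC_SHA1 over this bare keyed hash):

```perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

my $secret = 'server-side secret';   # invented for illustration

# A tamper-evident ticket: the session id plus a keyed hash of it.
sub make_ticket {
    my ($sid) = @_;
    return "$sid:" . md5_hex($sid . $secret);
}

# Returns the session id if the hash checks out, undef otherwise.
sub check_ticket {
    my ($ticket) = @_;
    my ($sid, $mac) = split /:/, $ticket, 2;
    return (defined $mac && $mac eq md5_hex($sid . $secret)) ? $sid : undef;
}

my $t = make_ticket('56765454151');
print defined check_ticket($t) ? "ok\n" : "tampered\n";   # prints "ok"
```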

Peter Bi

- Original Message -
From: Jean-Michel Hiver [EMAIL PROTECTED]
To: Randal L. Schwartz [EMAIL PROTECTED]
Cc: Jean-Michel Hiver [EMAIL PROTECTED]; Andrew Moore
[EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Sunday, June 30, 2002 10:07 AM
Subject: Re: Optional HTTP Authentication ?


  What?  The EU is going to make cookies *illegal*?  I highly doubt
  this.

 Sorry, I am neither the lawyer nor the client, so I can't tell you...
 I know it's really stupid, but I am going to have to deal without
 cookies.

  Jean-Michel * For usability reasons encoding session IDs on URIs would
be really
  Jean-Michel   bad... users needs to be able to 'hack' the URIs without
f***ing their
  Jean-Michel   sessions!
 
  Why is a user hacking their URLs?

 I can answer that.  http://www.useit.com/alertbox/990321.html

 <cite>
   * a domain name that is easy to remember and easy to spell
   * short URLs
   * easy-to-type URLs
   * URLs that visualize the site structure
   * URLs that are hackable to allow users to move to higher levels of
 the information architecture by hacking off the end of the URL
   * persistent URLs that don't change
 </cite>

 i.e. http://bigmegamarket.com/grocery/fruits/bananas/ is cool,
 http://bigmegamarket.com/index.pl?id=231223412&sid=56765454151 is not.

 Again it doesn't always make implementation easy :-/

  Jean-Michel Therefore I have to use HTTP authentication...
 
  Even though the user/password is transmitted *in the clear* on
  *every single hit*, because you can't just use a session identifier?
  This is so very wrong from a security perspective.

 I have to agree with you on that. Cookies are probably far better than
 HTTP authentication. But I cannot use cookies. Period. I wish I could,
 because this was what I did in the first place and it was working fine!

 Cheers,
 --
 IT'S TIME FOR A DIFFERENT KIND OF WEB
 
   Jean-Michel Hiver - Software Director
   [EMAIL PROTECTED]
   +44 (0)114 255 8097
 
   VISIT HTTP://WWW.MKDOC.COM




Re: Optional HTTP Authentication ?

2002-06-30 Thread Jean-Michel Hiver

On Sun 30-Jun-2002 at 10:47:26AM -0700, Peter Bi wrote:
 Please check that the idea of this kind of authentication is to encrypt the
 ticket, instead of a plain session ID.  If cookie is not available,  having
 it on URI is a good idea. (Then one needs to have all links in a relative
 manner; see the Cookbook). Cookie itself does not make a secure session ID
 or a secure ticket. It is the encryption that does.

I *CANNOT* use cookies or URIs for any kind of session tracking.
Otherwise I don't think I would have posted this message to the list in
the first place :-)

I agree that HTTP Basic authentication is totally and utterly ugly, but I
am going to have to stick with it no matter what... My problem is:

How do I tell Apache to set the $ENV{REMOTE_USER} variable if the
browser sent the credentials, and leave $ENV{REMOTE_USER} undef
otherwise, without sending a 401 back?

Cheers,
-- 
IT'S TIME FOR A DIFFERENT KIND OF WEB

  Jean-Michel Hiver - Software Director
  [EMAIL PROTECTED]
  +44 (0)114 255 8097

  VISIT HTTP://WWW.MKDOC.COM



Re: [OT] Optional HTTP Authentication ?

2002-06-30 Thread Jean-Michel Hiver

 In one of the earlier stages of processing - maybe a FixupHandler or an
 AuthenHandler might be appropriate - you can do something like this:
 
 # decode_base64() comes from MIME::Base64
 my $a = $r->header_in('Authorization');
 $a =~ s/^Basic (.*)/$1/;
 my( $user, $pass ) = split(':', decode_base64( $a ) );
 
 if( check the username/password as you wish ) {
   $ENV{REMOTE_USER} = $user;
 }

OK, I got this working using a fixup handler BUT there is a nasty trap.

It happens that environment variables set from Perl aren't inherited
by sub-processes... which means that this technique is fine
if the script that comes after authentication runs under
Apache::Registry.

Unfortunately, I might need the script to run under mod_cgi... I
couldn't find how to tell the Apache server to set environment
variables in the mod_perl pocket reference; does anyone have an idea?
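A hedged sketch of that approach follows. The parsing helper is invented for
illustration; the real point is in the comment: in mod_perl 1.x,
$r->subprocess_env() (unlike a plain %ENV assignment) sets variables that
sub-processes such as mod_cgi scripts inherit.

```perl
use strict;
use warnings;
use MIME::Base64 qw(decode_base64 encode_base64);

# Turn an "Authorization: Basic ..." header value into (user, pass);
# returns the empty list when the browser sent no credentials.
# The 2-argument split limit keeps colons inside the password intact.
sub parse_basic_credentials {
    my ($header) = @_;
    return () unless defined $header && $header =~ /^Basic\s+(\S+)/;
    return split /:/, decode_base64($1), 2;
}

# Inside a mod_perl 1.x Fixup handler one would then do something like
# (hypothetical fragment; check_password() is whatever check you wish):
#
#   my ($user, $pass) =
#       parse_basic_credentials($r->header_in('Authorization'));
#   if (defined $user && check_password($user, $pass)) {
#       $r->subprocess_env(REMOTE_USER => $user);  # visible to mod_cgi too
#   }
#   return OK;   # never a 401, so credentials stay optional

my ($u, $p) = parse_basic_credentials('Basic ' . encode_base64('alice:secret', ''));
print "$u/$p\n";   # prints "alice/secret"
```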

Cheers,
-- 
IT'S TIME FOR A DIFFERENT KIND OF WEB

  Jean-Michel Hiver - Software Director
  [EMAIL PROTECTED]
  +44 (0)114 255 8097

  VISIT HTTP://WWW.MKDOC.COM



Re: Optional HTTP Authentication ?

2002-06-30 Thread David Dyer-Bennet

Jean-Michel Hiver [EMAIL PROTECTED] writes:

  Why is a user hacking their URLs?
 
 I can answer that.  http://www.useit.com/alertbox/990321.html
 
 <cite>
   * a domain name that is easy to remember and easy to spell
   * short URLs
   * easy-to-type URLs
   * URLs that visualize the site structure
   * URLs that are hackable to allow users to move to higher levels of
 the information architecture by hacking off the end of the URL
   * persistent URLs that don't change 
 </cite>

I generally agree with Alertbox, and I agree in this instance.

 i.e. http://bigmegamarket.com/grocery/fruits/bananas/ is cool,
 http://bigmegamarket.com/index.pl?id=231223412&sid=56765454151 is not.

Both true.

However, if the structure were

http://bigmegamarket.com/index.pl/56765454151/grocery/fruits/bananas,
say, with the number being the session ID, the URL then is hackable
within that (good) definition.

 Again it doesn't always make implementation easy :-/ 

True enough; and my proposal is a bit harder to implement.
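That layout is easy enough to handle on the server side, though. A minimal,
purely illustrative sketch (the helper name is invented; under
Apache::Registry the path below would arrive in $ENV{PATH_INFO}):

```perl
use strict;
use warnings;

# Split a path-info string like "/56765454151/grocery/fruits/bananas"
# into the session id and the remaining (hackable) resource path,
# following the URL layout proposed above.
sub split_session_path {
    my ($path_info) = @_;
    my (undef, $sid, @rest) = split m{/}, $path_info;
    return ($sid, '/' . join('/', @rest));
}

my ($sid, $path) = split_session_path('/56765454151/grocery/fruits/bananas');
print "$sid $path\n";   # prints "56765454151 /grocery/fruits/bananas"
```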

I'm a big fan of cookies myself, for the thing they were made for,
namely session tracking.  I share your frustration :-(.
-- 
David Dyer-Bennet, [EMAIL PROTECTED]  /  New TMDA anti-spam in test
 John Dyer-Bennet 1915-2002 Memorial Site http://john.dyer-bennet.net
Book log: http://www.dd-b.net/dd-b/Ouroboros/booknotes/
 New Dragaera mailing lists, see http://dragaera.info



Re: Optional HTTP Authentication ?

2002-06-30 Thread Peter Bi

Hi, Jean-Michel: the official way to retrieve the remote user name under
Basic Authentication is to call $r->connection->user() ($r->user() in
mod_perl 2.0, I think). With a ticket authentication, one gets the user name
in the same way only AFTER the access control phase, because it is simulated
from the ticket; see e.g. my Apache::CookieAccess source at
modperl.home.att.net. BTW, for me, Basic Authentication is not that ugly; it
has been surprisingly stable (more so than most other Apache ideas) since day one.

Peter Bi

- Original Message -
From: Jean-Michel Hiver [EMAIL PROTECTED]
To: Peter Bi [EMAIL PROTECTED]
Cc: Jean-Michel Hiver [EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Sunday, June 30, 2002 12:20 PM
Subject: Re: Optional HTTP Authentication ?


 On Sun 30-Jun-2002 at 10:47:26AM -0700, Peter Bi wrote:
  Please check that the idea of this kind of authentication is to encrypt
the
  ticket, instead of a plain session ID.  If cookie is not available,
having
  it on URI is a good idea. (Then one needs to have all links in a
relative
  manner; see the Cookbook). Cookie itself does not make a secure session
ID
  or a secure ticket. It is the encryption that does.

 I *CANNOT* use cookies nor URIs for any kind of session tracking.
 Otherwise I don't think I would have posted this message to the list in
 the first place :-)

 I agree that HTTP Basic authentication is totally and uterly ugly, but I
 am going to have to stick with it no matter what... My problem is:

 How do I tell apache to set the $ENV{REMOTE_USER} variable if the
 browser sent the credentials, or leave $ENV{REMOTE_USER} undef
 otherwise, without sending a 401 back.

 Cheers,
 --
 IT'S TIME FOR A DIFFERENT KIND OF WEB
 
   Jean-Michel Hiver - Software Director
   [EMAIL PROTECTED]
   +44 (0)114 255 8097
 
   VISIT HTTP://WWW.MKDOC.COM





Re: Optional HTTP Authentication ?

2002-06-30 Thread Les Mikesell

From: Jean-Michel Hiver [EMAIL PROTECTED]
 
 I *CANNOT* use cookies nor URIs for any kind of session tracking.
 Otherwise I don't think I would have posted this message to the list in
 the first place :-)
 
 I agree that HTTP Basic authentication is totally and uterly ugly, but I
 am going to have to stick with it no matter what... My problem is:
 
 How do I tell apache to set the $ENV{REMOTE_USER} variable if the
 browser sent the credentials, or leave $ENV{REMOTE_USER} undef
 otherwise, without sending a 401 back.

I didn't think a browser would send authentication unless the server
requested it for an authentication domain.  How are you going to 
get some people to send the credentials and some not unless you
use different URLs so the server knows when to request them?
Note that you don't have to embed session info here, just add
some element to the URL that serves as the point where you
request credentials and omit it for people that don't log in.  Or
redirect to a different vhost that always requires authentication but
serves the same data.
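In httpd.conf terms that last suggestion might look something like this (a
sketch only; the paths, realm name, and user file are invented):

```apache
# Same documents, two entry points: only /members/ demands credentials.
Alias /members/ /var/www/site/
<Location /members/>
    AuthType Basic
    AuthName "Members"
    AuthUserFile /etc/httpd/users
    Require valid-user
</Location>
# Anonymous visitors use the same URLs without the /members/ prefix.
```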


   Les Mikesell
  [EMAIL PROTECTED]




Re: HTTP version conflict with proxy setup.

2002-05-14 Thread Ian Macdonald

Just a quick note on the successful resolution of this; it's a bug in
Apache
1.3.24 with the handling of chunked data from proxies, details here:

http://marc.theaimsgroup.com/?l=apache-httpd-dev&m=101805692511019&w=2
http://marc.theaimsgroup.com/?l=apache-httpd-dev&m=101810478231242&w=2

and there's a patch available here (or your local equivalent):

http://apache.planetmirror.com.au/dist/httpd/patches/apply_to_1.3.24/proxy_http1.1_chunking.patch


Anybody who wants to use the ProxyPass feature in a front-end Apache
1.3.24 to get to a mod_perl-enabled server should install this patch.

Thanks
Ian Macdonald

Ian Macdonald wrote:

 Hi,

 More info on my problem with my proxied (via Rewrite rules) mod_perl
 apache giving responses that were a) topped & tailed with message size
 and 0 respectively, and b) delayed for about 15 seconds; thanks to
 Igor S. for tipping me off that this was chunking and therefore it was a
 HTTP 1.0 vs 1.1 problem, but I still haven't managed to completely solve
 it.

 Short version:

 Can I tell my mod_perl apache to either only ever respond with 1.0 HTTP,
 or to not use chunking? (Or how do I tell my main server not to
 promote the version when it routes the original request to the
 mod_perl server?)

 Setting force-response-1.0 & downgrade-1.0 to 1 doesn't seem to work
 (completely).

 Long version:

 The problem appears to be that the browser I'm using for testing
 (Netscape 4.78 on Redhat) is issuing GET xxx 1.0 requests; these are
 recognised as perl resources and passed to the mod_perl apache but the
 new requests are like GET xxx 1.1. The mod_perl apache obediently sends
 back a chunked 1.1 answer, which the browser can't handle. As
 confirmation I tried a different test machine running Netscape 6.? and
 the aberrant behaviour disappeared.  When I go straight to the mod_perl
 server by specifying the port in the URL, it works because the original
 request is 1.0 and so that is what is returned.

 I've tried adding SetEnv force-response-1.0 1 and SetEnv
 downgrade-1.0 1 in both servers' configs, but the only effect I've
 noticed is when I add force-response-1.0 to the main server, the delay
 goes away; the size & 0 (i.e. the chunking info) still top & tail the
 response.

 Thanks
 Ian Macdonald
 [EMAIL PROTECTED]



HTTP version conflict with proxy setup.

2002-05-13 Thread Ian Macdonald

Hi,

More info on my problem with my proxied (via Rewrite rules) mod_perl
apache giving responses that were a) topped & tailed with message size
and 0 respectively, and b) delayed for about 15 seconds; thanks to
Igor S. for tipping me off that this was chunking and therefore it was a
HTTP 1.0 vs 1.1 problem, but I still haven't managed to completely solve
it.

Short version:

Can I tell my mod_perl apache to either only ever respond with 1.0 HTTP,
or to not use chunking? (Or how do I tell my main server not to
promote the version when it routes the original request to the
mod_perl server?)

Setting force-response-1.0 & downgrade-1.0 to 1 doesn't seem to work
(completely).

Long version:

The problem appears to be that the browser I'm using for testing
(Netscape 4.78 on Redhat) is issuing GET xxx 1.0 requests; these are
recognised as perl resources and passed to the mod_perl apache but the
new requests are like GET xxx 1.1. The mod_perl apache obediently sends
back a chunked 1.1 answer, which the browser can't handle. As
confirmation I tried a different test machine running Netscape 6.? and
the aberrant behaviour disappeared.  When I go straight to the mod_perl
server by specifying the port in the URL, it works because the original
request is 1.0 and so that is what is returned.

I've tried adding SetEnv force-response-1.0 1 and SetEnv
downgrade-1.0 1 in both servers' configs, but the only effect I've
noticed is when I add force-response-1.0 to the main server, the delay
goes away; the size & 0 (i.e. the chunking info) still top & tail the
response.
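For what it's worth, the variable can also be set on the front-end proxy rule
itself, which is sometimes the missing piece (a sketch; the URL pattern and
back-end port are invented, but the RewriteRule [E=...] flag and the
force-response-1.0/downgrade-1.0 variables are standard Apache 1.3):

```apache
# Front-end server httpd.conf (sketch)
RewriteEngine On
# Proxy /perl/ requests to the mod_perl back end and mark the request so
# Apache sends an unchunked HTTP/1.0-style response:
RewriteRule ^/perl/(.*)$ http://localhost:8080/perl/$1 [P,E=force-response-1.0:1]
# Or target only the browsers that can't cope:
BrowserMatch "^Mozilla/4" force-response-1.0 downgrade-1.0
```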

Thanks
Ian Macdonald
[EMAIL PROTECTED]



mod_perl jobs at http://perl.apache.org/jobs.html

2002-02-25 Thread Stas Bekman

While testing the links on the new site I've stumbled upon this link: 
http://perl.apache.org/jobs.html and was surprised to find out that 
there are fresh mod_perl jobs there and while not many, there seem to be 
at least one post every few days. So if you didn't know about this 
resource, now you do ;)

Hopefully the new mod_perl cookbook will create even more jobs for the 
mod_perl community.

And to remind you that it's OK to post both job offers and job requests 
to this list, as long as you start the subject with a proper tag, 
something like [JOB WANTED], [JOB OFFER] or similar.

_
Stas Bekman JAm_pH  --   Just Another mod_perl Hacker
http://stason.org/  mod_perl Guide   http://perl.apache.org/guide
mailto:[EMAIL PROTECTED]  http://ticketmaster.com http://apacheweek.com
http://singlesheaven.com http://perl.apache.org http://perlmonth.com/




Re: ANNOUNCE: HTTP::TestEngine v0.02

2002-01-29 Thread Dave Rolsky

On Tue, 29 Jan 2002, Chris Brooks wrote:

 I have released version 0.02 of HTTP::TestEngine to sourceforge.
 TestEngine acts as an http session recorder.  After setting a cookie,
 a user can record a session by simply clicking links in their browser:
 filenames, paths and parameters are written to the filesystem for recall
 in the future.  Session data is stored in test harness-independent
 format, and a second module, HTTP::SessionConfig is responsible for
 converting the session data into a data structure that is appropriate
 for play-back.  HTTP::Monkeywrench is currently supported, with plans
 for HTTP::WebTest support in the future.

This is funny.  I was working on something very similar recently.  It was
designed to be able to work with both vanilla CGI and mod_perl (via
Apache::Filter).  It's more cross-platform than your current code and also
records a lot more info.  Maybe we can work together to integrate our
code.  We can probably talk more off the list.


-dave

/*==
www.urth.org
we await the New Sun
==*/





Re: ANNOUNCE: HTTP::TestEngine v0.02

2002-01-29 Thread Ilya Martynov

 On Tue, 29 Jan 2002 13:03:56 -0600 (CST), Dave Rolsky [EMAIL PROTECTED] said:

DR On Tue, 29 Jan 2002, Chris Brooks wrote:
 I have released version 0.02 of HTTP::TestEngine to sourceforge.
 TestEngine acts as an http session recorder.  After setting a cookie,
 a user can record a session by simply clicking links in their browser:
 filenames, paths and parameters are written to the filesystem for recall
 in the future.  Session data is stored in test harness-independent
 format, and a second module, HTTP::SessionConfig is responsible for
 converting the session data into a data structure that is appropriate
 for play-back.  HTTP::Monkeywrench is currently supported, with plans
 for HTTP::WebTest support in the future.

DR This is funny.  I was working on something very similar recently.  It was
DR designed to be able to work with both vanilla CGI and mod_perl (via
DR Apache::Filter).  It's more cross-platform than your current code and also
DR records a lot more info.  Maybe we can work together to integrate our
DR code.  We can probably talk more off the list.

I'm quite interested in this project too (as maintainer of
HTTP::WebTest :). Something like HTTP::TestEngine which can record
tests for HTTP::WebTest has been on my TODO list for a long time. I just
had no time for it ;(

-- 
 -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
| Ilya Martynov (http://martynov.org/)  TIV.net (http://tiv.net/) |
| GnuPG 1024D/323BDEE6 D7F7 561E 4C1D 8A15 8E80  E4AE BE1A 53EB 323B DEE6 |
 -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-



is there something wrong with my http header?

2002-01-28 Thread Alex Porras

I'm trying to print a gif image to the browser, but it's appearing as
text.  Here's what the output looks like (used lynx --mime_header):

HTTP/1.1 200 OK
Date: Mon, 28 Jan 2002 21:58:05 GMT
Server: Apache/1.3.20 (Unix) mod_perl/1.26
Set-Cookie: FOO=bar; domain=foo.bar; path=/
Pragma: no-cache
Cache-control: no-cache
Connection: close
Content-Type: image/gif
Expires: Mon, 28 Jan 2002 21:58:05 GMT

R0lGODdhOAAWAPcAVQAAqgAA/wAkAAAkVQAkqgAk/wBJAABJVQBJqgBJ/wBtAABt
VQBt
...(more data)...


My script does the following:

$r->no_cache( 1 );
$r->content_type('image/gif');
$r->send_http_header;
$r->print( $data );
$r->exit(OK);

Any thoughts?

TIA,

--Alex



Re: is there something wrong with my http header?

2002-01-28 Thread Robert Landrum

I'm trying to print a gif image to the browser, but it's appearing as
text.  Here's what the output looks like (used lynx --mime_header):

HTTP/1.1 200 OK
Date: Mon, 28 Jan 2002 21:58:05 GMT
Server: Apache/1.3.20 (Unix) mod_perl/1.26
Set-Cookie: FOO=bar; domain=foo.bar; path=/
Pragma: no-cache
Cache-control: no-cache
Connection: close
Content-Type: image/gif
Expires: Mon, 28 Jan 2002 21:58:05 GMT

R0lGODdhOAAWAPcAVQAAqgAA/wAkAAAkVQAkqgAk/wBJAABJVQBJqgBJ/wBtAABt
VQBt
...(more data)...


Uhh... That's not gif data.  gif data should start

GIF89a...(more data)...

Rob


--
When I used a Mac, they laughed because I had no command prompt. When 
I used Linux, they laughed because I had no GUI.  



RE: is there something wrong with my http header?

2002-01-28 Thread Alex Porras

I'm a goof.  That data is from an imap server--I forgot to decode it
first.

Thanks,

--Alex
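For the archives, the fix is a one-liner with MIME::Base64 (sketch; $encoded
stands in for the literal fetched from the IMAP server):

```perl
use strict;
use warnings;
use MIME::Base64 qw(decode_base64);

# $encoded is the base64 text as pulled from the IMAP server;
# this short string is just an illustrative GIF header.
my $encoded = 'R0lGODdhOAAWAPcA';
my $data = decode_base64($encoded);

# A real GIF starts with the magic bytes "GIF87a" or "GIF89a":
print substr($data, 0, 6), "\n";   # prints "GIF87a"
```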

 -Original Message-
 From: Robert Landrum [mailto:[EMAIL PROTECTED]]
 Sent: Monday, January 28, 2002 4:16 PM
 To: [EMAIL PROTECTED]
 Subject: Re: is there something wrong with my http header?
 
 
 I'm trying to print a gif image to the browser, but it's appearing as
 text.  Here's what the output looks like (used lynx --mime_header):
 
 HTTP/1.1 200 OK
 Date: Mon, 28 Jan 2002 21:58:05 GMT
 Server: Apache/1.3.20 (Unix) mod_perl/1.26
 Set-Cookie: FOO=bar; domain=foo.bar; path=/
 Pragma: no-cache
 Cache-control: no-cache
 Connection: close
 Content-Type: image/gif
 Expires: Mon, 28 Jan 2002 21:58:05 GMT
 
 R0lGODdhOAAWAPcAVQAAqgAA/wAkAAAkVQAkqgAk/wBJAABJVQBJq
 gBJ/wBtAABt
 VQBt
 ...(more data)...
 
 
 Uhh... That's not gif data.  gif data should start
 
 GIF89a...(more data)...
 
 Rob
 
 
 --
 When I used a Mac, they laughed because I had no command prompt. When 
 I used Linux, they laughed because I had no GUI.  
 



ANNOUNCE: HTTP::TestEngine v0.02

2002-01-28 Thread Chris Brooks

Good morning,

I have released version 0.02 of HTTP::TestEngine to sourceforge.
TestEngine acts as an http session recorder.  After setting a cookie,
a user can record a session by simply clicking links in their browser:
filenames, paths and parameters are written to the filesystem for recall
in the future.  Session data is stored in test harness-independent
format, and a second module, HTTP::SessionConfig is responsible for
converting the session data into a data structure that is appropriate
for play-back.  HTTP::Monkeywrench is currently supported, with plans
for HTTP::WebTest support in the future.

I would appreciate any feedback and suggestions that people have.  You
can download the latest version at
http://sourceforge.net/projects/http-recorder/

Thanks,
Chris
--

Chris Brooks
Director of Technology
CareScout.com
phone: (781) 431-7033 x 342





META tags added as HTTP headers

2002-01-18 Thread Markus Wichitill

Hi,

which part of an Apache/mod_perl setup is responsible for extracting META
tags from generated HTML and adding them as HTTP headers (even with
PerlSendHeaders Off)? In the case of <META NAME='Blah'> tags, it adds
X-Meta-Blah headers, which are harmless but probably mostly a waste of
bandwidth. But in the case of <META HTTP-EQUIV> tags, it adds non-X headers
without caring about existing headers, often leading to double headers,
which may also conflict:

  shell lwp-request -sued http://www.apache.org/
  GET http://www.apache.org/
  200 OK
  Cache-Control: max-age=86400
  Connection: close
  Date: Fri, 18 Jan 2002 23:54:30 GMT
  Accept-Ranges: bytes
  Server: Apache/2.0.28 (Unix)
  Content-Length: 7810
  Content-Type: text/html
  Content-Type: text/html; charset=iso-8859-1
  Expires: Sat, 19 Jan 2002 23:54:30 GMT
  Title: Welcome! - The Apache Software Foundation
  X-Meta-Author: ASF
  X-Meta-Email: [EMAIL PROTECTED]

The way I understand the HTTP spec, multiple Content-Type headers are
illegal, and even if not, they certainly can cause trouble. I'm not sure
whether the cause for this on the apache.org site is the same as on my test
server, but the result is pretty much the same.



Re: META tags added as HTTP headers

2002-01-18 Thread Bill Moseley


At 01:20 AM 01/19/02 +0100, Markus Wichitill wrote:
which part of an Apache/mod_perl setup is responsible for extracting META
tags from generated HTML and adding them as HTTP headers (even with
PerlSendHeaders Off)?

That's lwp doing that, not Apache or mod_perl.

 HEAD http://www.apache.org
200 OK
Cache-Control: max-age=86400
Connection: close
Date: Sat, 19 Jan 2002 00:27:10 GMT
Accept-Ranges: bytes
Server: Apache/2.0.28 (Unix)
Content-Length: 7810
Content-Type: text/html
Expires: Sun, 20 Jan 2002 00:27:10 GMT
Client-Date: Sat, 19 Jan 2002 00:27:17 GMT
Client-Request-Num: 1
Client-Warning: LWP HTTP/1.1 support is experimental


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]


