Re: apache and php

2007-07-03 Thread John ORourke

Jonathan Vanasco wrote:

i think the only way around it was to recompile everything from scratch.

generally speaking though... don't run mod_php and mod_perl on the 
same server.  you're just going to bloat apache and tie up resources.


I'm running both on Fedora Core 5 and 6, but I had to tweak it 
slightly.  I have a virtualhost which is fully under mod_perl's control 
but includes some php URLs:


In /etc/httpd/conf.d/php.conf:

uncomment: AddType application/x-httpd-php .php

(this puts that MIME type on .php files, which makes PHP process the
file, because the regular AddHandler method is ignored while SetHandler
perl-script is in effect)


In httpd.conf:

<VirtualHost *:80>
   SetHandler perl-script

   <Location /php-stuff>
      SetHandler default  (I've tried with and without this; it can work
      both ways)

   </Location>

</VirtualHost>


cheers
John



Re: questions on serving big file & SQL statement parsing

2007-07-03 Thread Clinton Gormley
  in my case, i need to do authorization. do i need
  extra mod_perl front-end server to do this? how does
  this perform?

the beauty of mod_perl is that you can step in and out of the process
wherever you need to.  The down side is that mod_perl uses a lot of
memory, so you try to keep your mod_perl processes busy for as short a
time as possible, so that they can handle more requests.

The standard way of doing this is by having light weight apache-only
(or nginx or whatever takes your fancy) processes sitting in front of
your mod_perl servers.  They can handle all of the static files, and can
proxy the requests that need mod_perl to the backend mod_perl servers.

So for an image which needs to be authorized, you can use mod_perl to
handle the authorization phase, and then step out of the process to
allow Apache to serve the file itself.  Yes, this would mean hitting a
back end mod_perl server, but it would still be quick as the image
itself only has to be transferred to your proxy front end.
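
For the curious, a minimal sketch of that pattern (the module name and
the cookie check are made-up placeholders; the real authorization logic
is up to you):

    # httpd.conf on the backend:
    #   <Location /protected-images>
    #       PerlAccessHandler My::ImageAuth
    #   </Location>
    # mod_perl runs only the access phase; Apache's default handler
    # then serves the static file as usual.

    package My::ImageAuth;

    use strict;
    use warnings;
    use Apache2::RequestRec ();
    use APR::Table ();
    use Apache2::Const -compile => qw(OK FORBIDDEN);

    sub handler {
        my $r = shift;
        # placeholder check -- look at a cookie, a session, the DB, etc.
        my $cookie = $r->headers_in->get('Cookie') || '';
        return Apache2::Const::FORBIDDEN unless $cookie =~ /session=/;
        return Apache2::Const::OK;
    }

    1;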

As far as performance goes, it depends on how complicated your
authorization is. If you check the time of day, it'll be a lot quicker
than if you check 100 rows in the DB. However, the Perl code itself has
already been compiled into C and is fast. Not quite as fast as the
native apache handler, but still damn fast.

 
  also, will serving the file from the backend mod_perl
  server to the front-end proxy be such a bad idea? I am
  thinking that the modperl server will spit out the
  file to the proxy without having it tied up with the
  upload, and let the proxy handle the file upload. am i
  right about this? ( couldn't find much detail on proxy
  to back it up at this moment )
 

 from what i recall, perlbal will do a subrequest to a backend modperl  
 server for auth, then handle the file upload itself.
 

Using perlbal is certainly one way to go.  
http://www.danga.com/perlbal/

It has worked very well for LiveJournal, but requires an extra degree of
logic planning to integrate it with your site. I would say keep it
simple.  Do the easy stuff first.  If your site is extremely busy and
the simple stuff STILL isn't enough, THEN look at solutions like
perlbal.

 you don't want to do any large file handling, or even general static  
 file handling, under mp if at all possible.  let it handle content  
 generation and authorization as the 'brains' -- thats what it does  
 best.  use other apps like perlbal, nginx, whatever to handle your  
 static files and large uploads.

I have a quibble with this last statement: using mod_perl to handle
file uploads. I may be wrong here, so I'd welcome reasoned disagreement,
but the way I understand it:

 - a proxy front end handles the slow upload of data from the client
 - it forwards it (internal network, thus fast) to your backend mod_perl
   server, which can then process it (eg add it to the DB, resize the 
   image, store it where it needs to be stored), and return the data 
   to the proxy.
 - the proxy sends the data back to the client across the internet,
   which is slow

So:
 - My image uploads require processing, so I need some form of Perl - I 
   already have that available in mod_perl.
 - The mod_perl processes aren't being kept tied up for a long period, 
   because the proxy is handling the slow bit: the transfer.
 - The only downside is that a large image upload plus processing
   could cause the size of the process to grow a lot, reducing the
   number of processes that you can run at once.
 - So use Apache2::SizeLimit to cull too-large processes, or,
   if you know the request is a big one, cull the process manually
   by calling $r->child_terminate (see the sketch just after this list)
 - Keep your upload processing mod_perls on one box, so that if you
   are flooded with uploads, just the upload functionality on your site
   is taken out, rather than your entire site
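
A rough sketch of the size-limit part (numbers are arbitrary; check the
Apache2::SizeLimit docs for the exact interface shipped with your
mod_perl version):

    # In startup.pl:
    use Apache2::SizeLimit ();
    # kill off any child whose total size exceeds ~150MB (value is in KB)
    Apache2::SizeLimit->set_max_process_size(150_000);

    # In httpd.conf, run the check at the end of each request:
    #   PerlCleanupHandler Apache2::SizeLimit

    # Or, inside a handler that knows it just did something huge:
    #   $r->child_terminate;   # have this child exit after the request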


I'd be interested in hearing about other approaches.

Clint



Where to store uploads

2007-07-03 Thread Clinton Gormley
Following on from the thread "questions on serving big file & SQL
statement parsing", I have a related question:

Where do you store your uploads?

I can think of two approaches:

1) In the DB, store the name of the server to which your file has been 
   uploaded

2) Store your upload in a shared partition (eg on a SAN, NFS, 
   iSCSI/OCFS2)

The advantage I see in the second approach is redundancy, the
disadvantage is that there will be a slight performance cost.

Anybody have recommendations/war-stories about these or other
approaches?

thanks

Clint



HTML::StripScripts::LibXML (was Config::Loader and HTML::StripScripts)

2007-07-03 Thread Clinton Gormley
Kjetil Kjernsmo requested a front end to HTML::StripScripts that,
instead of returning HTML text, would return a LibXML Document or
DocumentFragment (ie a DOM tree).

I have released this as HTML::StripScripts::LibXML:
http://search.cpan.org/~drtech/HTML-StripScripts-LibXML-0.10/LibXML.pm

It handles messy HTML, strips out XSS, and gives you fine grained
control of the HTML/XML nodes that are returned.
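
A rough usage sketch -- this assumes the module keeps the
HTML::StripScripts::Parser interface (config hashref to new(),
parse()/eof(), filtered_document()) and only changes what it returns;
check the POD for the real details:

    use HTML::StripScripts::LibXML ();

    my $hss = HTML::StripScripts::LibXML->new( { Context => 'Flow' } );
    $hss->parse( $untrusted_html );
    $hss->eof;

    my $tree = $hss->filtered_document;   # an XML::LibXML node tree
                                          # rather than an HTML string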

If you are interested in this, please give it a try, and give me some
feedback about how to improve it, options to add etc.

The main question mark I have is what to do with encoding - suggestions
welcome.

Also see my question at Perl Monks:
http://www.perlmonks.org/index.pl?node_id=624334

thanks

Clint

On Tue, 2007-06-26 at 16:34 +0200, Kjetil Kjernsmo wrote:
 On Tuesday 26 June 2007 16:22, Clinton Gormley wrote:
   - used to strip XSS scripting from user submitted HTML
 
 Ooooh, cool! I haven't found any modules that do that well enough.
 
   - outputs valid HTML (cleans up nesting, context of tags etc)
 
   - handles the exploits listed at http://ha.ckers.org/xss.html
 
 
 Great!
 
  I hope this helps others, and if anybody has any suggestions, please
  feed them back to me
 
 Actually, something I would feel would be very useful is if it could 
 return an XML::LibXML::DocumentFragment object. 
 
 I tend to use XML::LibXML to parse user input and insert it in the 
 document, which is then going through some XSLT, and since you've 
 already parsed stuff, it seems like a waste to parse again.
 
 So that's my feature request! :-) 
 
 Cheers,
 
 Kjetil



Re: Where to store uploads

2007-07-03 Thread Foo JH
Depending on the number of files you're expecting, you may want to limit 
the number of files you put in a single folder. For example, in your 
shared folder you may want to create 26 subfolders - one for each letter 
of the alphabet. Then you drop the files in the subfolder matching the 
first letter of the filename. There's a bit of creative play on the 
subfolder 'hash', depending again on your expected filename format.
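
Something along these lines (the root directory is just an example path):

    my $file   = 'kitten.jpg';
    my $bucket = lc substr( $file, 0, 1 );          # 'k'
    my $path   = "/var/uploads/$bucket/$file";      # /var/uploads/k/kitten.jpg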


Clinton Gormley wrote:

Following on from the thread "questions on serving big file & SQL
statement parsing", I have a related question:

Where do you store your uploads?

I can think of two approaches:

1) In the DB, store the name of the server to which your file has been 
   uploaded


2) Store your upload in a shared partition (eg on a SAN, NFS, 
   iSCSI/OCFS2)


The advantage I see in the second approach is redundancy, the
disadvantage is that there will be a slight performance cost.

Anybody have recommendations/war-stories about these or other
approaches?

thanks

Clint

  




Trying to install mod_perl 2.0 on SUSE Linux 2.6 (Dual Processor)

2007-07-03 Thread David Weintraub
I am attempting to install mod_perl on a SUSE Linux 2.6.6-7 dual
processor machine. It already has Apache 2.0.49 and Perl 5.8.3
installed. In fact, it already had mod_perl 1.x installed, but I decided
to install mod_perl 2.0.3.

I copied  mod_perl.so to /usr/lib64/apache2/mod_perl.so and made a
symbolic link to /usr/lib64/apache2-prefork/mod_perl.so. I modified
/etc/sysconfig/apache2 so that APACHE_MODULES includes perl.

I was able to successfully install Mason 1.36 and mod_perl 2.0.3, I
tried to go through the documentation, and found out about the renaming.
However, when I restarted Apache, I got the error that
/etc/apache2/mod_perl-startup.pl failed.

I eliminated the "use Apache2 ();" statement and changed all instances
of "Apache" to "Apache2" (which I believe I was supposed to do). I then
had problems with the ENV statement, changed it to look for
$ENV{MOD_PERL}, and tried Apache again, but it failed because
$ENV{MOD_PERL} is not defined.

Here's the /etc/apache2/mod_perl-startup.pl I am using:


$ENV{MOD_PERL} =~ /^CGI-Perl/ or die "MOD_PERL not used!";

#use Apache2 ();

use lib qw(/srv/www/perl-lib);

# enable if the mod_perl 1.0 compatibility is needed
# use Apache::compat ();

use ModPerl::Util (); #for CORE::GLOBAL::exit

#use ModPerl::RequestRec ();
#use ModPerl::RequestIO ();
#use ModPerl::RequestUtil ();
use Apache2::RequestRec ();
use Apache2::RequestIO ();
use Apache2::RequestUtil ();

#use Apache2::Server ();
use Apache2::ServerUtil ();
use Apache2::Connection ();
use Apache2::Log ();

use APR::Table ();

use ModPerl::Registry ();

use Apache2::Const -compile => ':common';
use APR::Const -compile => ':common';

1;

Not even too sure what else I need to add. In fact, I am not even too
sure how Apache starts up. There's two scripts /etc/init.d/apache and
/etc/init.d/apache2, but there is nothing in /etc/init.d/rc.3 that calls
either of those scripts.

--
David Weintraub
[EMAIL PROTECTED] 


Re: passing CGI parameters

2007-07-03 Thread Perrin Harkins

Hi,

There's no need to post your question four times.


I'm trying to bring my application up using ModPerl::PerlRun.  I have
anchors at places in my code like <A CLASS='l2'
HREF='vumenu.cgi?str=$govlevel~$vufiles'> where I'm passing parameters to
the HREF.   I use the ?str part of the HREF in other anchor invocations.
However, the first anchor execution's ?str definition persists, so that the next
anchor executed using ?str has the ?str definition of the first anchor
executed. I'm guessing that str is a global and persists.


I can't tell what you're doing from this description.  Can you show us
some of your code?


Additionally, if I execute the same anchor twice in a
row, changing the data the parameters are composed from between the 2
executions, the parameters do not change on the second execution under
PerlRun but do under CGI.


It sounds like your code has a scoping bug, probably an unintentional
closure.  Can you show us the code where you get the CGI parameters?

- Perrin


Re: Where to store uploads

2007-07-03 Thread Perrin Harkins

On 7/3/07, Clinton Gormley [EMAIL PROTECTED] wrote:

1) In the DB, store the name of the server to which your file has been
   uploaded


I try to avoid files in the DB.  It always ends in tears.


2) Store your upload in a shared partition (eg on a SAN, NFS,
   iSCSI/OCFS2)


That's ok if you need them on every server.  Many applications just
upload a file and process it on one server, so they don't need this.

- Perrin


Re: questions on serving big file & SQL statement parsing

2007-07-03 Thread Perrin Harkins

On 7/2/07, James. L [EMAIL PROTECTED] wrote:

1. SQL statement parsing is mentioned in the doc:
http://perl.apache.org/docs/1.0/guide/performance.html#toc_Eliminating_SQL_Statement_Parsing

i am curious whether it is a general practice (caching the
sql statement in a package variable to avoid parsing) to
do the My::DB thing in a mod_perl app?


No, in most cases it's better to use prepare_cached instead.  It's a
little slower, but it can survive reconnecting the database handle.
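
For illustration (the DSN and table are made up), prepare_cached keeps
the statement handle in the $dbh's own cache, so you get the reuse
without a package-level global:

    use DBI ();

    my $dbh = DBI->connect( 'dbi:mysql:test', 'user', 'pass',
                            { RaiseError => 1 } );

    my $sth = $dbh->prepare_cached('SELECT name FROM users WHERE id = ?');
    $sth->execute(42);
    my ($name) = $sth->fetchrow_array;
    $sth->finish;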

- Perrin


Re: questions on serving big file & SQL statement parsing

2007-07-03 Thread Perrin Harkins

On 7/2/07, Charlie Garrison [EMAIL PROTECTED] wrote:

mod_auth_tkt


That's what I use for this.

- Perrin


Re: Where to store uploads

2007-07-03 Thread Michael Peters
Clinton Gormley wrote:

 I can think of two approaches:

 1) In the DB, store the name of the server to which your file has been
uploaded

Don't do that. The moment you put a file into the database you lose all
of the nice tools that your OS gives you for working with files (grep, ls,
find, etc).  Plus, when you send those files to the client you have to
query and stream them from the database instead of just using the
filesystem.

 2) Store your upload in a shared partition (eg on a SAN, NFS,
iSCSI/OCFS2)

If you need to access those files from multiple machines, then this is
much better than sticking them in the db.

 The advantage I see in the second approach is redundancy, the
 disadvantage is that there will be a slight performance cost.

Why would there be a higher performance cost? I can't imagine that a
shared filesystem would really have much more overhead than a database.

-- 
Michael Peters
Developer
Plus Three, LP


Re: Where to store uploads

2007-07-03 Thread Clinton Gormley
On Tue, 2007-07-03 at 10:26 -0400, Perrin Harkins wrote:
 On 7/3/07, Clinton Gormley [EMAIL PROTECTED] wrote:
  1) In the DB, store the name of the server to which your file has been
 uploaded
 
 I try to avoid files in the DB.  It always ends in tears.

Sorry - I meant, store this in the DB:
 - ID:     1234
 - type:   image/jpeg
 - path:   12/34/1234.jpg
 - server: images1.domain.com

So that your program would construct a URL pointing to the correct
server, or a translation layer would forward the request to the correct
server
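
In other words, something like this (table and column names are
invented; $dbh is an already-connected DBI handle):

    my ( $server, $path ) = $dbh->selectrow_array(
        'SELECT server, path FROM uploads WHERE id = ?', undef, $id );

    my $url = "http://$server/$path";
    # e.g. http://images1.domain.com/12/34/1234.jpg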

 
  2) Store your upload in a shared partition (eg on a SAN, NFS,
 iSCSI/OCFS2)
 
 That's ok if you need them on every server.  Many applications just
 upload a file and process it on one server, so they don't need this.

Sure - I was thinking primarily of image hosting.

thanks

Clint
 
 - Perrin



Re: questions on serving big file & SQL statement parsing

2007-07-03 Thread Perrin Harkins

On 7/3/07, Clinton Gormley [EMAIL PROTECTED] wrote:

However, the Perl code itself has
already been compiled into C and is fast.


It's not a very important distinction, but perl code is not compiled
into C.  It gets compiled to an intermediary format of Perl opcodes,
which you can see with the B:: tools.
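
For example, you can look at what the compiler produces for a one-liner:

    perl -MO=Concise -e 'print 2 + 2'    # dump the opcode tree
    perl -MO=Deparse -e 'print 2 + 2'    # turn the opcodes back into Perl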


 - a proxy front end handles the slow upload of data from the client
 - it forwards it (internal network, thus fast) to your backend mod_perl
   server, which can then process it (eg add it to the DB, resize the
   image, store it where it needs to be stored), and return the data
   to the proxy.


I think this depends on your proxy implementation.  Some will buffer
it up front and some will send it exactly as they receive it, which
may be in chunks.

- Perrin


Re: Where to store uploads

2007-07-03 Thread Clinton Gormley
On Tue, 2007-07-03 at 10:26 -0400, Michael Peters wrote:
 Clinton Gormley wrote:
 
  I can think of two approaches:
 
  1) In the DB, store the name of the server to which your file has been
 uploaded
 
 Don't do that. The moment you put a file into the database you lose all
 of the nice tools that your OS gives you for working with files (grep, ls,
 find, etc).  Plus, when you send those files to the client you have to
 query and stream them from the database instead of just using the
 filesystem.
 


I didn't realise that line was so confusing :)

I didn't mean: stick the file in the DB. 

I meant: stick the file into a directory on a particular machine, and
then put this into the DB:
 - ID:     1234
 - type:   image/jpeg
 - path:   12/34/1234.jpg
 - server: images1.domain.com

So that your program would construct a URL pointing to the correct
server, or a translation layer would forward the request to the correct
server

Clint 



Re: questions on serving big file & SQL statement parsing

2007-07-03 Thread Clinton Gormley
On Tue, 2007-07-03 at 10:37 -0400, Perrin Harkins wrote:
 On 7/3/07, Clinton Gormley [EMAIL PROTECTED] wrote:
  However, the Perl code itself has
  already been compiled into C and is fast.
 
 It's not a very important distinction, but perl code is not compiled
 into C.  It gets compiled to an intermediary format of Perl opcodes,
 which you can see with the B:: tools.

Apparently, those opcodes are C-structs, rather than an intermediate
bytecode:

http://perlmonks.org/?node_id=600141


Clint



Re: questions on serving big file & SQL statement parsing

2007-07-03 Thread Perrin Harkins

On 7/3/07, Clinton Gormley [EMAIL PROTECTED] wrote:

 It's not a very important distinction, but perl code is not compiled
 into C.  It gets compiled to an intermediary format of Perl opcodes,
 which you can see with the B:: tools.

Apparently, those opcodes are C-structs, rather than an intermediate
bytecode:


Sure, they're C data structures because the perl interpreter is
written in C, but your original post makes it sound like they turn
into C code, which presumably would get compiled to machine code like
other C code does.  Again, it's not an important distinction.  I just
didn't want it to sound like mod_perl does something magical that
differs from the normal perl interpreter behavior.

- Perrin


Re: Where to store uploads

2007-07-03 Thread Jonathan Vanasco


On Jul 3, 2007, at 10:38 AM, Clinton Gormley wrote:


I didn't mean: stick the file in the DB.

I meant: stick the file into a directory on a particular machine, and
then put this into the DB:
 - ID:     1234
 - type:   image/jpeg
 - path:   12/34/1234.jpg
 - server: images1.domain.com

So that your program would construct a URL pointing to the correct
server, or a translation layer would forward the request to the  
correct

server


i always do that... metadata in db and file on os

if you expect a large number of files, you should hash the file
name and store in buckets


ie:

    use Digest::MD5 qw(md5_hex);

    my $name = 'file';
    my $hash = md5_hex($name);
    # bucket on the leading hash chars, e.g. $root/ab/cd/file
    my $path = sprintf( "%s/%s/%s/%s", $root,
                        substr($hash, 0, 2), substr($hash, 2, 2), $name );


you can't store by name alone because of character distribution among  
english words and numbers -- you'll end up with buckets that have 20k  
files and others that have 1.  md5 will give you a good distro in 32  
chars ( or bump to a higher base and show it in 16chars !)


if you're doing actual numbers, put in buckets working backwards --  
ie, by the power of 1,10,100 etc, and not reading frontwards.  you'll  
get more even distribution that way.


these are 2 mathematical principles... i can't remember their names.   
but they are good reads if you can find the names.  the irs uses the  
latter one for tax audits.


also keep in mind the os performance with files.  most os's do the  
best at ~1k files per directory; some are good up to 5k


md5 with base16 can give you a 3-char-deep bucket:  \d\d\d = 16*16*16 = 4096 directories
md5 with base32 can give you a 2-char-deep bucket:  \d\d = 32*32 = 1024 directories
md5 with base64 can give you a 2-char-deep bucket:  \d\d = 64*64 = 4096 directories


if only base32 were more common -- that's a good sweet spot.  for
simplicity, i usually do 3 base16 chars.  but 2 base32 might be
better for your os.



// Jonathan Vanasco






Re: questions on serving big file & SQL statement parsing

2007-07-03 Thread Jonathan Vanasco



On Jul 3, 2007, at 4:02 AM, Clinton Gormley wrote:


This last statement I have a quibble about : using mod_perl to handle
file uploads. I may be wrong here, so I'd welcome reasoned  
disagreement,

but the way I understand it:


not all file uploads, but large ones.  anything over 100k i won't put  
onto mp.


 - My image uploads require processing, so I need some form of Perl  
- I

   already have that available in mod_perl.


i'm doing that now.  i'm migrating all the image processing to python  
though.  the image resizing generated by python is way better ( speed  
is slightly better in python, but i'm actually worried about  
quality.  i'm not too happy with the way Imager resizes stuff, too  
many artifacts in resize+jpeg compression.  gd+image magick worked  
poorly re: resources.)


to handle it, i'm doing thumbnails in perl as-created and flagging  
the image in the db with a 'not processed' bool.  a persistent  
process queries the db and resizes the images in python.




 - The mod_perl processes aren't being kept tied up for a long period,
   because the proxy is handling the slow bit: the transfer.


i've had proxies break sending off chunks when i've updated daemons.
it's annoying.



 - The only downside is that a large image upload plus processing
   could cause the size of the process to grow a lot, reducing the
   number of processes that you can run at once.

=snip

 - Keep your upload processing mod_perls on one box, so that if you
   are flooded with uploads, just the upload functionality on your  
site

   is taken out, rather than your entire site


that's my biggest concern.  you could stay on the same box and just
create a separate pool of mp servers for the upload proxies though... any
sort of processing done within mp is at the expense of other mp
servers.  if your page generation time is an avg .08 seconds, and
you just did an image upload + resize that took 3 seconds to process,
you cost yourself a bunch of resources that mp could have handled.
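
e.g. something like this on the front-end proxy (ports and paths are
made up), so upload traffic lands on its own backend pool:

    ProxyPass /upload  http://127.0.0.1:8081/upload
    ProxyPass /        http://127.0.0.1:8080/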


so let me clarify my earlier statement -- you don't want your main
mod_perl app handling the uploads.  if you can push it off onto another
mp process/server, that's ok.  but if it's on your main server, you're
just creating eventual problems -- your authorization and general
content generation will be competing for resources with processes
that can take several seconds.  bad idea.



// Jonathan Vanasco






Re: header issues etc...

2007-07-03 Thread Tyler Bird

Hi,

I jumped into the middle of this thread and it seems I am encountering a 
segfault in the header just like you have described.


Could you give me an overview of your solution since I do not have your 
first emails to this list.


Thanks

Tyler

pubert na wrote:
I fixed it... apparently it does not like the return $self if defined
$self; ...  return $class if ref $class is better form anyway...
thanks and sorry about all the emails ;)


On 7/2/07, pubert na [EMAIL PROTECTED] wrote:


I was able to narrow down the problem I was having, and produce a
test case for you folks. Below are two relatively self-explanatory
files.  If I navigate to test.cgi, it will appear to load the page
properly, but if I hit the reload button a bunch of times in a
row, the error_log will log segfaults.  I pasted a sample at the
bottom of this message.  I'd appreciate any help at all. 
Obviously the issue is coming from CGI.pm's header function, which

*should* be ok with mod_perl2.  Thanks again. --Pubert



##BEGIN FILE test.cgi

#!/usr/bin/perl 



use lib qw(/var/www/cgi-bin/);

use strict;
use CGI;
use testclass;

my $cgi=new CGI;

my $tc = testclass->new({CGI=>$cgi});
$tc->doit();


1;

##BEGIN FILE testclass.pm

package testclass;

my $self;

sub new {

    # This is the constructor.

    my ( $class, $args ) = @_;

    return $self if defined $self;

    $self = {};

    bless $self, $class;

    $self->{CGI} = $args->{CGI};

    return $self;
}


sub doit {

    print $self->{CGI}->header;
    print "hello";

}

1;

Begin error_log snippet
[Mon Jul 02 18:45:34 2007] [notice] SIGHUP received.  Attempting
to restart
[Mon Jul 02 18:45:34 2007] [notice] Digest: generating secret for
digest authentication ...
[Mon Jul 02 18:45:34 2007] [notice] Digest: done
[Mon Jul 02 18:45:34 2007] [notice] Apache/2.2.4 (Unix) DAV/2
mod_apreq2-20051231/2.6.1 mod_perl/2.0.2 Perl/v5.8.8 configured --
resuming normal operations
[Mon Jul 02 18:45:39 2007] [notice] child pid 7925 exit signal
Segmentation fault (11)
[Mon Jul 02 18:45:39 2007] [notice] child pid 7926 exit signal
Segmentation fault (11)
[Mon Jul 02 18:45:39 2007] [notice] child pid 7927 exit signal
Segmentation fault (11)
[Mon Jul 02 18:45:40 2007] [notice] child pid 7928 exit signal
Segmentation fault (11)
[Mon Jul 02 18:45:40 2007] [notice] child pid 7929 exit signal
Segmentation fault (11)
[Mon Jul 02 18:45:40 2007] [notice] child pid 7931 exit signal
Segmentation fault (11)






Re: header issues etc...

2007-07-03 Thread Perrin Harkins

On 7/2/07, pubert na [EMAIL PROTECTED] wrote:

my $self;

[...]

 sub doit{

print $self->{CGI}->header;
print "hello";

}


This is bad.  You're using a variable that you didn't pass to doit().
That means you're creating a closure.  The sub will remember the $self
it saw when you first ran it, and it will never notice that $self
changes later.  You need to pass $self to this sub as a parameter in
order to make this work.
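
In other words, something like this (continuing the test case above):

    # make doit() a method and use the $self that is passed in,
    # instead of relying on the file-scoped variable
    sub doit {
        my $self = shift;
        print $self->{CGI}->header;
        print "hello";
    }

    # called as a method, so $self arrives as the first argument:
    $tc->doit();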

- Perrin


Re: Trying to install mod_perl 2.0 on SUSE Linux 2.6 (Dual Processor)

2007-07-03 Thread Rafael Caceres

On Tue, 2007-07-03 at 09:43 -0400, David Weintraub wrote:
 I am attempting to install mod_perl on a SUSE Linux 2.6.6-7 dual
 processor machine. It already has Apache 2.0.49 and Perl 5.8.3
 installed. In fact, it already had mod_perl 1.x installed, but I decided
 to install mod_perl 2.0.3.
 
The perl included with SLES is threaded; you probably want to build your
own perl. As for Apache 2, my advice is to also build it from source
yourself.

 I copied  mod_perl.so to /usr/lib64/apache2/mod_perl.so and made a
 symbolic link to /usr/lib64/apache2-prefork/mod_perl.so. I modified
 /etc/sysconfig/apache2 so that APACHE_MODULES includes perl.
 
SLES does not include/have the Apache2 source, so how did you build
mod_perl.so?

 I was able to successfully install Mason 1.36 and mod_perl 2.0.3, I
 tried to go through the documentation, and found out about the renaming.
 However, when I restarted Apache, I got the error that
 /etc/apache2/mod_perl-startup.pl failed.
 
 I eliminated the use Apache2 (); statement and changed all instances
 of Apache to Apache2 (which I believe I was suppose to do). I then
 had problems with the ENV statement, changed it to look for
 $ENV(MOD_PERL), and tried Apache again, but failed because
 $ENV{MOD_PERL} is not defined.
 
 Here's the /etc/apache2/mod_perl-startup.pl I am using:
 
 
 $ENV{MOD_PERL} =~ /^CGI-Perl/ or die "MOD_PERL not used!";
 
 #use Apache2 ();
 
 use lib qw(/srv/www/perl-lib);
 
 # enable if the mod_perl 1.0 compatibility is needed
 # use Apache::compat ();
 
 use ModPerl::Util (); #for CORE::GLOBAL::exit
 
 #use ModPerl::RequestRec ();
 #use ModPerl::RequestIO ();
 #use ModPerl::RequestUtil ();
 use Apache2::RequestRec ();
 use Apache2::RequestIO ();
 use Apache2::RequestUtil ();
 
 #use Apache2::Server ();
 use Apache2::ServerUtil ();
 use Apache2::Connection ();
 use Apache2::Log ();
 
 use APR::Table ();
 
 use ModPerl::Registry ();
 
 use Apache2::Const -compile => ':common';
 use APR::Const -compile => ':common';
 
 1;
 
 Not even too sure what else I need to add. In fact, I am not even too
 sure how Apache starts up. There's two scripts /etc/init.d/apache and
 /etc/init.d/apache2, but there is nothing in /etc/init.d/rc.3 that calls
 either of those scripts.
 
 --
 David Weintraub
 [EMAIL PROTECTED] 

Regards,
Rafael Caceres



Re: passing CGI parameters

2007-07-03 Thread CraigT

Perrin,

Thanks for responding.  I'm not sure why it posted four times.   I'm new
here.   I posted the original request Sunday morning, I think.   I didn't
get any responses, so early this morning I registered as 'cliff'
with Perl Monks and made a similar request.

Clinton responded and worked with me for about an hour in the chatterbox.   
I put a lot of stuff in my (cliff) scratchpad like the ENV values, the
relevant Apache httpd entries, a dump, the Apache error log, and code
examples.   Would it be possible for you to review the stuff I put there?

Clinton didn't spot anything.  But the MOD_PERL ENV variable is not getting
set, and this variable controls the release of globals at the beginning of
CGI.pm.   I don't know how this variable gets established, but I think it's
an important clue.

My environment is Windows XP Home/Apache 2/Perl 5.8/mod_perl 2.   I can
execute my application under PerlRun (perl-run).  It executes.   But when I
click on anchors that pass CGI parameters using the same ?str variable to
pass the parameters, the parameter values passed in ?str when I execute the
first anchor (regardless of which one I execute first) persist.   These
values get passed to subsequent anchor executions passing the parameter
values in ?str.  The initial value of ?str persists until I recycle the
server.
The application does execute and operate correctly under CGI.

The 2 URLs for these execution environments are:
http://steepusa.no-ip.info/cgi-bin/m3.cgi
http://steepusa.no-ip.info/perl-run/m3.cgi 

I use strict everywhere and all Perl variables are declared with my.   I
have verified that the values of the variables I assign to ?str in the
anchors are correct in all anchor executions.

Thanks again Perrin.   I appreciate any help.

CraigT (or Cliff in Perl Monks)



Perrin Harkins wrote:
 
 Hi,
 
 There's no need to post your question four times.
 
 I'm trying to bring my application up using ModPerl::PerlRun.  I have
 anchors at places in my code like <A CLASS='l2'
 HREF='vumenu.cgi?str=$govlevel~$vufiles'> where I'm passing parameters to
 the HREF.   I use the ?str part of the HREF in other anchor invocations.
 However, the first anchor execution's ?str definition persists, so that the
 next
 anchor executed using ?str has the ?str definition of the first anchor
 executed. I'm guessing that str is a global and persists.
 
 I can't tell what you're doing from this description.  Can you show us
 some of your code?
 
 Additionally, if I execute the same anchor twice in a
 row, changing the data the parameters are composed from between the 2
 executions, the parameters do not change on the second execution under
 PerlRun but do under CGI.
 
 It sounds like your code has a scoping bug, probably an unintentional
 closure.  Can you show us the code where you get the CGI parameters?
 
 - Perrin
 
 

-- 
View this message in context: 
http://www.nabble.com/passing-CGI-paramters-tf4008753.html#a11416640
Sent from the mod_perl - General mailing list archive at Nabble.com.



Re: header issues etc...

2007-07-03 Thread Perrin Harkins

[ Please keep it on the list ]

On 7/3/07, pubert na [EMAIL PROTECTED] wrote:

The app I'm working with uses this as a method for object b to retrieve the
instance of object a, which created it.

i.e.  An object x creates 4 objects, a, b, c, and d, then calls a method in
object a.  Object a needs a reference to object b so it calls
$foo=CLASSX::getInstance() which returns $self.  Object a can then call
$foo->getObjectB and it has what it wants.


I don't really understand this description.  If you're trying to code
a singleton pattern, use global variables to hold the object.  That
makes it clearer what your intent is.


Apparently this does not work at all under mod_perl.  The only fix I can
think of is passing the instantiating object as a parameter when the object
is created.  What do you guys usually do?


Scoping works the same as usual under mod_perl.  If you need access to
object instances, you can use a singleton pattern, storing the objects
in global variables, or you can pass the instances to the sub that
needs to use them.
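
A bare-bones sketch of the global-variable flavour (all names invented):

    package My::Registry;

    use strict;
    our $Instance;                      # package global, persists in this child

    sub instance {
        my $class = shift;
        $Instance ||= bless {}, $class;
        return $Instance;
    }

    1;

    # elsewhere:
    #   my $reg = My::Registry->instance;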

- Perrin


Re: passing CGI parameters

2007-07-03 Thread Perrin Harkins

On 7/3/07, CraigT [EMAIL PROTECTED] wrote:

I put a lot of stuff in my (cliff) scratchpad like the ENV values, the
relevant Apache httpd entries, a dump, the Apache error log, and code
examples.   Would it be possible for you to review the stuff I put there?


Maybe later.  Do you have a link for it?


Clinton didn't spot anything.  But the MOD_PERL ENV variable is not getting
set, and this variable controls the release of globals at the beginning of
CGI.pm.   I don't know how this variable gets established, but I think it's
an important clue.


If $ENV{'MOD_PERL'} is empty, you are running under CGI, not mod_perl.
Maybe you have a typo in your code that checks it?

- Perrin


Re: header issues etc...

2007-07-03 Thread Jonathan Vanasco


On Jul 3, 2007, at 5:51 PM, Perrin Harkins wrote:


I don't really understand this description.  If you're trying to code
a singleton pattern, use global variables to hold the object.  That
makes it clearer what your intent is.



Scoping works the same as usual under mod_perl.  If you need access to
object instances, you can use a singleton pattern, storing the objects
in global variables, or you can pass the instances to the sub that
needs to use them.


i prefer storing them as class variables and using a public method to  
provide access


ie:

    package myfactory;
    my $object = object->new();
    sub get_object { return $object; }

    my %objects = (
        'a' => object->new(),
    );
    sub get_object_hash { my ( $class, $flavor ) = @_; return $objects{$flavor}; }


    package myapp;
    my $object   = myfactory->get_object();
    my $object_a = myfactory->get_object('a');


i can't remember if the $class is necessary or not.  i'm responding  
via my mobile :)




Re: passing CGI parameters

2007-07-03 Thread Perrin Harkins

[Please keep it on the list]

On 7/3/07, Craig Tussey [EMAIL PROTECTED] wrote:

Thanks again for responding.  Here is the link to my
scratchpad.   Keep in mind that I was making entries
to the scratchpad in response to Clintons questions.

http://www.perlmonks.org/?viewmode=public;node_id=624649


Okay, you really aren't showing much code here.  From what you did
show, I'm guessing you're doing this:

my $govlevel = $page->param("str");
print_html();

sub print_html {
   print "blah blah $govlevel blah blah";
}

That doesn't work because the sub becomes a closure and never notices
when you create new versions of $govlevel.  The fix is to pass
$govlevel to the sub as a parameter.
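
i.e. something like:

    my $govlevel = $page->param("str");
    print_html($govlevel);

    sub print_html {
        my ($govlevel) = @_;
        print "blah blah $govlevel blah blah";
    }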

- Perrin