Re: timing downloads with mod_perl

2002-11-18 Thread Alvar Freude

-- Nick [EMAIL PROTECTED] wrote:

>  I would like to log the time it takes users to download
>  items from my site.  Has anyone had any success
>  writing a mod_perl program to accomplish this?

Create two handlers:

  - one as a PerlInitHandler
  - the second as a PerlLogHandler

(or similar)


Use Time::HiRes to get the time in high resolution.

In the InitHandler, record the time the request starts.

In the LogHandler, compute how long the request took.


But normally the server's OS buffers the output, so this is only a rough
hint; it is most meaningful for big files, or for the time the request
itself took.
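
A minimal sketch of this (untested; the module and sub names are just
placeholders I made up):

  package My::Timer;
  use strict;
  use Apache::Constants qw(OK DECLINED);
  use Time::HiRes qw(gettimeofday tv_interval);

  sub init_handler {
      my $r = shift;
      # remember when the request started
      $r->pnotes(start_time => [gettimeofday]);
      return OK;
  }

  sub log_handler {
      my $r = shift;
      my $start = $r->pnotes('start_time') or return DECLINED;
      $r->log_error(sprintf "%s took %.4f seconds",
                    $r->uri, tv_interval($start));
      return OK;
  }

  1;

and in httpd.conf:

  PerlInitHandler My::Timer::init_handler
  PerlLogHandler  My::Timer::log_handler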



For complete HTML pages you can also measure on the client side with some
JavaScript ...

HTH,
  Alvar


-- 
** Alvar C.H. Freude
** http://alvar.a-blast.org/   <== NEW!
**
** http://odem.org/




Re: eating memory on FreeBSD

2002-06-05 Thread Alvar Freude

Hi,

-- Doug MacEachern [EMAIL PROTECTED] wrote:

>  dso should be fine with 1.26 or 1.27, provided you are using Perl 5.6.1
>  or higher.  5.005_03 still has leakage.

No -- as mentioned on the list some months ago, on FreeBSD with mod_perl 1.26
as DSO and Perl 5.6.1 there are big leaks: all the mod_perl memory that should
be shared between the forked children seems to be allocated anew (and
additionally) on every restart (including graceful), and twice at startup.

Tested with:

  - Manually built Perl 5.6.1 from source, mod_perl 1.26 from source as
    DSO, Apache from the ports.

  - Perl 5.6.1 from ports, mod_perl 1.26 (DSO) from ports, Apache from ports.

  - Perl 5.005... from the FreeBSD base system, and everything from ports.


All with different options: threads off, usemymalloc off and on, ...


I haven't found any configuration that works without eating, on every
(re)start, all the memory that should be shared.


If I can help fix this bug, I will! :-)


Ciao
  Alvar

-- 
// Stop Censorship! http://www.odem.org/informationsfreiheit/en/
// Filter your internet http://www.odem.org/insert_coin/imkp2001.html
// Blast!   http://www.a-blast.org/
// Alvar:   http://alvar.a-blast.org/





Re: mod_perl netscape problem

2002-06-04 Thread Alvar Freude

Hi,

-- Iyengar Yoga Resources [EMAIL PROTECTED] wrote:

>  Pages that go through the mod_perl server have 'strange' strings at the
>  top and the bottom, when you view them with Netscape 4.x (any OS, it
>  seems). Also, the page does not stop loading. This does not happen with
>  other browsers (Mozilla 1RC2, IE 5, Opera 6). An example can be found at

Besides the already-described proxy bug, I noticed another bug in Netscape:


If you don't send a Content-Length header, Netscape 4.x destroys the page
if it contains JavaScript (at least with document.write).

Including Content-Length in the headers fixes this.
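
Roughly like this (a sketch; build_page() stands in for whatever generates
your page):

  my $html = build_page($r);     # hypothetical: generate the complete page
  $r->content_type('text/html');
  $r->header_out('Content-Length', length $html);
  $r->send_http_header;
  $r->print($html) unless $r->header_only;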


Ciao
  Alvar

-- 
// Sign!http://www.odem.org/informationsfreiheit/
// Internet by phone:   http://www.teletrust.info/
// The freest medium?   http://www.odem.org/insert_coin/
// Blaster: http://www.assoziations-blaster.de/




eating memory ... // RE: Porting to OS X

2002-06-04 Thread Alvar Freude

Hi,

-- Michael Robinton [EMAIL PROTECTED] wrote:

>  application, which runs on an aging 486 with 64 megs in our shop and uses
>  about 4 megs including mod_perl enhanced apache, took 40 megs on OSX and
>  was very slow. This was on a G4 with 500 megs of memory.

It's probably the same as on FreeBSD: with a DSO mod_perl, each restart
(apachectl graceful or apachectl restart) eats all the memory your mod_perl
modules use. Try building it statically; at least on FreeBSD that helps,
and OS X is FreeBSD-based ... :-)
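
For reference, the usual static build goes roughly like this (the paths are
only examples, adjust them to your source tree):

  cd mod_perl-1.26
  perl Makefile.PL APACHE_SRC=../apache_1.3.23/src \
      DO_HTTPD=1 USE_APACI=1 EVERYTHING=1
  make && make install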

My newest test was with Apache 1.3.23 and mod_perl 1.26, though; perhaps
it's fixed in 1.27?!

Nevertheless, 4 MB is very small: my frontend Apache 1.3.23 without
mod_perl takes 3 MB, and my frontend Apache 2.0.36 on a development system
takes 4 MB without mod_perl ...


Ciao
  Alvar

-- 
// Sign!http://www.odem.org/informationsfreiheit/
// Internet by phone:   http://www.teletrust.info/
// The freest medium?   http://www.odem.org/insert_coin/
// Blaster: http://www.assoziations-blaster.de/




Re: mod_perl netscape problem

2002-06-03 Thread Alvar Freude

Hi,

-- Iyengar Yoga Resources [EMAIL PROTECTED] wrote:

>  Pages that go through the mod_perl server have 'strange' strings at the
>  top and the bottom, when you view them with Netscape 4.x (any OS, it
>  seems). Also, the page does not stop loading. This does not happen with
>  other browsers (Mozilla 1RC2, IE 5, Opera 6). An example can be found at

Besides the already-described proxy bug, I noticed the following bug in
Netscape:


If you don't send a Content-Length header, Netscape 4.x destroys the page
if it contains JavaScript (at least with document.write): the document.write
output lands in a nearly random part of the HTML file(!) and often breaks
the code.

Including Content-Length in the headers fixes this.


Ciao
  Alvar

-- 
// Sign!http://www.odem.org/informationsfreiheit/
// Internet by phone:   http://www.teletrust.info/
// The freest medium?   http://www.odem.org/insert_coin/
// Blaster: http://www.assoziations-blaster.de/




Re: rfc Apache::Dynagzip

2002-06-03 Thread Alvar Freude

Hello,

-- Slava Bizyayev [EMAIL PROTECTED] wrote:

>  Finally, I'm going to upload the Apache::Dynagzip 0.03 to CPAN by the
>  evening of June 4-th. I'm not sure about the quality of my documentation
>  yet.

The documentation looks very complete :-)


Do you also provide an interface as a replacement for $r->print("Bla foobar")?
This might be useful for mod_perl applications that want full control over
the output and no second output handler ...



[from the documentation:]

>  This type of compression is applied when the client is recognized as
>  being able to decompress gzip format on the fly. In this version the
>  decision is under the control of whether the client sends the
>  Accept-Encoding: gzip HTTP header, or not. (Please, let me know if you
>  have better idea about that...)

Over the last few years I have experimented a lot with gzipped output and
wrote my own gzipping output function (but without chunked encoding :-) ).
I ran into a lot of bugs with different browsers in certain environments or
situations. I found no remarks about this in your documentation, so perhaps
you missed something?
(If I missed something, feel free to write ;-) )

For example, I created a website with a big server-generated Flash movie
(with Flash generator, mod_perl, Postgres; yep, Flash sucks!), and Netscape
doesn't decompress the output, although it claims to understand gzip. AARRGH!

Compressed output also does not work in IE if the request was POSTed (only
tested with an older version of IE, I guess 4.0).


So here is the part of my output function with the gzip stuff:


=item gzip encoding in output

gzip compression is enabled when:

   - the browser accepts gzip
   - or the browser is a Bugscape 4.x on X11 or Mac


but gzip is DISABLED when:

   - the method is not GET:
     at least on some versions of IE (4.x), compressed pages didn't work
     together with POST!

   - the content type is image/* -- images need no gzip (at least GIF
     and JPEG ...)

   - the browser is not compatible and the Content-Type is not text/html,
     even if the browser said it understands gzip.
     At least Bugscape 4.x didn't decompress embedded objects like
     Flash movies.

   - the flag $out->{nogzip} is true


 (AUTHOR: Alvar Freude, [EMAIL PROTECTED], http://alvar.a-blast.org/)

=cut
   
   
   # Important: we vary the output
   # according to these headers
   $r->header_out('Vary', 'Accept-Encoding, User-Agent');

   # Cache this header line ...
   my $encoding = $r->header_in("Accept-Encoding");

   # Let's start the if monster ...

   if (!$out->{nogzip} &&               # only without the nogzip flag
       ($r->method() eq 'GET') &&       # and with the GET method
       ($browser =~ /compatible/i ?     # is it a "compatible" browser?
          $content_type !~ /^image/ :   #   then compress unless images
          $content_type eq "text/html"  #   else only gzip text/html
       ) &&                             # and does the browser understand gzip?
                                        # Version 1: it said yes
       ((defined($encoding) && index($encoding, "gzip") >= 0) ||
        ($browser =~ m{                 # Version 2:
             ^Mozilla/                  # it is a Mozilla ...
             (\d+)                      # ... major version (must be 4)
             \.
             \d+
             [\s\[\]\w\-]+
             (
              \(X11 |                   # on X11
              Macint.+PPC,\sNav         # or Mac/PPC
             )
            }x && $1 && $1 == 4 && $browser !~ /compatible/i)
       ))
   {
       $r->content_encoding('gzip');    # compress it with gzip
       ${ $out->{document} } = Compress::Zlib::memGzip($out->{document});
   }

   # Don't forget Content-Length!
   # It enables keepalive and works around
   # some Netscape bugs.
   $r->header_out('Content-Length', length(${ $out->{document} }));

   $r->send_http_header;                # now we can send the headers ...

   # ... and the data, if needed
   $r->print(${ $out->{document} }) unless $r->header_only;
   $r->rflush();                        # flush everything, so all data is
                                        # sent out before additional stuff
                                        # like mail delivery runs


Ciao
  Alvar



-- 
// Sign!http://www.odem.org/informationsfreiheit/
// Internet by phone:   http://www.teletrust.info/
// The freest medium?   http://www.odem.org/insert_coin/
// Blaster: http://www.assoziations-blaster.de/




Again DSO-mod_perl and leaking on restart: FreeBSD

2001-10-25 Thread Alvar Freude

Hi,

Some weeks ago there was a discussion here about DSO mod_perl and leaking;
as far as I understood it, the conclusion was that on SOME platforms
(Solaris) mod_perl should be built statically.


It seems that FreeBSD is such a platform ;-)


I built Apache from the Ports Collection with mod_ssl --
/usr/ports/www/apache13-modssl -- (with 1.3.20 and 1.3.22), Perl 5.6.1
manually (NO mymalloc, no compatibility; also tested with mymalloc), and
mod_perl 1.26 as DSO with APXS.


My mod_perl application has a global, resident hash of about 18 MB, which
is built in the startup script.

At normal startup mod_perl seems to require TWICE the size of this hash,
and each restart (graceful too) eats another ~18 MB. 1 GB fills up fast ;-)
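
If anyone wants to verify this, something like the following (a sketch
using the GTop module, assuming libgtop is installed) run in a child after
each restart shows whether the memory is really unshared:

  use GTop ();

  my $gtop = GTop->new;
  my $mem  = $gtop->proc_mem($$);   # memory stats of this process
  printf "size: %d  shared: %d  rss: %d\n",
         $mem->size, $mem->share, $mem->rss;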


The mod_perl from the Ports Collection is also built as DSO and has the
same problem (with the old Perl replaced by 5.6.1).


Does anyone have mod_perl running on FreeBSD as DSO without leaking? If it
isn't possible, it might be good to include a warning in the Makefiles,
the mod_perl Guide, etc.

I prefer to build Apache from the Ports Collection, so next I'll try to
create a port with a static mod_perl ...


Ciao
  Alvar


PS:

# perl -V
Summary of my perl5 (revision 5.0 version 6 subversion 1) configuration:
  Platform:
osname=freebsd, osvers=4.4-stable, archname=i386-freebsd
uname='freebsd gnarzelwicht.delirium-arts.de 4.4-stable freebsd
4.4-stable #8: sat oct 6 13:55:41 cest 2001
[EMAIL PROTECTED]:/usr/obj/usr/src/sys/mykernel2 i386 '
config_args=''
hint=recommended, useposix=true, d_sigaction=define
usethreads=undef use5005threads=undef useithreads=undef
usemultiplicity=undef
useperlio=undef d_sfio=undef uselargefiles=define usesocks=undef
use64bitint=undef use64bitall=undef uselongdouble=undef
  Compiler:
cc='cc', ccflags ='-fno-strict-aliasing -I/usr/local/include',
optimize='-O3 -march=k6 -funroll-loops -fexpensive-optimizations
-malign-double',
cppflags='-fno-strict-aliasing -I/usr/local/include'
ccversion='', gccversion='2.95.3 20010315 (release) [FreeBSD]',
gccosandvers=''
intsize=4, longsize=4, ptrsize=4, doublesize=8, byteorder=1234
d_longlong=define, longlongsize=8, d_longdbl=define, longdblsize=12
ivtype='long', ivsize=4, nvtype='double', nvsize=8, Off_t='off_t',
lseeksize=8
alignbytes=8, usemymalloc=n, prototype=define
  Linker and Libraries:
ld='cc', ldflags ='-Wl,-E  -L/usr/local/lib'
libpth=/usr/lib /usr/local/lib
libs=-lgdbm -ldb -lm -lc -lcrypt -liconv -lutil
perllibs=-lm -lc -lcrypt -liconv -lutil
libc=, so=so, useshrplib=false, libperl=libperl.a
  Dynamic Linking:
dlsrc=dl_dlopen.xs, dlext=so, d_dlsymun=undef, ccdlflags=' '
cccdlflags='-DPIC -fpic', lddlflags='-shared  -L/usr/local/lib'
 
 
Characteristics of this binary (from libperl):
  Compile-time options: USE_LARGE_FILES
  Built under freebsd
  Compiled at Oct 23 2001 17:37:29
  @INC:
/usr/lib/perl5/5.6.1/i386-freebsd
/usr/lib/perl5/5.6.1
/usr/lib/perl5/site_perl/5.6.1/i386-freebsd
/usr/lib/perl5/site_perl/5.6.1
/usr/lib/perl5/site_perl
.




Re: Again DSO-mod_perl and leaking on restart: FreeBSD

2001-10-25 Thread Alvar Freude

Hi,

>  It would be interesting to hear whether you have the same results
>  with the stock Perl version distributed with FreeBSD (built without
>  PERL_THREADED defined, since that experimental stuff was removed from
>  FreeBSD with good reason).

The results above are with Perl 5.6.1, and my Perl isn't threaded :)


>  That's what the p5-apache port did.
>  I added the mod_perl port
>  _specifically_ to make mod_perl available to FreeBSD admins dynamically.

But it's built as DSO, and I guess that's what causes the trouble.

I prefer DSO and ports too, but I guess DSO is the problem.

Also, the standard Perl on FreeBSD is 5.005 -- which really leaks memory
and is not up to date ;-) -- so I want to use 5.6.1. On my test machine I
replaced it with 5.6.1; on the production server I installed a second Perl,
because replacing the standard Perl version turned out not to be the best
idea ...
 

Ciao
  Alvar




Re: Again DSO-mod_perl and leaking on restart: FreeBSD

2001-10-25 Thread Alvar Freude

Hi,

>  It kinda feels like you didn't absorb any of the contents of my reply.
>  :-)

Ehm, oops, sorry -- perhaps I misunderstood you ;)
(oh yes, my English is bad :( )

 
>  Let me try a different approach.  What do you want me to do?

It was mainly a message to the mod_perl mailing list, CC'd to you because
you maintain the ports and may have heard about the problem.

About two months ago there was a discussion on the list about building
mod_perl as DSO, and the conclusion was that on some platforms it leaks
memory when built as DSO.



Ciao
  Alvar




Re: Slow $r->print

2001-02-12 Thread Alvar Freude


Erdmut Pfeifer wrote:
 
>  do you get a high CPU load during those 4 seconds?

NO! Even if I request 10 documents at the same time, there is no
significant load (except the Postgres load and some DBI stuff at the
beginning, before printing).


hmmm!


:(


Ciao
  Alvar



Re: Slow $r->print

2001-02-12 Thread Alvar Freude

Hi,

Michael Bacarella wrote:
 
>  Hmmm, slow name resolution?

No, because the time depends on the length of the document!


Ciao
  Alvar

 
>  When I get really stumped, I whip out strace/ktrace.
>
>  I don't know how I used to get along without it. It's time consuming,
>  but 95% of the time it will tell you exactly what you need to know.

Hmmm, does that work with mod_perl?


Ciao
  Alvar



Slow $r->print

2001-02-11 Thread Alvar Freude

Hi,

I noticed that the output of (my) mod_perl scripts is very slow when the
texts are longer.

I use the following Code on a Linux machine (PII 350, 320 MB), Apache
1.3.12, mod_perl 1.23:


  $r->header_out('Content-Length', length($$textref));
  $r->header_out('Connection', 'close');
  $r->send_http_header;

  $r->print($$textref) unless $r->header_only;
  $r->rflush();


For a 200 KByte text this takes about 4 seconds from localhost, while the
same content served as CGI or plain HTML is as fast as expected, so this
seems to be a mod_perl problem.

If the output is gzip-compressed, it is much faster (also from localhost)
than uncompressed, despite the extra time spent on compression.


You can check it here:
http://www.assoziations-blaster.de:7000/forum/forum-list_0.html

(compressed, if browser supports it).


The time to compute the whole tree is about 0.25 seconds; the rest is
mod_perl's print ...


Does anyone have an idea?



Thanks and Ciao
  Alvar


-- 
Alvar C.H. Freude  |  [EMAIL PROTECTED]

Demo: http://www.online-demonstration.org/  |  Join in!
Blast-DE: http://www.assoziations-blaster.de/   |  Blast-Dich-Fit
Blast-EN: http://www.a-blast.org/   |  Blast/english



Re: Slow $r->print

2001-02-11 Thread Alvar Freude

Hi,


Matt Sergeant wrote:
 
>    $r->header_out('Content-Length', length($$textref));
>    $r->header_out('Connection', 'close');
>    $r->send_http_header;
>
>    $r->print($$textref) unless $r->header_only;
>
>  FWIW, you can pass in just $textref and print does the wrong thing. Err, I
>  mean right thing... well that's another debate :-)

Oops -- I had in mind that this doesn't work -- but perhaps that was an
older version ;)

 
>    $r->rflush();
>
>  Why do you feel the need to flush?

There is more stuff like logging in my module, so all buffers should be
sent out before it finishes logging, sends mails (in future versions), or
whatever else.


Ciao
  Alvar


-- 
Alvar C.H. Freude  |  [EMAIL PROTECTED]

Demo: http://www.online-demonstration.org/  |  Join in!
Blast-DE: http://www.assoziations-blaster.de/   |  Blast-Dich-Fit
Blast-EN: http://www.a-blast.org/   |  Blast/english



u-1-20.karlsruhe.ipdial.viaginterkom.de, Re: Slow $r->print

2001-02-11 Thread Alvar Freude


 # tail debug/2001_02_12.log 

62.180.20.1 - - [12/Feb/2001:00:58:01 +0100] "" 200
3262 "-" "-" 1
62.180.20.1 - - [12/Feb/2001:00:58:30 +0100] "HEAD
/forum/forum-list_0.html HTTP/1.0" 404 0 "-" "-" 4
[...]

62.180.20.1 == u-1-20.karlsruhe.ipdial.viaginterkom.de


Hmmm ... Nice ...


Ciao
  Alvar

PS:
The URL was posted only here ... Are such attempts normal?



Re: Slow $r->print

2001-02-11 Thread Alvar Freude



Erdmut Pfeifer wrote:
 
>  I just tried it a couple of times with wget. I always got something
>  between 150-190KB/sec -- doesn't seem too slow to me :)
>
>  $ wget http://www.assoziations-blaster.de:7000/forum/forum-list_0.html
>  --01:15:32--  http://www.assoziations-blaster.de:7000/forum/forum-list_0.html
>         => `forum-list_0.html'
>  Connecting to www.assoziations-blaster.de:7000... connected!
>  HTTP request sent, awaiting response... 200 OK
>  Length: 182,296 [text/html]
>
>      0K - .. .. .. .. .. [ 28%]
>     50K - .. .. .. .. .. [ 56%]
>    100K - .. .. .. .. .. [ 84%]
>    150K - .. ..  [100%]

Yes -- but it seems that wget waits for the response before it starts
counting ;-)

200K is OK, 2 MBit line.



Apache Bench:

 # ab http://www.assoziations-blaster.de:7000/forum/forum-list_0.html
This is ApacheBench, Version 1.3c $Revision: 1.38 $ apache-1.3
Copyright (c) 1996 Adam Twiss, Zeus Technology Ltd,
http://www.zeustech.net/
Copyright (c) 1998-1999 The Apache Group, http://www.apache.org/

Server Software:Apache/1.3.12
Server Hostname:www.assoziations-blaster.de
Server Port:7000

Document Path:  /forum/forum-list_0.html
Document Length:182296 bytes

Concurrency Level:  1
Time taken for tests:   3.266 seconds
Complete requests:  1
Failed requests:0
Total transferred:  182634 bytes
HTML transferred:   182296 bytes
Requests per second:0.31
Transfer rate:  55.92 kb/s received



So, hmmm, it looks like the output only starts after ~4 seconds ...


If I print only "hello world" but still build the same HTML, the time
is about 0.025 seconds! Very strange!


Ciao
  Alvar



Re: u-1-20.karlsruhe.ipdial.viaginterkom.de, Re: Slow $r->print

2001-02-11 Thread Alvar Freude



Robin Berjon wrote:
  
>  That looks like someone testing your problem on telnet and getting it wrong
>  the first time. Probably not a cause for worry.

Oh yes, sorry!

I'm a little bit paranoid because I've had some cracking attempts
recently :(


Ciao 
  Alvar


-- 
Alvar C.H. Freude  |  [EMAIL PROTECTED]

Demo: http://www.online-demonstration.org/  |  Join in!
Blast-DE: http://www.assoziations-blaster.de/   |  Blast-Dich-Fit
Blast-EN: http://www.a-blast.org/   |  Blast/english



site using mod_perl

2000-10-23 Thread Alvar Freude

Hi,

I use mod_perl on http://www.a-blast.org/ and want to suggest it for
http://perl.apache.org/sites.html

It is a "truly interactive text network", written completely in
mod_perl. For a quick, non-technical overview have a look on
http://www.assoziations-blaster.de/prixars/ (its in english on our old
domain).

About one year ago it ran on M$ IIS with ActivePerl and some PHP; in the
meantime it has been completely rewritten as an Apache module, using MySQL
as the database. With this, I sped up the execution time from ~3 seconds
to ~10 milliseconds per Blast page (OK, OK, the old machine had really bad
hardware; now we use only a semi-bad one: Pentium II 350, 320 MB RAM with
soft RAID 0 under Linux).


The blast engine inserts the links into the texts in real time; the
statistics are also created in real time:

  http://www.a-blast.org/statistics/
  http://www.assoziations-blaster.de/statistik/ 
(german, with much more traffic)


The Blaster gains speed by keeping the complete keyword list in memory
(more than 5 MB for the German version); for the non-linear real-time
linker I use a ~50-line regexp ;-)
The HTML files are compressed on the fly with Compress::Zlib, so we keep
bandwidth (and transmission time to the users) small.
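
Just to illustrate the principle (this is NOT the real ~50-line regexp,
only a toy version, and the /blast/ URL scheme is made up): the keyword
list stays in memory, and a single substitution links every known keyword.

  # %keywords is loaded once at server startup
  my $alternation = join '|',
                    map  { quotemeta }
                    sort { length $b <=> length $a }   # longest match first
                    keys %keywords;
  my $keyword_re = qr/\b($alternation)\b/;

  # turn every known keyword into a link
  $html =~ s{$keyword_re}{<a href="/blast/$1">$1</a>}g;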

So, hmmm, if you have any (technical?) questions, feel free to ask ;)


The German version has about 5000 visitors and 2 page views per day,
which is very good for a non-profit net.art project in Germany.


Ciao
  Alvar

-- 
Alvar C.H. Freude  |  [EMAIL PROTECTED]

Demo: http://www.online-demonstration.org/  |  Join in!
Blast-DE: http://www.assoziations-blaster.de/   |  Blast-Dich-Fit
Blast-EN: http://www.a-blast.org/   |  Blast/english



Re: Filtering HTML files with mod_proxy and mod_perl

2000-07-19 Thread Alvar Freude

Hi Wim,

>  I've created something like this.
>
>  I've attached the script I used to build mod_proxy and mod_perl, and a short
>  Apache::MyFilter to show how to use this.  Note: I've cut down the handler from
>  my version without really testing it, so it may have a couple syntax errors.

Thanks!
But ... I think it doesn't work in my case, because I have to change the
HTML content itself.

Or do you get the plain HTML content of the final HTTP request somewhere?
If so, that part is missing from the example! ;)


Ciao
  Alvar

-- 
Alvar C.H. Freude  |  [EMAIL PROTECTED]

Demo: http://www.online-demonstration.org/  |  Join in!
Blast-DE: http://www.assoziations-blaster.de/   |  Blast-Dich-Fit
Blast-EN: http://www.a-blast.org/   |  Blast/english



Re: Filtering HTML files with mod_proxy and mod_perl

2000-07-19 Thread Alvar Freude

Hi,

>  This is what mod_proxy does on its own, no mod_perl needed.

Including filtering?

 
>  If you wanted to do it in "pure" mod_perl (no mod_proxy), write a
>  TransHandler similar to the ones listed in chapter 7 of the Eagle book,
>  pp 368 - 381 (pp 372 - 373, for example, is an anonymizing proxy, and
>  pp 374 - 381 is an ad-blocking proxy). This chapter is available on the
>  web in its entirety at http://www.modperl.com/book/chapters/ch7.html.
>
>  Pretty simple, all in all.

Yes, I think this is EXACTLY what I am searching for!

Thanks!


Ciao
  Alvar

-- 
Alvar C.H. Freude  |  [EMAIL PROTECTED]

Demo: http://www.online-demonstration.org/  |  Join in!
Blast-DE: http://www.assoziations-blaster.de/   |  Blast-Dich-Fit
Blast-EN: http://www.a-blast.org/   |  Blast/english



Re: Filtering HTML files with mod_proxy and mod_perl

2000-07-19 Thread Alvar Freude

Hi,

>  If you find a way to do it with Apache::Proxy, let the list know.

I am sure it will work with the example given by Darren.

Once I've verified it, I think I'll create a small module and distribute it.


>  One of the major reasons I went this route over something like the examples in
>  the mod_perl book, was speed.  Downloading big files using the examples book
>  was slow, as apache first gathers the content up into a variable (where you can
>  do your regular expressions or whatever manipulating), then sent it to the
>  browser.  You would need a lot of memory in this situation.

Yes, but if you use a subroutine that handles the incoming chunks, you can
pass the file through immediately. See
http://theoryx5.uwinnipeg.ca/CPAN/data/libwww-perl/lwpcook.html at the
bottom :)
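
The recipe there looks roughly like this (a sketch following the lwpcook
example; $r is assumed to be the Apache request object of your proxy
handler):

  use LWP::UserAgent ();
  use HTTP::Request ();

  my $ua  = LWP::UserAgent->new;
  my $req = HTTP::Request->new(GET => $url);

  # the callback gets each chunk as it arrives, so nothing is
  # buffered completely in memory -- filter it and pass it on
  my $res = $ua->request($req, sub {
      my ($chunk, $response, $protocol) = @_;
      $r->print($chunk);
  }, 4096);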


Ciao
  Alvar

-- 
Alvar C.H. Freude  |  [EMAIL PROTECTED]

Demo: http://www.online-demonstration.org/  |  Join in!
Blast-DE: http://www.assoziations-blaster.de/   |  Blast-Dich-Fit
Blast-EN: http://www.a-blast.org/   |  Blast/english



Filtering HTML files with mod_proxy and mod_perl

2000-07-18 Thread Alvar Freude

Hi,

I want to create a service that filters HTML files like this:
http://www.a-blast.org/web-blast.html ==>
http://www.a-blast.org/web-blast.plx?url=http://www.nsa.gov/programs/employ/


But it should go through a proxy: the user doesn't need to access another
site, and all filtering works transparently in the background, within the
proxy.


So my idea was to do this with mod_proxy and mod_perl, but I didn't find
any documentation on how.

The user should enter a proxy in his browser config, e.g.
superproxy2000.here.org:, and after that he can surf the web and get
filtered files.

Is this possible with mod_perl and mod_proxy?


And if yes: where can I find documentation for this?

Or does somebody have code snippets? ;)


Thanx and Ciao

  Alvar


-- 
Alvar C.H. Freude  |  [EMAIL PROTECTED]

Demo: http://www.online-demonstration.org/   | Join in!
Blast-DE: http://www.assoziations-blaster.de/| Blast-Dich-Fit
Blast-EN: http://www.a-blast.org/| Blast/english



Re: Apache::Cookie problems

2000-04-10 Thread Alvar Freude

Hi!

>  get-cookie.html
>  --
>  <%
>  use Apache::Cookie;
>  my $cookie_ref = Apache::Cookie->fetch;
>  my $conf_cookie = $cookie_ref->{conf};
>  my %hash  = $conf_cookie->value;
>   ^^^

That's it, now I understand!

Many thanks -- it was too late last night for me ;-)


Nevertheless, I now store the values in a tied Apache::Session hash, so
that nobody can tamper with my internal variables by setting their own
cookie :-)
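
In case it helps someone, the server-side storage looks roughly like this
(a sketch with Apache::Session::MySQL; the connection details are
placeholders):

  use Apache::Session::MySQL ();
  use DBI ();

  my $dbh = DBI->connect('dbi:mysql:sessions', 'user', 'secret',
                         { RaiseError => 1 });

  # $sid comes from the user's cookie; undef would create a new session
  my %session;
  tie %session, 'Apache::Session::MySQL', $sid,
      { Handle => $dbh, LockHandle => $dbh };

  $session{conf} = $hash_ref;   # stays on the server -- the client
                                # only ever sees the session ID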


If someone wants to relax from hard coding work: visit
http://www.assoziations-blaster.de/english/ (cookies are set for the
user's config).

It's a non-commercial net.art project; the main feature is real-time
linking of texts, completely written in mod_perl :-)


Ciao
  Alvar

-- 
Alvar Freude
[EMAIL PROTECTED]

Visit the Assoziations-Blaster: http://www.assoziations-blaster.de/



Apache::Cookie problems

2000-04-09 Thread Alvar Freude

Hi,

I have some problems setting and reading back cookie values with
Apache::Cookie.


I'm setting a cookie with Apache::Cookie, and it seems that the cookie is
set correctly:


my $c = Apache::Cookie->new($r,
    -name    => 'conf',
    -value   => $hash_ref,
    -expires => '+30D',
    -path    => '/'
);
$c->bake;


But when retrieving the cookie I run into problems:

$cookie_ref = Apache::Cookie->fetch;


Now $cookie_ref->{conf} does not contain a reference to the old values; it
contains a scalar like "Apache::Cookie=SCALAR(0x8b675b8)", but no
hashref ...


What's going on? It seems I'm too stupid, argl!


The CGI::Cookie POD only says:

   fetch returns an associative array consisting of all
   cookies returned by the browser.  The keys of the array
   are the cookie names.  

OK, OK, but what about the values of the hash?
How can I get the cookie values back?

What's going wrong?



Thanx and Ciao
  Alvar



Embperl and session tracking: problems

2000-01-22 Thread Alvar Freude

Hi list,

I have a problem with Embperl and session tracking:

The stored values seem to be different for each running httpd; the session
data do not seem to be stored globally / shared between all the running
httpds. I tested it with a simple counter: it counts up until another
httpd handles the request, and then that httpd's counter is used, I guess.
But that's not good for session tracking ... :-)



My Embperl code is:


 = = =

<html>

<h1>testitest</h1>

Session ID ... Hm!
<br><b>

[+ $udat{testi}++; +]
<br>
[+ $mdat{yeppa}++; +]

</html>

 = = =


I am using a mod_perl built into Apache, not as DSO.


My startup.pl script is the following:


 = = =

#
# Pull in all the usual modules here ...
#

use Apache::DBI;
use DBI;
use Apache::StatINC;
use CGI;
CGI-compile(':all');

use Fcntl qw(:DEFAULT :flock);
use Socket;

use Apache::SIG ();
Apache::SIG-set;

BEGIN
{
$ENV{EMBPERL_SESSION_CLASSES} = "DBIStore SysVSemaphoreLocker" ;
    $ENV{EMBPERL_SESSION_ARGS} =
        "DataSource=dbi:mysql:session UserName=session Password=x" ;
} ;
use HTML::Embperl;

1;

 = = =



Also with 

 $ENV{EMBPERL_SESSION_CLASSES} = "MemoryStore NullLocker" ;

the same problem happens.


The cookie is set.
The MySQL database is not updated on every request, but I think
Apache::Session caches this.



But why doesn't the session tracking work, and why are the session data
not shared between all running httpds?



Thanx and bye
  Alvar