Arnaud,
I have found a way around this. I don't know if you're interested, but
it goes something like this:
foreach my $param ($r->param) {
    if ($param =~ /\busers\b/) {
        # start each matching value off as "not yet processed"
        $users{$r->param($param)} = 0;
    }
}
....snip...then later
foreach my $key (keys %users) {
    next if $users{$key} == 1;   # already handled, skip the duplicate
    $users{$key} = 1;            # mark it so the repeat request is ignored
}
The idea being you only work on requests that haven't been processed
yet. Once you process a request you set that hash key to 1 and can
avoid using it again. IE still sends the request twice, but this way
you are working with the first request, not the second.
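On the user-agent question further down: I haven't tested it, but since
the Apache::Request object sits on top of the plain Apache request
object, its header methods should still be reachable. A rough sketch
(method names as in mod_perl 1; $is_ie is just an illustrative flag):

# header_in() is inherited from the underlying Apache request object,
# so it should be callable on the Apache::Request object as well
my $ua    = $r->header_in('User-Agent') || '';
my $is_ie = ($ua =~ /MSIE/) ? 1 : 0;   # placeholder flag, for illustration only

You could then apply the hash trick only when $is_ie is true.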
Just a thought.
Dp.
On 29 Jul 2004 at 16:20, Arnaud Blancher wrote:
> Dermot Paikkos wrote:
>
> >Does this mean you have to go and clean up these files later
> >
> yes, if you don't want them to stay on the disk.
>
> > or is
> >this done when the process ends?
> >
> maybe you can write a special handler for the directory where you'll
> write your PDF that deletes the PDF when the connection (due to the
> redirect) is closed by the client (but I'm not sure).
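(An untested thought on that: mod_perl's register_cleanup runs after the
response has gone back to the client, so something roughly like the
following might do it. $pdf_path is only a stand-in for wherever the
handler wrote the file.)

$r->register_cleanup(sub {
    # remove the generated PDF once the request is finished;
    # $pdf_path is a placeholder for the file the handler created
    unlink $pdf_path
        or warn "could not unlink $pdf_path: $!";
});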
>
> > I don't want to slow the users down
> >unless I have to.
> >
> >I think I would like to determine the user-agent and work around the
> >repeating requests....somehow. Do you know how to find out the
> >user-agent when using Apache::Request? I can't see it when I use this
> >object. Thanx. Dp.
~~
Dermot Paikkos * [EMAIL PROTECTED]
Network Administrator @ Science Photo Library
Phone: 0207 432 1100 * Fax: 0207 286 8668
--
Report problems: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
List etiquette: http://perl.apache.org/maillist/email-etiquette.html