Keith Kwiatek wrote:
> 
> Hello,
> 
> I have a mod_perl application that takes a request from a client,  then does
> some transaction processing with a remote system, which then returns a
> success/fail result to the client. The transaction MUST happen only ONCE per
> client session.
> 
> PROBLEM: the client clicks the submit button twice, sending two
> requests and spawning two different processes to do the same remote
> transaction. BUT the client request MUST be processed only ONCE for a given
> session_id. The first request starts a process to initiate the remote
> transaction, and then the second request's process starts, not knowing about
> the first. The result is that the client has the transaction
> performed twice!
> 
> How do you handle this? My first thought is to write a "processing status"
> value to the session hash (using Apache::Session) AS SOON as the first
> request is received, and then, when the duplicate second request arrives,
> check the "processing status" in the session hash. If the processing status
> is "in progress", wait until the processing status in the session hash
> is updated by the first request's process, then return that result.
> 
> Is my concept on target? Is my implementation right? (Or should I write
> directly to the file system?)

Yes, yes, and no.  Apache::Session effectively serializes all requests for
the same session_id, so using a flag in the session hash is race-safe; there
is no need to write to the file system yourself.
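
For what it's worth, here is a minimal sketch of that approach.  It assumes
Apache::Session::File as the backend, made-up paths under /tmp, hypothetical
txn_status/txn_result keys and a do_remote_transaction() helper, and that the
lock manager keeps the session locked while the first request holds the tie
(which is what gives you the serialization described above):

use strict;
use warnings;
use Apache::Session::File;

# Hypothetical helper standing in for the real remote transaction.
sub do_remote_transaction { return 'SUCCESS' }

sub process_once {
    my ($session_id) = @_;    # id of a session created earlier

    # tie() restores the session and takes the module's lock; a
    # duplicate request for the same session_id blocks here while
    # the first request still holds the session.
    my %session;
    tie %session, 'Apache::Session::File', $session_id, {
        Directory     => '/tmp/sessions',        # example paths
        LockDirectory => '/tmp/session_locks',
    };

    # Duplicate submit: the first request already finished, so hand
    # back its stored result instead of repeating the transaction.
    if (($session{txn_status} || '') eq 'done') {
        my $result = $session{txn_result};
        untie %session;
        return $result;
    }

    # First request for this session: flag it, do the work once, and
    # record the outcome for any duplicate that is waiting.
    $session{txn_status} = 'in progress';
    my $result = do_remote_transaction();
    $session{txn_result} = $result;
    $session{txn_status} = 'done';

    untie %session;    # writes the session back and releases the lock
    return $result;
}

With this scheme the duplicate request simply waits for the duration of the
remote transaction and then gets the same stored result, which is exactly
the behaviour you described.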

-jwb
