Environment: MQSeries 5.2 on OS/390 with CICS
Hello,
I have never done any MQ coding myself, so please excuse me if this sounds like a dumb question.
It is about the response time of a CICS transaction and the wait interval coded on the MQGET.
Our application programmers are working on a CICS transaction that does an MQPUT of a request message to an MQ queue and then an MQGET with wait on a reply queue.
The response time of this transaction is always equal to the WaitInterval coded on the MQGET call.
As I understand it, WaitInterval is the maximum time the MQGET call waits for a message to arrive on the queue.
Now if we code a WaitInterval of 30 seconds and, say, 10 messages arrive on the queue within 2 seconds,
the program should be able to process those 10 messages immediately as they arrive, right?
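(To illustrate what I mean: a blocking get with a timeout normally returns as soon as a message is available, and the timeout is only an upper bound. Here is a sketch of that behavior in Python using `queue.Queue` as a hypothetical stand-in for the MQ reply queue and its WaitInterval -- this is just an analogy, not actual MQ code.)

```python
import queue
import threading
import time

# Hypothetical stand-in for the MQ reply queue: queue.Queue.get()
# with a timeout behaves like MQGET with wait is described --
# it returns as soon as a message arrives, not after the timeout.
reply_queue = queue.Queue()

def responder():
    # Simulate 10 reply messages arriving within ~2 seconds.
    for i in range(10):
        time.sleep(0.2)
        reply_queue.put(f"reply {i}")

threading.Thread(target=responder, daemon=True).start()

start = time.monotonic()
replies = []
for _ in range(10):
    # "WaitInterval" of 30 seconds: the call blocks only until a
    # message is available, not for the full 30 seconds.
    replies.append(reply_queue.get(timeout=30))
elapsed = time.monotonic() - start

print(len(replies))   # 10
print(elapsed < 30)   # True -- the loop finishes in about 2 seconds
```

So in this sketch the whole loop completes in roughly 2 seconds even though each get could have waited up to 30. That is what I would have expected from our transaction as well.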
Then what are we doing wrong here?
Any inputs will be greatly appreciated.
Thanks, Prince