On Thu, 2016-03-17 at 17:51 +0100, Joan Balagueró wrote:
> Hello,
>
> I hope this is the last question ... On our proxy we read the response
> from the backend with a FutureCallback<HttpResponse>. In the completed method,
> we process the response body this way:
>
> public void completed(HttpResponse objResponse)
> {
>     HttpEntity entity = null;
>
>     try
>     {
>         entity = objResponse.getEntity();
>         int contentLength = (int) entity.getContentLength();
>         ByteArrayOutputStream baos = new ByteArrayOutputStream(
>             contentLength > 0 ? contentLength : this.bufferHttpRespuestas);
>
>         BufferedInputStream bis = new BufferedInputStream(entity.getContent(),
>             this.bufferHttpRespuestas);
>         byte[] tmp = new byte[this.bufferHttpRespuestas];
>         int numBytesRead;
>
>         while ((numBytesRead = bis.read(tmp)) >= 0) baos.write(tmp, 0, numBytesRead);
>         ( . . .)
>
>
> The response body is read from the InputStream contained in 'objResponse';
> the response has already been received, so no network calls happen here (we
> read locally from the 'objResponse' object).
> Reading the body this way, am I storing the body twice, once in
> 'objResponse' and again in the 'baos' variable?
>
> In the link to the benchmark you sent me, an HttpAsyncResponseConsumer is
> used, so the content is processed in chunks in the consumeContent method
> (and I understand that the 'responseReceived' method is called once the
> headers have been processed).
>
> So the point is: is one of these approaches better, taking into account that
> my responses can be really large and I always need to store them in memory
> for further processing?
>
HttpAsyncRequestProducer and HttpAsyncResponseConsumer are the only
resource-efficient way to transmit large messages with HttpAsyncClient.
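
A minimal sketch along those lines, using the AsyncByteConsumer helper from
HttpAsyncClient 4.x; the backend URL and the byte[] result type are
placeholders for illustration, not your proxy's actual code. The body only
ends up in memory once (in the accumulating buffer), and onResponseReceived
hands you the headers before any content has arrived:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.concurrent.Future;

import org.apache.http.HttpResponse;
import org.apache.http.impl.nio.client.CloseableHttpAsyncClient;
import org.apache.http.impl.nio.client.HttpAsyncClients;
import org.apache.http.nio.IOControl;
import org.apache.http.nio.client.methods.AsyncByteConsumer;
import org.apache.http.nio.client.methods.HttpAsyncMethods;
import org.apache.http.protocol.HttpContext;

public class StreamingConsumerExample {

    public static void main(String[] args) throws Exception {
        CloseableHttpAsyncClient client = HttpAsyncClients.createDefault();
        client.start();
        try {
            // The consumer is fed the body in chunks as they arrive off the wire,
            // so only one copy of the payload is accumulated.
            Future<byte[]> future = client.execute(
                    HttpAsyncMethods.createGet("http://localhost:8080/backend"), // placeholder URL
                    new AsyncByteConsumer<byte[]>() {

                        private final ByteArrayOutputStream baos = new ByteArrayOutputStream();

                        @Override
                        protected void onResponseReceived(HttpResponse response) {
                            // Status line and headers are available here,
                            // before any content has been consumed.
                        }

                        @Override
                        protected void onByteReceived(ByteBuffer buf, IOControl ioctrl) throws IOException {
                            // Copy the current chunk into the accumulating buffer.
                            byte[] chunk = new byte[buf.remaining()];
                            buf.get(chunk);
                            baos.write(chunk, 0, chunk.length);
                        }

                        @Override
                        protected byte[] buildResult(HttpContext context) {
                            // Called once the full message has been received.
                            return baos.toByteArray();
                        }

                        @Override
                        protected void releaseResources() {
                            // Nothing to release; the buffer is garbage-collected.
                        }
                    },
                    null);
            byte[] body = future.get();
            System.out.println("Received " + body.length + " bytes");
        } finally {
            client.close();
        }
    }
}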
Oleg