Re: [akka-user] Re: Streaming http call gives EntityStreamSizeException (2.0-M2)

2015-12-19 Thread Jeroen Gordijn
FYI I created this issue: https://github.com/akka/akka/issues/19237

Any idea on:
"One thing I do notice is that the CPU keeps running high whenever I kill 
'curl'. Is there something I should do to close the stream? Suspending curl 
works fine though."

-- 
>>  Read the docs: http://akka.io/docs/
>>  Check the FAQ: 
>> http://doc.akka.io/docs/akka/current/additional/faq.html
>>  Search the archives: https://groups.google.com/group/akka-user
--- 
You received this message because you are subscribed to the Google Groups "Akka 
User List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to akka-user+unsubscr...@googlegroups.com.
To post to this group, send email to akka-user@googlegroups.com.
Visit this group at https://groups.google.com/group/akka-user.
For more options, visit https://groups.google.com/d/optout.


Re: [akka-user] Re: Streaming http call gives EntityStreamSizeException (2.0-M2)

2015-12-18 Thread Konrad Malawski
It's a feature. (yes, really) :-)

Allow me to explain: Akka HTTP always errs on the safe side of things.
For example, if you write an endpoint that can receive POST data, someone (an 
attacker) could send you data and never stop sending...
You do want to kill that connection as soon as you notice something fishy is 
going on (i.e. perhaps an attack: someone sending loads of data
you never expected them to).

So the default is to play it safe and limit any HTTP entity to 8M (which is 
pretty huge anyway, if you think about it – we've heard
people say "whoa, that's a huge default!", but of course they'd trim it down to 
a lower setting for production, suiting their needs).

If you know that all the calls you make will use streaming instead of 
accumulating that (possibly infinite) stream into a String,
you can disable this check by using the setting described here: 
http://doc.akka.io/docs/akka-stream-and-http-experimental/snapshot/scala/http/common/http-model.html#Limiting_message_entity_length

That is, the value of:
akka.http.[server|client].parsing.max-content-length
(depending on whether you want it for the server or the client).
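In application.conf that setting looks like the snippet below. This is only an 
illustrative sketch: the 8m value is the default mentioned above, and later in 
this thread "infinite" is reported to disable the check entirely; tune both to 
your own needs.

```hocon
# Illustrative values only - not a recommendation.
akka.http.server.parsing.max-content-length = 8m        # default; caps incoming request entities
akka.http.client.parsing.max-content-length = infinite  # disables the check for client-side response entities
```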

Last week I also improved that exception to be more self-explanatory; it now 
looks like this ( https://github.com/akka/akka/pull/19158 ):
  s"EntityStreamSizeException: actual entity size ($actualSize) exceeded content length limit ($limit bytes)! " +
  s"You can configure this by setting `akka.http.[server|client].parsing.max-content-length` or calling `HttpEntity.withSizeLimit` " +
  s"before materializing the dataBytes stream."

So hopefully your question would have been answered by the exception itself :-)


I also strongly recommend you have a look at this workshop I gave at Scala 
Exchange a week ago:
https://skillsmatter.com/skillscasts/6869-workshop-end-to-end-asynchronous-back-pressure-with-akka-streams
It goes into depth on why and how these things work the way they do.

Thanks for trying out the early 2.0 milestones, we'll be releasing a "2.0" very 
soon, please upgrade to it then! :-)
Hope this helps, happy hakking!

-- 
Cheers,
Konrad 'ktoso' Malawski
Akka @ Typesafe

On 18 December 2015 at 14:02:30, Jeroen Gordijn (jeroen.gord...@gmail.com) 
wrote:

Same code with akka-streams 1.0 works fine.

Is this a bug?

Regards,
Jeroen

On Thursday 17 December 2015 at 22:16:17 UTC+1, Jeroen Gordijn wrote:
Hi all,

I'm running into an EntityStreamSizeException when streaming data from a 
streaming response I got by calling another endpoint. It is a little like what 
was presented in the talk by Mathias & Johannes at Scala World: 
https://www.youtube.com/watch?v=6VBn9V3S2aQ

I'm using akka-http 2.0-M2 and reproduced my problem in isolation. See the 
route (and link to the full gist below). When I call `curl -i 
http://localhost:8080/endless` the stream continues indefinitely. However, 
when I call `curl -i http://localhost:8080/pipe` it takes a few seconds to get 
"curl: (56) Recv failure: Connection reset by peer" on the client and the 
exception below on the server. The source below is just an example to isolate 
the problem.

Am I doing something wrong? I would expect an endless stream and no limit. I'm 
using Chunked as stated in 
http://doc.akka.io/docs/akka-stream-and-http-experimental/snapshot/scala/http/common/http-model.html#HttpEntity

Thanks!
Jeroen

val source: Source[Int, Unit] = Source(Stream.from(1))

val route = (path("endless") & get) {
  complete {
HttpResponse(
  entity = HttpEntity.Chunked(
MediaTypes.`text/plain`,
source.map(nr ⇒ ByteString((nr.toString * 10) + "\n", "UTF-8"))
  )
)
  }
} ~
  (path("pipe") & get) {
val s = Http().singleRequest(HttpRequest(uri = "http://localhost:8080/endless")).map {
  _.entity.dataBytes
.via(Framing.delimiter(ByteString("\n"),
  maximumFrameLength = 1, allowTruncation = true))
.map(entry ⇒ entry.utf8String)
}
onSuccess(s) { x ⇒
  complete(HttpResponse(
entity = HttpEntity.Chunked(
  MediaTypes.`text/plain`,
  x.map(x ⇒ ByteString(x + "\n", "UTF-8")
  )
)))
}
  }


Full gist: https://gist.github.com/jgordijn/390c9022062cfb9fce8c

Exception:
[ERROR] [12/17/2015 22:06:10.493] [Test-akka.actor.default-dispatcher-4] [ActorSystem(Test)] Outgoing request stream error
EntityStreamSizeException(8388608, None)
at akka.http.scaladsl.model.HttpEntity$$anonfun$limitable$1$$anon$1.onPush(HttpEntity.scala:469)
at akka.http.scaladsl.model.HttpEntity$$anonfun$limitable$1$$anon$1.onPush(HttpEntity.scala:451)
at akka.stream.stage.AbstractStage$PushPullGraphLogic$$anon$1.onPush(Stage.scala:54)
at akka.stream.impl.fusing.GraphInterpreter.processElement$1(GraphInterpreter.scala:535)
at akka.stream.impl.fusing.GraphInterpreter.processEvent(GraphInterpreter.scala:546)
at akka.stream.impl.fusing.GraphInterpreter.execute(GraphInterpreter.scala:509)

Re: [akka-user] Re: Streaming http call gives EntityStreamSizeException (2.0-M2)

2015-12-18 Thread Jeroen Gordijn
Hi Konrad,

thanks for your answer. This explains a lot and makes sense. Configuring 
'infinite' fixes my issue. The new error description also makes things a lot clearer.

One thing I do notice is that the CPU keeps running high whenever I kill 
'curl'. Is there something I should do to close the stream? Suspending curl 
works fine though.

Thanks,
Jeroen



Re: [akka-user] Re: Streaming http call gives EntityStreamSizeException (2.0-M2)

2015-12-18 Thread Jeroen Gordijn
BTW, is there a way to set infinite on a specific entity? I put this in my 
application.conf: "max-content-length = infinite", but it seems reasonable to 
do this only for the entities where it makes sense. `withSizeLimit` takes a 
Long, and although I could use Long.MaxValue, that doesn't express 'infinite'.

--Jeroen


Re: [akka-user] Re: Streaming http call gives EntityStreamSizeException (2.0-M2)

2015-12-18 Thread Viktor Klang
Or use a negative value for no limit ;)


Re: [akka-user] Re: Streaming http call gives EntityStreamSizeException (2.0-M2)

2015-12-18 Thread Konrad Malawski
Yeah, you can use .withSizeLimit(Long.MaxValue), which is pretty much infinite 
in this context.
It's around 9223 petabytes; pumping that much through a single HTTP connection 
is rather unlikely :-)

It could be made simpler, I guess... withNoLimit, hm. Would you open a ticket 
on GitHub about it please? Thanks!
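For intuition, the check being discussed is conceptually just a running byte 
count over the entity's chunks that fails fast once the limit is exceeded. 
Below is a minimal self-contained Scala sketch of that idea, not actual 
akka-http code: `limitBytes` and `SizeLimitExceeded` are made-up names for 
illustration, and treating a negative limit as "no limit" mirrors Viktor's 
suggestion above.

```scala
// Conceptual sketch only - NOT akka-http's implementation.
// Keep a running byte count over the chunks and fail once it exceeds
// the configured limit; a negative limit means "no limit at all".
final case class SizeLimitExceeded(limit: Long, actualSize: Long)
  extends RuntimeException(s"entity size ($actualSize) exceeded limit ($limit bytes)")

def limitBytes(chunks: Iterator[Array[Byte]], limit: Long): Iterator[Array[Byte]] = {
  var seen = 0L
  chunks.map { chunk =>
    seen += chunk.length
    if (limit >= 0 && seen > limit) throw SizeLimitExceeded(limit, seen)
    chunk // within budget: pass the chunk through unchanged
  }
}
```

Because the count is checked per chunk as the stream is pulled, the failure 
happens as soon as the budget is exhausted rather than after buffering the 
whole entity, which is what makes the real check cheap for streaming entities.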

-- 
Cheers,
Konrad 'ktoso' Malawski
Akka @ Typesafe
