Hello hakkers,

I need to transfer large files (~GB) between two distant VMs. I first used 
the akka.io module, which works great.
I am currently taking a look at the akka-stream-experimental module to 
benefit from its precious asynchronous backpressure.

But when I create a Flow from a Stream[ByteString], the Flow keeps a 
reference to it, and because of Scala's Stream memoization this leads to an 
OutOfMemoryError.
I tried with an Iterator instead, but then my integration tests fail: it 
seems that some chunks have disappeared, or that there is concurrent access 
to the data source.
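To illustrate the memoization problem outside of akka, here is a minimal plain-Scala sketch (the names are made up for the example). A Stream caches every element it has produced for as long as its head is referenced, whereas an Iterator is consume-once and retains nothing:

```scala
object StreamVsIterator extends App {
  // A Stream memoizes: once an element is computed it is cached in the
  // cons cells. While `s` (the head) stays referenced, none of the
  // already-produced elements can be garbage collected. With ~GB of
  // ByteString chunks, this is what exhausts the heap.
  val s: Stream[Int] = Stream.from(1)
  println(s.take(3).toList) // forces and caches elements 1, 2, 3

  // An Iterator hands out each element once and does not retain it,
  // so memory use stays constant regardless of how much is consumed.
  val it: Iterator[Int] = Iterator.from(1)
  println(it.take(3).toList)
}
```

This is why passing a Stream to anything that holds on to it (like the Flow here) effectively buffers the whole file in memory.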

From reading the documentation of Flow in the scaladsl package, I am 
beginning to think that streaming a file from one point to another is not a 
use case covered by Reactive Streams. Am I correct?
Can I expect some improvement in this area in the next release?


To help, here is the code:

// Streamer implements Iterable[Byte] and reads the file byte by byte.
val byteStream = new Streamer(buffStream)

// toStream is needed here to go from Iterator[Iterable[Byte]] to
// Stream[Iterable[Byte]], because we need a flow over an Iterable[ByteString].
val bytes = byteStream.grouped(chunkSize).toStream
  .map(_.toArray)
  .map(ByteString.fromArray)

Flow(bytes).produceTo(materializer, client.outputStream)

Thanks for your help.

-- 
>>>>>>>>>>      Read the docs: http://akka.io/docs/
>>>>>>>>>>      Check the FAQ: 
>>>>>>>>>> http://doc.akka.io/docs/akka/current/additional/faq.html
>>>>>>>>>>      Search the archives: https://groups.google.com/group/akka-user
--- 
You received this message because you are subscribed to the Google Groups "Akka 
User List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to akka-user+unsubscr...@googlegroups.com.
To post to this group, send email to akka-user@googlegroups.com.
Visit this group at http://groups.google.com/group/akka-user.
For more options, visit https://groups.google.com/d/optout.