Hi Jason,
The reader was not designed to read past an EOF returned by the input
stream.  So it would be up to you to create a wrapper around the
FileInputStream that can gracefully handle EOF by blocking and waiting for
more data (this would of course require some form of external coordination
to determine when the last batch has been sent).  I think one way to test
this theory would be to use System.in/System.out for your streams, which
IIRC will not return an EOF until the program is force-killed.
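
For illustration, a minimal sketch of such a wrapper is below.  The
AtomicBoolean "writerDone" flag and the 1 ms backoff are placeholders for
whatever external coordination mechanism you end up using:

import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.concurrent.atomic.AtomicBoolean;

/** Retries on EOF until an external signal says the writer is finished. */
class BlockingEofInputStream extends FilterInputStream {
  private final AtomicBoolean writerDone;  // external coordination (assumption)

  BlockingEofInputStream(InputStream in, AtomicBoolean writerDone) {
    super(in);
    this.writerDone = writerDone;
  }

  @Override
  public int read(byte[] b, int off, int len) throws IOException {
    while (true) {
      int n = in.read(b, off, len);
      if (n != -1) return n;            // got data, pass it through
      if (writerDone.get()) return -1;  // real EOF: writer has finished
      try {
        Thread.sleep(1);                // back off briefly, then retry
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        throw new IOException("interrupted while waiting for data", e);
      }
    }
  }

  @Override
  public int read() throws IOException {
    byte[] one = new byte[1];
    int n = read(one, 0, 1);
    return n == -1 ? -1 : one[0] & 0xFF;
  }
}

You would then construct the ArrowStreamReader over this wrapper instead
of the raw FileInputStream.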

IIRC, the Java bindings aren't the best for high-performance IPC because
they always require an extra copy (i.e. memory-mapping doesn't quite
work).  Depending on your exact requirements, using Flight might simplify
your solution in this space.
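
As a rough sketch of what the consuming side could look like with Flight
(the host, port, and ticket contents below are made-up placeholders, not
anything your server would actually use):

import org.apache.arrow.flight.FlightClient;
import org.apache.arrow.flight.FlightStream;
import org.apache.arrow.flight.Location;
import org.apache.arrow.flight.Ticket;
import org.apache.arrow.memory.BufferAllocator;
import org.apache.arrow.memory.RootAllocator;

public class FlightConsumer {
  public static void main(String[] args) throws Exception {
    try (BufferAllocator allocator = new RootAllocator();
         FlightClient client = FlightClient.builder(
             allocator, Location.forGrpcInsecure("localhost", 8815)).build();
         // the ticket is whatever your server uses to identify the stream
         FlightStream stream = client.getStream(new Ticket("ints".getBytes()))) {
      while (stream.next()) {
        // each successful next() loads the latest record batch into the root
        System.out.println("rows: " + stream.getRoot().getRowCount());
      }
    }
  }
}

Flight handles the framing and back-pressure over gRPC for you, so the
EOF coordination problem above goes away.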

Cheers,
Micah

On Sun, Feb 12, 2023 at 8:49 AM Jason Thomas <[email protected]>
wrote:

> Use case:
>
> - stream data continuously between 2 processes with very low latency
>
> Problem:
>
> I have an ArrowStreamWriter running in one process that continuously
> writes a single integer.  I have another process that continuously reads
> the data using ArrowStreamReader.  Everything works fine for a few seconds,
> but once my reader gets caught up to the writer I start getting exceptions
> and eventually data loss.  Here is the code and sample output -
> https://gist.github.com/jt70/69780c66ffb531040e97759264bd463a
>
> I'm not sure if this is a bug or something I'm doing wrong.  Is there a
> better way to do high performance IPC with Arrow?  Thanks for any tips!
>
> -Jason
>
>
