On 02/06/2021 at 21:57, Rares Vernica wrote:
Thanks for the pointers! The migration is going well.

We have been using Arrow 0.16.0 RecordBatchStreamWriter
<https://github.com/Paradigm4/bridge/blob/master/src/PhysicalXSave.cpp#L450>
with & without CompressedOutputStream and wrote the resulting Arrow Buffer
data to S3
<https://github.com/Paradigm4/bridge/blob/master/src/S3Driver.cpp#L168> or file
system
<https://github.com/Paradigm4/bridge/blob/master/src/FSDriver.cpp#L156>. We
have a sizable amount of data saved this way.

Once we upgrade our C++ code to use Arrow 3.0.0 or 4.0.0, will it be
possible to read the Arrow stream files written with Arrow 0.16.0?

It definitely should be possible, unless you hit a bug.
That said, for extra safety, I suggest you test loading the files before doing the final migration.

By the way, for saving lots of data to S3, it may be more efficient to use Parquet. It will be more CPU-intensive but will result in significant space savings.

Regards

Antoine.
