Thanks for the link. Currently we don't have a way to do something
like that, but if we could figure out how that data frame append code
works behind the scenes, then we could potentially offer something
similar.
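
As a rough mental model (an assumption on my part, not verified against Spark's source), append mode seems to simply drop brand-new part-* files into the destination directory and never rewrites the existing ones; reading the dataset then means reading every part file. A stdlib-only Python sketch of that idea, using CSV files as stand-ins for the Parquet part files:

```python
import csv
import os
import tempfile
import uuid


def append_rows(dataset_dir, rows):
    """Write rows as a brand-new part file; existing part files are untouched.

    This mimics, in plain CSV for illustration, what
    df.write.parquet(path, mode="append") appears to do: each append
    creates a fresh part-* file inside the dataset directory.
    """
    os.makedirs(dataset_dir, exist_ok=True)
    # A unique file name per append avoids clobbering earlier writes.
    part_name = "part-%s.csv" % uuid.uuid4().hex
    with open(os.path.join(dataset_dir, part_name), "w", newline="") as f:
        csv.writer(f).writerows(rows)


def read_dataset(dataset_dir):
    """Reading the dataset is just concatenating every part file."""
    rows = []
    for name in sorted(os.listdir(dataset_dir)):
        if name.startswith("part-"):
            with open(os.path.join(dataset_dir, name), newline="") as f:
                rows.extend(csv.reader(f))
    return rows


dataset = os.path.join(tempfile.mkdtemp(), "events")
append_rows(dataset, [["1", "a"]])                   # initial write
append_rows(dataset, [["2", "b"], ["3", "c"]])       # later append
print(len(read_dataset(dataset)))                    # 3 rows across two part files
```

If that model is right, a NiFi processor would "append" by writing each FlowFile's records as a new Parquet part file under the target directory, which is much simpler than rewriting an existing file.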

On Thu, Nov 30, 2017 at 9:44 PM, VinShar <vinaysharm...@gmail.com> wrote:
> Yes, this was my understanding also, but then I found that Spark's DataFrame
> does have a method which appends to Parquet (df.write.parquet(destName,
> mode="append")). Below is an article that throws some light on this. I was
> wondering if there is a way to achieve the same through NiFi.
>
> http://aseigneurin.github.io/2017/03/14/incrementally-loaded-parquet-files.html
>
> I have a workaround in mind for this where I can save the data I want to append
> to Parquet in a file (say in Avro format), then execute a script through
> ExecuteProcess to launch a Spark job that reads the Avro, appends it to an
> existing Parquet file, and then deletes the Avro. I am looking for a simpler
> way than this.
>
>
>
> --
> Sent from: http://apache-nifi-developer-list.39713.n7.nabble.com/
