Currently in our project we are collecting data and pushing it into Kafka, with the messages in Avro format. We need to push this data into HDFS using Spark Streaming, and in HDFS it is also stored in Avro format. We partition the data by day, so when we write into HDFS we need to append to the same file. Currently we are using GenericRecordWriter, and we will be using saveAsNewAPIHadoopFile for writing into HDFS. Is there a way to append data to a file in HDFS in Avro format using saveAsNewAPIHadoopFile?

Thanks,
Santosh B
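For context, a minimal sketch of the setup described above (Avro GenericRecords written out per day with saveAsNewAPIHadoopFile) might look like the following. The DStream source, schema string, and the hdfs:///data/events path are assumptions for illustration, not from the original pipeline. Note that Hadoop OutputFormats in general, including AvroKeyOutputFormat, create new part files rather than appending to an existing file, so each batch here lands in its own subdirectory under the day's partition:

```scala
import java.time.LocalDate
import org.apache.avro.Schema
import org.apache.avro.generic.GenericRecord
import org.apache.avro.mapred.AvroKey
import org.apache.avro.mapreduce.{AvroJob, AvroKeyOutputFormat}
import org.apache.hadoop.io.NullWritable
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.streaming.dstream.DStream

// `records` and `schemaJson` are assumed to come from the existing
// Kafka -> Spark Streaming pipeline described in the question.
def writeAvro(records: DStream[GenericRecord], schemaJson: String): Unit = {
  records.foreachRDD { rdd =>
    if (!rdd.isEmpty()) {
      // A Job instance is only used here as a carrier for the Avro
      // output schema in the Hadoop Configuration.
      val job = Job.getInstance()
      AvroJob.setOutputKeySchema(job, new Schema.Parser().parse(schemaJson))

      val day = LocalDate.now().toString  // day partition, e.g. 2017-01-12
      rdd.map(r => (new AvroKey[GenericRecord](r), NullWritable.get()))
         .saveAsNewAPIHadoopFile(
           // one new directory per batch inside the day's partition,
           // since the OutputFormat cannot append to an existing file
           s"hdfs:///data/events/dt=$day/batch-${System.currentTimeMillis()}",
           classOf[AvroKey[GenericRecord]],
           classOf[NullWritable],
           classOf[AvroKeyOutputFormat[GenericRecord]],
           job.getConfiguration)
    }
  }
}
```

Because each micro-batch produces its own files, a common pattern is to compact the small files in a day's partition with a periodic batch job rather than trying to append in the streaming path.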



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/AVRO-Append-HDFS-using-saveAsNewAPIHadoopFile-tp28292.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.