Hi Experts, I have a scenario where I want to write to a single Avro file from a streaming job that reads data from Kafka.
But the issue is that with multiple executors, when they all try to write to the same file I get a concurrent-modification exception. One way to mitigate this is to repartition and have a single writer task, but since my data is huge that is not a feasible option. Any suggestions welcome.

Regards,
Sam
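For reference, the usual way around this in Spark is not to share one file at write time at all: each task writes its own part file (no contention), and a single sequential step merges the parts afterwards if one file is truly required. Below is a minimal plain-Python sketch of that pattern, not actual Spark or Avro code; the directory layout and function names are illustrative assumptions.

```python
import glob
import os
import tempfile

# Sketch of the "write part files, then merge" pattern:
# each writer task appends only to a file it owns, so no
# locking or single-writer bottleneck is needed during the
# parallel phase; one downstream step concatenates the parts.

def write_part(out_dir, part_id, records):
    # Each task owns a distinct part file (hypothetical naming scheme,
    # mirroring Spark's part-NNNNN output convention).
    path = os.path.join(out_dir, f"part-{part_id:05d}")
    with open(path, "w") as f:
        for rec in records:
            f.write(rec + "\n")
    return path

def merge_parts(out_dir, merged_path):
    # A single sequential reader merges the parts in order.
    # For real Avro files you would need an Avro-aware merge
    # (plain concatenation does not work because each file has
    # its own header), but the control flow is the same.
    with open(merged_path, "w") as out:
        for part in sorted(glob.glob(os.path.join(out_dir, "part-*"))):
            with open(part) as f:
                out.write(f.read())

out_dir = tempfile.mkdtemp()
for pid, recs in enumerate([["a", "b"], ["c"], ["d", "e"]]):
    write_part(out_dir, pid, recs)

merged = os.path.join(out_dir, "merged.out")
merge_parts(out_dir, merged)
print(open(merged).read().split())  # ['a', 'b', 'c', 'd', 'e']
```

In Spark terms the parallel phase is just the normal per-partition write, and the merge is a cheap single-threaded post-processing job, so the heavy data never has to be funneled through one writer task.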