I have a scenario where I would like to write the same streaming DataFrame
to two different streaming sinks.

I have created a streaming DataFrame that I need to send to both a Kafka
topic and a Delta Lake table.

I thought of using foreachBatch, but it looks like it doesn't support
multiple streaming sinks.
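
For context, this is roughly what I had in mind with foreachBatch (just a
sketch; `df` is my streaming DataFrame, and the broker, topic, and paths
are placeholders):

    import org.apache.spark.sql.DataFrame

    df.writeStream
      .foreachBatch { (batchDF: DataFrame, batchId: Long) =>
        batchDF.persist() // avoid recomputing the batch for each write
        // Note: inside foreachBatch these are batch writes, not streaming sinks
        // The Kafka sink expects a string/binary "value" column
        batchDF.selectExpr("to_json(struct(*)) AS value")
          .write
          .format("kafka")
          .option("kafka.bootstrap.servers", "host:9092")
          .option("topic", "my-topic")
          .save()
        batchDF.write
          .format("delta")
          .mode("append")
          .save("/delta/events")
        batchDF.unpersist()
      }
      .option("checkpointLocation", "/checkpoints/dual-sink")
      .start()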

Also, I tried starting multiple write streams and calling
spark.streams.awaitAnyTermination(). But the second stream is not getting
processed!
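
This is roughly the two-query version (again a sketch with placeholder
broker, topic, and paths; I gave each query its own checkpoint location,
since as far as I understand checkpoints must not be shared across queries):

    val kafkaQuery = df
      .selectExpr("to_json(struct(*)) AS value") // Kafka sink needs a "value" column
      .writeStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "host:9092")
      .option("topic", "my-topic")
      .option("checkpointLocation", "/checkpoints/kafka-sink") // unique per query
      .start()

    val deltaQuery = df.writeStream
      .format("delta")
      .outputMode("append")
      .option("checkpointLocation", "/checkpoints/delta-sink") // unique per query
      .start("/delta/events")

    // Wait on all active queries rather than blocking on the first one
    spark.streams.awaitAnyTermination()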

Is there a way to achieve this?


