Re: Spark Structured Streaming Continuous Trigger on multiple sinks

2021-09-12 Thread Alex Ott
Just don't call .awaitTermination() because it blocks execution of the next line of code. You can assign the result of .start() to a variable, or collect the query handles into a list/array. Then, to wait until one of the streams finishes, use spark.streams.awaitAnyTermination() or something like this.
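The pattern described above might look roughly like this. This is only a sketch: the dataset names, sink formats, and trigger interval are placeholders for illustration, not taken from the original job.

```scala
// Sketch: start several streaming queries without blocking, then wait on all of them.
// `dataset1`, the rate source, and the console sinks are illustrative placeholders.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.{StreamingQuery, Trigger}

val spark = SparkSession.builder.appName("multi-sink-demo").getOrCreate()

val dataset1 = spark.readStream.format("rate").load()

// .start() returns immediately with a StreamingQuery handle;
// do NOT call .awaitTermination() here, or the next query never starts.
val q1: StreamingQuery = dataset1.writeStream
  .format("console")
  .trigger(Trigger.Continuous("1 second"))
  .start()

val q2: StreamingQuery = dataset1.writeStream
  .format("console")
  .trigger(Trigger.Continuous("1 second"))
  .start()

// Keep the handles if you need to inspect or stop individual queries later.
val queries = Seq(q1, q2)

// Block here instead: returns as soon as any one active query terminates.
spark.streams.awaitAnyTermination()
```

The key point is that `.start()` is non-blocking, so both sinks are running before the single blocking call at the end.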

Spark Structured Streaming Continuous Trigger on multiple sinks

2021-08-25 Thread S
Hello,

I have a structured streaming job that needs to be able to write to multiple sinks. We are using a *Continuous* trigger *and not* a *Microbatch* trigger.

1. When we use the foreach method using: *dataset1.writeStream.foreach(kafka ForEachWriter