Hi Chandan/Jürgen,

I tried this with native code, using a single input DataFrame with multiple sinks, as shown below.

Spark provides a method called awaitAnyTermination() in
StreamingQueryManager.scala which gives everything needed to handle the
queries processed by Spark. The Spark documentation makes the following points:
    -> Wait until any of the queries on the associated SQLContext has
terminated since the creation of the context, or since `resetTerminated()`
was called. If any query was terminated with an exception, then the
exception will be thrown.
    -> If a query has terminated, then subsequent calls to
`awaitAnyTermination()` will either return immediately (if the query was
terminated by `query.stop()`) or throw the exception immediately (if the
query was terminated with an exception). Use `resetTerminated()` to clear
past terminations and wait for new terminations.
    -> In the case where multiple queries have terminated since
`resetTerminated()` was called, if any query has terminated with an
exception, then `awaitAnyTermination()` will throw any one of those
exceptions. For correctly documenting exceptions across multiple queries,
users need to stop all of them after any of them terminates with an
exception, and then check `query.exception()` for each query (see the
sketch after the example below).

    
    // Single streaming source (schema is defined elsewhere in the job)
    val inputdf: DataFrame = sparkSession.readStream.schema(schema)
      .option("delimiter", ",").csv("src/main/streamingInput")

    // Two independent sinks reading from the same input DataFrame
    val query1 = inputdf.writeStream.option("path", "first_output")
      .option("checkpointLocation", "checkpointloc").format("csv").start()
    val query2 = inputdf.writeStream.option("path", "second_output")
      .option("checkpointLocation", "checkpoint2").format("csv").start()

    // Block until any of the queries terminates
    sparkSession.streams.awaitAnyTermination()


Now, both "first_output" and "second_output" are written successfully.
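
If one of the queries fails, the third point above applies. A minimal sketch
of that stop-and-inspect pattern, reusing query1, query2 and sparkSession
from the example (not a definitive implementation), could look like this:

    import org.apache.spark.sql.streaming.StreamingQueryException

    try {
      // Throws if any query terminated with an exception
      sparkSession.streams.awaitAnyTermination()
    } catch {
      case _: StreamingQueryException =>
        // Stop everything, then check which query carried which exception
        sparkSession.streams.active.foreach(_.stop())
        Seq(query1, query2).foreach { q =>
          q.exception.foreach(ex => println(s"query ${q.id} failed: ${ex.getMessage}"))
        }
    } finally {
      // Clear past terminations before waiting again
      sparkSession.streams.resetTerminated()
    }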

Try it out on your side and let me know if you find any limitations, and
please post if you find any other way.

Please correct me if I have made any grammatical mistakes.

Thanks
Amiya


