Is there a way to run multiple streams in a single Spark job using Structured Streaming?  If not, is there an easy way to perform inter-job communication (e.g. referencing a DataFrame across concurrent jobs) in Spark?  A sketch of what I have in mind is below.  Thanks a lot in advance!
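
For context, here is roughly the kind of thing I am hoping is possible in one application: two independent streaming queries started from the same SparkSession and running concurrently. The Kafka topics, bootstrap server, and checkpoint paths are just placeholders, and this assumes the spark-sql-kafka connector is on the classpath.

    import org.apache.spark.sql.SparkSession

    object MultiStreamSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("multi-stream-sketch")
          .getOrCreate()

        // First streaming source (topic name is a placeholder)
        val streamA = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "topicA")
          .load()

        // Second streaming source, read from the same session
        val streamB = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "topicB")
          .load()

        // Start each query independently; both would run concurrently
        // inside this single application
        val queryA = streamA.writeStream
          .format("console")
          .option("checkpointLocation", "/tmp/checkpoints/a")
          .start()

        val queryB = streamB.writeStream
          .format("console")
          .option("checkpointLocation", "/tmp/checkpoints/b")
          .start()

        // Block until any of the active queries terminates
        spark.streams.awaitAnyTermination()
      }
    }

Is this a supported pattern, or do the queries need to live in separate jobs?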

-- ND
