Re: Processing Multiple Streams in a Single Job

2021-08-25 Thread Sean Owen
This part isn't Spark specific; it's just a matter of running code in parallel on the driver (which happens to start the streaming jobs). In Scala it's things like .par collections; in Python it's something like multiprocessing.
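A minimal sketch of the pattern Sean describes, using the standard-library concurrent.futures module. The start_query function and the stream names are placeholders, not real Spark API: in an actual job each call would kick off writeStream...start() on one streaming DataFrame, and the pool just ensures the driver launches them concurrently rather than one at a time.

```python
# Sketch: launch several (hypothetical) streaming jobs in parallel
# from the driver. start_query stands in for code that would call
# writeStream...start() on a streaming DataFrame; here it only
# returns a string so the sketch runs without Spark.
from concurrent.futures import ThreadPoolExecutor

def start_query(name):
    # placeholder for: spark.readStream...writeStream...start()
    return f"started {name}"

streams = ["stream_a", "stream_b", "stream_c"]

# map preserves input order, so results line up with `streams`
with ThreadPoolExecutor(max_workers=len(streams)) as pool:
    results = list(pool.map(start_query, streams))

print(results)
```

In real Spark code the queries keep running after start() returns, so the driver would typically follow this with spark.streams.awaitAnyTermination() rather than exiting.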

Re: Processing Multiple Streams in a Single Job

2021-08-25 Thread Artemis User
Thanks Sean. Excuse my ignorance, but I just can't figure out how to create a collection across multiple streams using multiple stream readers. Could you provide some examples or additional references? Thanks!

Re: From scala to sql

2021-08-25 Thread Mich Talebzadeh
That is Spark on a Hive database.table.

Re: From scala to sql

2021-08-25 Thread Mich Talebzadeh
which spark-sql
/opt/spark/bin/spark-sql   ## usually $SPARK_HOME/bin/spark-sql

spark-sql> select count(1) from alayer.joint_accounts_view;
7926
Time taken: 0.506 seconds, Fetched 1 row(s)

HTH

From scala to sql

2021-08-25 Thread Rita RSilva
Good morning, I have just installed Apache Spark; I have never used it before. I need to switch from Scala to SQL, but I am having some difficulty finding the instructions for that. Could you help me with this, please?

Spark Structured Streaming Continuous Trigger on multiple sinks

2021-08-25 Thread S
Hello, I have a structured streaming job that needs to be able to write to multiple sinks. We are using a Continuous trigger, not a Microbatch trigger. 1. When we use the foreach method using: dataset1.writeStream.foreach(kafka ForEachWriter
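Since the snippet is cut off here, for context: each structured streaming query writes to a single sink, and the usual way to fan one query out to several destinations is foreachBatch, which (per the Structured Streaming guide) is only available with the micro-batch trigger, not continuous processing. Below is a pure-Python stand-in for that fan-out idea, not the Spark API itself: sink_a and sink_b are hypothetical sinks, and write_to_sinks plays the role of the function passed to writeStream.foreachBatch.

```python
# Pure-Python stand-in for the foreachBatch fan-out pattern. In Spark,
# this logic would live inside the function handed to
# writeStream.foreachBatch(fn), where fn receives (batch_df, batch_id)
# and can write the same batch to several destinations.
sink_a, sink_b = [], []  # hypothetical sinks (e.g. Kafka and Parquet)

def write_to_sinks(batch, batch_id):
    # real code would do: batch.write.format("kafka")... and
    # batch.write.format("parquet")... with the same batch_id
    sink_a.append((batch_id, batch))
    sink_b.append((batch_id, batch))

# simulate two micro-batches arriving
for batch_id, batch in enumerate([["r1", "r2"], ["r3"]]):
    write_to_sinks(batch, batch_id)

print(sink_a, sink_b)
```

With the continuous trigger, the alternative is to run one query per sink, at the cost of reading the source once per query.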