Hello Ignite Team,

I have a Spark job that streams live data into an Ignite cache. The job gets
killed as soon as I close the window (Linux shell). My other Spark streaming
jobs I launch with "&" at the end of the spark-submit command, and they run
for a very long time until I stop them or they crash for other reasons.

Is there any way I can run the Spark-Ignite job continuously?

This is my spark-submit command:

spark-submit \
  --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0 \
  --master spark://<IP>:7077 \
  --executor-cores x \
  --total-executor-cores x \
  --executor-memory Xg \
  --driver-memory Xg \
  --conf spark.driver.maxResultSize=Xg \
  --conf spark.default.parallelism=XX \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --class com.yyyy.yyyy.dataload <path to Jar>.jar &
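My guess is that closing the terminal sends SIGHUP to the driver process, which "&" alone does not protect against. A minimal sketch of a nohup-based launch that should survive closing the shell (this is an assumption on my part; `sleep` stands in for the real spark-submit command above):

```shell
# nohup makes the process ignore the SIGHUP sent when the terminal closes;
# redirecting stdout/stderr keeps the job's log in a known file.
nohup sleep 2 > job.log 2>&1 &
pid=$!

# kill -0 only checks that the process is alive, it does not signal it.
kill -0 "$pid" && echo "job still running"
```

Other options I have seen mentioned are launching inside `screen`/`tmux`, or submitting with `--deploy-mode cluster` so the driver runs on the cluster rather than in my shell session, but I have not verified those for the Ignite job.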


Thanks



