This is not really an Ignite question. Try asking it on the Spark user list:
http://apache-spark-user-list.1001560.n3.nabble.com/

Running commands with & is a valid approach, but a job backgrounded with & alone
still receives SIGHUP when you close the shell, which is why it dies. You can also
try using nohup <https://linux.die.net/man/1/nohup>, which makes the process ignore
SIGHUP so it survives the terminal being closed.
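A minimal sketch (the log file name is arbitrary; the options and jar path are the
same placeholders as in your command below):

    nohup spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0 \
        --master spark://<IP>:7077 --class com.yyyy.yyyy.dataload <path to Jar>.jar \
        > dataload.log 2>&1 &

Redirecting stdout/stderr matters here: without it, nohup appends all output to a
file named nohup.out in the working directory.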

Denis

Sun, 12 Aug 2018 at 5:12, ApacheUser <bhaskar.thungathu...@gmail.com>:

> Hello Ignite Team,
>
> I have a Spark job that streams live data into an Ignite cache. The job gets
> closed as soon as I close the window (Linux shell). Other Spark streaming
> jobs I run with "&" at the end of the spark-submit command, and they run for a
> very long time until I stop them or they crash due to other factors.
>
> Is there any way I can run the Spark-Ignite job continuously?
>
> This is my spark submit:
>
> spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0 \
>   --master spark://<IP>:7077 --executor-cores x --total-executor-cores x \
>   --executor-memory Xg --conf spark.driver.maxResultSize=Xg --driver-memory Xg \
>   --conf spark.default.parallelism=XX \
>   --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
>   --class com.yyyy.yyyy.dataload <path to Jar>.jar &
>
>
> Thanks
>
>
>
>
> --
> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
>
