Hi guys! I'm developing an application in Spark that I'd like to run continuously. It would execute some actions, sleep for a while, and then go again. I was thinking of doing it in a standard infinite-loop way:
    val sc = <initialize spark context>
    while (true) {
      doStuff(...)
      sleep(...)
    }

I would be running this (fairly lightweight) application on a cluster that would also run other (significantly heavier) jobs. However, I fear that this kind of code might lead to unexpected behavior; I don't know whether keeping the same SparkContext active continuously for a very long time might lead to some weird stuff happening. Can anyone tell me if there is some problem with not "renewing" the Spark context, or is aware of any problems with this approach that I might be missing? Thanks!
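To be concrete, a minimal sketch of what I have in mind is below (doStuff and the one-minute interval are just placeholders for my actual work and schedule):

    import org.apache.spark.{SparkConf, SparkContext}

    object LongRunningApp {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("long-running-app")
        val sc = new SparkContext(conf)
        // release the context if the JVM is shut down externally
        sys.addShutdownHook { sc.stop() }

        while (true) {
          doStuff(sc)               // placeholder for the actual actions
          Thread.sleep(60 * 1000L)  // placeholder interval: one minute
        }
      }

      // placeholder for the (fairly lightweight) work each iteration does
      def doStuff(sc: SparkContext): Unit = {
        val n = sc.parallelize(1 to 100).count()
        println(s"processed $n records")
      }
    }

So the same SparkContext would be reused across every iteration for as long as the application runs.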