Using a single SparkContext for an extended period of time is how
long-running Spark applications such as the Spark Job Server
(https://github.com/spark-jobserver/spark-jobserver) work. It's an
established pattern.
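
A minimal sketch of that pattern in Scala (names like PeriodicDriver, the sample workload, and the sleep interval are illustrative, not from the original thread): the SparkContext is created once, reused on every cycle, and only stopped on shutdown.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object PeriodicDriver {
  def main(args: Array[String]): Unit = {
    // Create the context once; it stays alive across all cycles.
    val conf = new SparkConf().setAppName("periodic-driver")
    val sc = new SparkContext(conf)
    try {
      while (true) {
        // Placeholder workload: any Spark actions would go here.
        val evens = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()
        println(s"processed $evens even numbers this cycle")
        // Sleep between cycles; interval is an arbitrary choice here.
        Thread.sleep(60 * 1000L)
      }
    } finally {
      sc.stop() // reached only if the loop is interrupted
    }
  }
}
```

The key point is simply that context creation sits outside the loop, so each iteration pays no startup cost and executors stay allocated between cycles.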
On Thu, Oct 27, 2016 at 11:46 AM, Gervásio Santos wrote:
Hi guys!
I'm developing an application in Spark that I'd like to run continuously.
It would execute some actions, sleep for a while, and go again. I was
thinking of doing it as a standard infinite loop:
val sc = new SparkContext(conf) // created once, outside the loop
while (true) {
  doStuff(...)
  Thread.sleep(...)
}
I would be running this (fairly