SparkContext stop and then start again

2014-07-25 Thread Mohit Jaggi
Folks, I had some PySpark code which used to hang with no useful debug logs. It got fixed when I changed my code to keep the SparkContext alive for the lifetime of the application instead of stopping it and then creating another one later. Is this a bug or expected behavior? Mohit.
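The pattern Mohit describes — creating one context for the lifetime of the application and reusing it, rather than stop/start cycles — can be sketched generically. `FakeContext` below is a hypothetical stand-in for `pyspark.SparkContext`; the one-active-context restriction it enforces mirrors Spark's, but the class itself is illustrative only.

```python
class FakeContext:
    """Hypothetical stand-in for pyspark.SparkContext."""
    _active = None  # at most one active context, as in Spark

    def __init__(self):
        if FakeContext._active is not None:
            raise RuntimeError("only one active context is allowed")
        self.stopped = False
        FakeContext._active = self

    def parallelize(self, data):
        if self.stopped:
            raise RuntimeError("cannot use a stopped context")
        return list(data)

    def stop(self):
        self.stopped = True
        FakeContext._active = None

# Fragile pattern: stop a context, then create a fresh one per job.
sc = FakeContext()
sc.parallelize([1, 2, 3])
sc.stop()
sc2 = FakeContext()  # with early PySpark this second context sometimes hung

# Preferred pattern: reuse the one live context for every job,
# stopping it exactly once when the application exits.
sc2.parallelize([4, 5])
sc2.parallelize([6])
sc2.stop()
```

The sketch only shows the call sequence; whether stop-then-recreate hangs in a given PySpark version depends on the driver's cleanup, which is what this thread is asking about.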

SparkContext#stop

2014-05-22 Thread Piotr Kołaczkowski
Hi, We observed strange behaviour of Spark 0.9.0 when using sc.stop(). We have a bunch of applications that perform some jobs and then issue sc.stop() at the end of main. Most of the time, everything works as desired, but sometimes the applications get marked as FAILED by the master and all

Re: SparkContext#stop

2014-05-22 Thread Andrew Or
You should always call sc.stop(), so it cleans up state and does not fill up your disk over time. The strange behavior you observe is mostly benign, as it only occurs after you have supposedly finished all of your work with the SparkContext. I am not aware of a bug in Spark that causes this
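Andrew's advice — always call sc.stop() so state gets cleaned up — is commonly implemented with a try/finally so the cleanup runs even when a job fails. A minimal sketch, using a hypothetical `DummyContext` in place of `pyspark.SparkContext`:

```python
class DummyContext:
    """Hypothetical stand-in for pyspark.SparkContext."""
    def __init__(self):
        self.stopped = False

    def parallelize(self, data):
        if self.stopped:
            raise RuntimeError("context already stopped")
        return list(data)

    def stop(self):
        self.stopped = True

sc = DummyContext()
try:
    # do all of the application's work while the context is live
    result = sum(sc.parallelize(range(10)))
finally:
    sc.stop()  # always runs, so temporary state does not accumulate on disk
```

Calling stop() once at the very end, rather than skipping it, is what keeps the driver's scratch state from filling the disk over time, per the reply above.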

Re: SparkContext#stop

2014-05-22 Thread Piotr Kołaczkowski
No exceptions in any logs. No errors in stdout or stderr.