No exceptions in any logs. No errors in stdout or stderr.

2014-05-22 11:21 GMT+02:00 Andrew Or <and...@databricks.com>:

> You should always call sc.stop(), so it cleans up state and does not fill
> up your disk over time. The strange behavior you observe is mostly benign,
> as it only occurs after you have supposedly finished all of your work with
> the SparkContext. I am not aware of a bug in Spark that causes this
> behavior.
>
> What are you doing in your application? Do you see any exceptions in the
> logs? Have you looked at the worker logs? You can browse through these on
> the worker web UI on http://<worker-url>:8081
>
> Andrew
>
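A minimal sketch of the shutdown discipline Andrew describes: call stop() in a finally block so cleanup runs even when the job raises. To keep the snippet self-contained, SparkContext is stood in for by a tiny dummy class; in a real application you would use pyspark.SparkContext (or a Scala SparkContext) the same way.

```python
class DummyContext:
    """Stand-in for SparkContext, used here so the sketch runs anywhere.
    Real code: from pyspark import SparkContext; sc = SparkContext(...)"""
    def __init__(self):
        self.stopped = False

    def stop(self):
        # SparkContext.stop() releases executors and cleans up
        # on-disk state (shuffle files, temp dirs) on the workers.
        self.stopped = True


sc = DummyContext()
try:
    pass  # job work goes here: build RDDs, run actions, etc.
finally:
    sc.stop()  # always runs, so disk usage does not grow across jobs

assert sc.stopped
```

The try/finally (or a context manager wrapping it) guarantees the cleanup happens whether the job succeeds or fails, which is what prevents the disk filling up over many runs.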



-- 
Piotr Kolaczkowski, Lead Software Engineer
pkola...@datastax.com

http://www.datastax.com/
777 Mariners Island Blvd., Suite 510
San Mateo, CA 94404
