Re: Possible memory leak after closing spark context in v2.0.1

2016-10-17 Thread Lev Katzav
I don't have any object broadcasting in my code, but I do have broadcast join hints (df1.join(broadcast(df2))). I tried starting and stopping the Spark context for every test (rather than once per suite), and that stopped the OOM errors, so I guess there is no leakage after the context is stopped.
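
For reference, a minimal sketch of the per-test lifecycle described above (ScalaTest is assumed; the suite, app, and column names are illustrative, not from this thread):

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.broadcast
    import org.scalatest.{BeforeAndAfterEach, FunSuite}

    class BroadcastJoinSuite extends FunSuite with BeforeAndAfterEach {

      private var spark: SparkSession = _

      // Start a fresh session (and context) before every test, not once per suite.
      override def beforeEach(): Unit = {
        spark = SparkSession.builder()
          .master("local[2]")
          .appName("broadcast-join-test")
          .getOrCreate()
      }

      // Stop it after every test so broadcast state is released with the context.
      override def afterEach(): Unit = {
        spark.stop()
      }

      test("broadcast join hint") {
        import spark.implicits._
        val df1 = Seq((1, "a"), (2, "b")).toDF("id", "v1")
        val df2 = Seq((1, "x")).toDF("id", "v2")
        // The hint mentioned in the thread: mark df2 as the broadcast side.
        val joined = df1.join(broadcast(df2), "id")
        assert(joined.count() == 1)
      }
    }

Tearing the session down per test trades suite runtime for isolation, which matches the observation that the OOM errors disappear once the context is stopped after each test.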

Re: Possible memory leak after closing spark context in v2.0.1

2016-10-17 Thread Sean Owen
Did you unpersist the broadcast objects?

On Mon, Oct 17, 2016 at 10:02 AM lev wrote:
> Hello,
>
> I'm in the process of migrating my application to Spark 2.0.1,
> and I think there are some memory leaks related to broadcast joins.
>
> The application has many unit tests,
> and
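
For anyone reading the archive: "unpersisting the broadcast objects" applies to explicitly created broadcast variables (sc.broadcast), not to the broadcast() join hint. A minimal sketch, with illustrative names:

    import org.apache.spark.sql.SparkSession

    object BroadcastCleanup {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[2]")
          .appName("broadcast-cleanup")
          .getOrCreate()
        val sc = spark.sparkContext

        // An explicitly broadcast lookup table.
        val lookup = sc.broadcast(Map(1 -> "a", 2 -> "b"))
        val out = sc.parallelize(Seq(1, 2, 3))
          .map(k => lookup.value.getOrElse(k, "?"))
          .collect()
        println(out.mkString(","))

        // unpersist() drops the broadcast's blocks on the executors;
        // destroy() additionally removes it from the driver for good.
        lookup.unpersist(blocking = true)
        lookup.destroy()

        spark.stop()
      }
    }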

Possible memory leak after closing spark context in v2.0.1

2016-10-17 Thread lev
Hello, I'm in the process of migrating my application to Spark 2.0.1, and I think there are some memory leaks related to broadcast joins. The application has many unit tests, and each individual test suite passes, but when running them all together, it fails with OOM errors. In the beginning of each
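
As a diagnostic step (not something suggested in the thread itself), setting Spark's spark.sql.autoBroadcastJoinThreshold to -1 disables size-based automatic broadcast joins, which can help confirm whether the OOM errors are tied to broadcast joins at all; the object and app names below are illustrative:

    import org.apache.spark.sql.SparkSession

    object DiagnoseBroadcastOOM {
      def main(args: Array[String]): Unit = {
        // -1 disables automatic broadcast joins; only explicit broadcast()
        // hints remain, so the suspect code path can be isolated.
        val spark = SparkSession.builder()
          .master("local[2]")
          .appName("diagnose-broadcast-oom")
          .config("spark.sql.autoBroadcastJoinThreshold", "-1")
          .getOrCreate()

        // ... run the failing test workload here ...

        spark.stop()
      }
    }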