Sorry, the bug link in the previous mail was wrong.
Here is the real link:
http://apache-spark-developers-list.1001551.n3.nabble.com/Re-SQL-Memory-leak-with-spark-streaming-and-spark-sql-in-spark-1-5-1-td14603.html
At 2016-05-13 09:49:05, "李明伟" wrote:
It seems we hit the same issue.
There was a memory-leak bug in 1.5.1, but I am using 1.6.1.
Here is the link about the bug in 1.5.1:
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark
At 2016-05-12 23:10:43, "Simon Schiff [via Apache Spark User List]" wrote:
Hi Simon
Can you describe your problem in more detail?
I suspect my problem is caused by the window function (or maybe the groupBy
agg functions).
If yours is the same, maybe we should report a bug.
At 2016-05-11 23:46:49, "Simon Schiff [via Apache Spark User List]" wrote:
Hi Ted
Spark version : spark-1.6.0-bin-hadoop2.6
I tried increasing the executor memory, but I still have the same problem.
I can use jmap to capture something, but the output is too difficult to
understand.
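For what it's worth, jmap output is easier to digest if you start with a class histogram rather than a full heap dump. A rough sketch of the usual commands (the <pid> placeholder stands for the executor's JVM process id, which you can find with jps):

```shell
# List running JVMs to find the executor's process id (JDK's jps tool).
jps -l

# Top heap objects by instance count and bytes; ":live" forces a GC first,
# so only reachable objects are counted.
jmap -histo:live <pid> | head -n 20

# Full binary heap dump for offline analysis (e.g. with Eclipse MAT).
jmap -dump:live,format=b,file=heap.hprof <pid>
```

If the histogram shows one class growing steadily across batches, that class is a good starting point for a leak report.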
At 2016-05-11 11:50:14, "Ted Yu" wrote:
Which Spark release?