This Stack Overflow answer may help:
https://stackoverflow.com/questions/42641573/why-does-memory-usage-of-spark-worker-increases-with-time/42642233#42642233
Try increasing spark.yarn.executor.memoryOverhead.
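The suggestion above refers to YARN's per-executor memory overhead. A minimal sketch of raising it at submit time, assuming a YARN deployment; the values, class name, and jar name are illustrative, not from the original thread:

```shell
# Sketch with illustrative values: give each executor extra off-heap headroom
# so YARN does not kill containers for exceeding their memory limit.
# The overhead value is in megabytes; in Spark 2.3+ this setting was
# renamed to spark.executor.memoryOverhead.
# The class and jar names below are hypothetical.
spark-submit \
  --master yarn \
  --conf spark.executor.memory=4g \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  --class com.example.StreamingApp \
  streaming-app.jar
```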
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/spark-streaming-exectors-memory-increasing-and-executor-killed-by-yarn-tp28500p28506.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
For this kind of question, you should always tell us which Spark version you are using.
Yong
From: darin <lidal...@foxmail.com>
Sent: Thursday, March 16, 2017 9:59 PM
To: user@spark.apache.org
Subject: spark streaming exectors memory increasing and executor killed by yarn
[screenshot attachment: http://apache-spark-user-list.1001560.n3.nabble.com/file/n28500/QQ20170317-095331%402x.png]
Anybody has any advice about this?
Thanks