https://issues.apache.org/jira/browse/SPARK-21157

Hi - Spark applications are often killed by YARN, Mesos, or the OS for
exceeding available memory. In SPARK-21157, I propose a design for
measuring and reporting "total memory" usage for Spark executors - that
is, memory usage as visible from the OS, covering both on-heap and
off-heap memory used by Spark and third-party libraries. This builds on
many ideas from SPARK-9103.
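
To make the idea concrete, here is a rough sketch (my own illustration,
not code from the proposal) of how a process on Linux might sample the
OS-visible memory it is using, by reading the VmRSS field from
/proc/self/status. The object name RssSampler is hypothetical, and this
assumes a Linux /proc filesystem:

  import scala.io.Source

  // Minimal sketch: read this JVM's resident set size (RSS) from
  // /proc/self/status on Linux. VmRSS approximates the "total memory"
  // a cluster manager like YARN sees, covering heap and off-heap alike.
  object RssSampler {
    def currentRssBytes(): Option[Long] = {
      val source = Source.fromFile("/proc/self/status")
      try {
        source.getLines()
          .find(_.startsWith("VmRSS:"))           // e.g. "VmRSS:   123456 kB"
          .map(_.split("\\s+")(1).toLong * 1024L) // value is reported in kB
      } finally {
        source.close()
      }
    }

    def main(args: Array[String]): Unit = {
      currentRssBytes() match {
        case Some(bytes) => println(s"RSS: ${bytes / (1024 * 1024)} MiB")
        case None        => println("VmRSS not found (non-Linux OS?)")
      }
    }
  }

The point is that RSS captures everything the OS has resident for the
process, which is exactly the number the kill decisions are based on.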

I'd really welcome review and feedback on this design proposal. I think
this could be a helpful feature for Spark users who are trying to
triage memory usage issues. In the future I'd like to look into
reporting memory usage from third-party libraries like Netty, as was
originally proposed in SPARK-9103.

Cheers,
--José
