Hi there,
I am currently using a Spark cluster to run jobs, and I need to collect the
history of the actual memory usage (that is, execution memory + storage memory)
of a job across the whole cluster. I know we can get the storage memory usage
through either the Spark UI Executors page or the
SparkContext.getExecutorMemoryStatus() API, but I could not find a way to get
the real-time execution memory usage.
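For reference, here is a minimal sketch of how I am reading the storage-side
numbers today (assuming an active SparkContext named `sc`, e.g. in
spark-shell); note it only covers storage memory, not execution memory:

    // getExecutorMemoryStatus returns, per executor, the max memory
    // available for caching and the remaining memory available for
    // caching, both in bytes. Execution memory is not included.
    sc.getExecutorMemoryStatus.foreach { case (executor, (maxMem, remainingMem)) =>
      println(s"$executor: storage used ${maxMem - remainingMem} of $maxMem bytes")
    }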
Is there any way I can collect the total memory usage? Thank you so much!
Best,
Jialin Liu