for different nodes). What does the second number signify (i.e., 8.6 GB and 95.5 GB)? If 17.3 KB was used out of the total memory of the node, should it not be 17.3 KB / 16 GB?
Thanks
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Memory-statistics-in-the-Application-detail-UI-tp13082.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.incubator.apache.org
Sent: Thursday, August 28, 2014 6:32:32 PM
Subject: Memory statistics in the Application detail UI
Hi,
I am using a cluster where each node has 16 GB (this is the executor memory).
After I complete an MLlib job, the Executors tab shows the following:
Memory: 142.6 KB
Click the Storage tab. You have some (tiny) RDD persisted in memory.
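As a rough check of where the 8.6 GB figure may come from (assuming Spark 1.x defaults of spark.storage.memoryFraction = 0.6 and spark.storage.safetyFraction = 0.9, which is a guess at this cluster's configuration): the second number in the Memory Used column is the memory available for RDD storage per executor, not the full executor heap.

```python
# Sketch: storage memory available per executor, under assumed Spark 1.x defaults.
#   storage memory = executor memory * spark.storage.memoryFraction * spark.storage.safetyFraction
executor_memory_gb = 16  # per-node executor memory, from the original question
memory_fraction = 0.6    # assumed default spark.storage.memoryFraction (Spark 1.x)
safety_fraction = 0.9    # assumed default spark.storage.safetyFraction (Spark 1.x)

storage_memory_gb = executor_memory_gb * memory_fraction * safety_fraction
print(round(storage_memory_gb, 2))  # 8.64 -- shown as roughly 8.6 GB in the UI
```

If that assumption holds, it would also explain why the per-executor figures sum to about 95.5 GB across the cluster rather than to the total heap.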
On Fri, Aug 29, 2014 at 5:58 AM, SK skrishna...@gmail.com wrote:
Hi,
Thanks for the responses. I understand that the second values in the Memory Used column for the executors add up to 95.5 GB and the first values add up to