I am using Spark 1.2, and I see a lot of messages like:

ExternalSorter: Thread 66 spilling in-memory map of 5.0 MB to disk (13160
times so far)

I seem to have a lot of memory:

URL: spark://hadoop-m:7077
Workers: 4
Cores: 64 Total, 64 Used
Memory: 328.0 GB Total, 327.0 GB Used
_____________________

Executors (4)
Memory: 3.5 GB Used (114.8 GB Total)

Executor ID   Address      RDD Blocks   Memory Used
0             hadoop-w-1   176          2.8 GB / 28.8 GB
1             hadoop-w-0   42           680.9 MB / 28.8 GB
2             hadoop-w-2   0            0.0 B / 28.8 GB
&lt;driver&gt;      hadoop-w-3   0            0.0 B / 28.4 GB

Also, I am not sure why hadoop-w-2 is in the "Loading" state.
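In case it is relevant: my understanding is that in Spark 1.2 the size of the in-memory map before ExternalSorter spills is bounded by spark.shuffle.memoryFraction (default 0.2 of the executor heap), not by total cluster memory, so I have been considering raising it along these lines (values are illustrative, and my_job.py is a placeholder for my application — please correct me if this is the wrong knob):

```shell
# Illustrative only: give shuffle aggregation a larger share of the
# executor heap before ExternalSorter spills to disk (Spark 1.2 settings).
spark-submit \
  --master spark://hadoop-m:7077 \
  --conf spark.executor.memory=28g \
  --conf spark.shuffle.memoryFraction=0.4 \
  --conf spark.storage.memoryFraction=0.4 \
  my_job.py
```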

Thanks,

Ami

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/ExternalSorter-spilling-in-memory-map-tp21125.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
