ENV:
Spark:0.9.0-incubating
Hadoop:2.3.0
I am running a Spark task on YARN, and I see the following log in the NodeManager:
2014-09-25 17:43:34,141 INFO
org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl:
Memory usage of ProcessTree 549 for container-id
container_1411635522254_0001_01_000005: *4.5 GB of 5 GB* physical memory
used; 5.0 GB of 10.5 GB virtual memory used

(the same line repeats about every 3 seconds)
My task parameters are:
--num-workers 4 --master-memory 2g --worker-memory 4g --worker-cores 4
In my opinion, "--worker-memory 4g" means 4 GB should be the maximum memory
for a container.
But why does the log say "4.5 GB of 5 GB physical memory used"?
And where is the 5 GB maximum container memory configured?
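My guess (unconfirmed, hence this question) is that Spark's YARN client adds a
per-executor memory overhead on top of --worker-memory, and that YARN then
rounds the request up to a multiple of yarn.scheduler.minimum-allocation-mb.
The overhead value (384 MB) and the minimum-allocation value (1024 MB) below
are assumptions on my part, but they would reproduce the 5 GB figure:

```python
# Hypothetical reconstruction of the 5 GB container size (assumed values):
worker_memory_mb = 4 * 1024   # --worker-memory 4g
memory_overhead_mb = 384      # assumed Spark-on-YARN per-executor overhead
min_allocation_mb = 1024      # assumed yarn.scheduler.minimum-allocation-mb

requested = worker_memory_mb + memory_overhead_mb  # 4480 MB

# YARN (by assumption) rounds the request up to the next multiple
# of the scheduler's minimum allocation:
granted = -(-requested // min_allocation_mb) * min_allocation_mb
print(granted)  # 5120 MB = 5 GB, matching the NodeManager log
```

If this is right, the 5 GB limit would come from the overhead plus the
scheduler's allocation increment rather than from a single "container max
memory" setting. Can anyone confirm?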
--
WangXiaoyu