Al M created SPARK-5768:
---------------------------

             Summary: Spark UI shows incorrect memory under YARN
                 Key: SPARK-5768
                 URL: https://issues.apache.org/jira/browse/SPARK-5768
             Project: Spark
          Issue Type: Bug
          Components: YARN
    Affects Versions: 1.2.1, 1.2.0
         Environment: CentOS 6
            Reporter: Al M
            Priority: Trivial


I am running Spark on YARN with 2 executors.  The executors run on separate 
physical machines.

I have spark.executor.memory set to '40g' because I want 40g of memory used 
on each machine; there is one executor per machine.
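For reference, a minimal sketch of the configuration described above; the file-based form is illustrative, and the assumption that the executor count was set via spark.executor.instances (rather than the equivalent --num-executors flag to spark-submit) is mine:

```
# spark-defaults.conf (illustrative)
# One executor per physical machine, 40g heap each
spark.executor.memory     40g
spark.executor.instances  2
```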

When I run my application, 'top' shows both executors using the full 40g of 
memory I allocated to them.

The 'Executors' tab in the Spark UI shows something different: it reports 
memory used against a total of roughly 20GB per executor, e.g. x / 20.3GB.  
This makes it look like only 20GB is available per executor when 40GB is 
actually available.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
