Re: reading info from spark 2.0 application UI

2016-10-24 Thread Sean Owen
What matters in this case is how many vcores YARN thinks it can allocate per machine. I think the relevant setting is yarn.nodemanager.resource.cpu-vcores. I bet you'll find this is actually more than the machine's number of cores, possibly on purpose, to enable some over-committing.
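
For reference, that per-node limit is set in yarn-site.xml on each NodeManager. A minimal, illustrative fragment (the value 80 is only an assumed example, showing a node advertising more vcores than it has physical cores):

    <property>
      <name>yarn.nodemanager.resource.cpu-vcores</name>
      <!-- vcores advertised to the ResourceManager; may deliberately exceed physical cores -->
      <value>80</value>
    </property>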

Re: reading info from spark 2.0 application UI

2016-10-24 Thread Sean Owen
If you're really sure that 4 executors are on 1 machine, then it means your resource manager allowed it. What are you using, YARN? Check that you really are limited to 40 cores per machine in the YARN config.
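
As a sketch of the arithmetic (the numbers below are assumed, not taken from the original report): with a submit line like

    spark-submit --master yarn \
      --num-executors 8 \
      --executor-cores 10 \
      --executor-memory 8g \
      ...

YARN only needs 4 x 10 = 40 free vcores (plus the matching memory) on a single node to place 4 of those executors there. So if a NodeManager advertises 40 or more vcores, four executors landing on the same machine is exactly what the scheduler is allowed to do.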