Thanks Sandy, that is very helpful!


bit1...@163.com
 
From: Sandy Ryza
Date: 2015-04-29 15:24
To: bit1...@163.com
CC: user
Subject: Re: Question about Memory Used and VCores Used
Hi,

Good question.  The extra memory comes from spark.yarn.executor.memoryOverhead, 
the space used for the application master, and the way YARN rounds requests 
up.  This post explains it in a little more detail:
http://blog.cloudera.com/blog/2015/03/how-to-tune-your-apache-spark-jobs-part-2/
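To make the rounding concrete, here is a small sketch of the container-size arithmetic. It assumes the Spark 1.x default for spark.yarn.executor.memoryOverhead (max of 384 MB and 7% of executor memory) and YARN's default yarn.scheduler.minimum-allocation-mb of 1024 MB; your cluster's configured values may differ, and the application master adds one more container on top.

```python
import math

def yarn_container_mb(requested_mb, min_alloc_mb=1024):
    """YARN rounds each container request up to a multiple of
    yarn.scheduler.minimum-allocation-mb (assumed default: 1024 MB)."""
    return math.ceil(requested_mb / min_alloc_mb) * min_alloc_mb

def executor_container_mb(executor_mem_mb, overhead_mb=None):
    """Memory YARN actually reserves for one executor container.

    spark.yarn.executor.memoryOverhead defaults (in Spark 1.x) to
    max(384 MB, 7% of executor memory) -- an assumption here, check
    your Spark version's docs.
    """
    if overhead_mb is None:
        overhead_mb = max(384, int(0.07 * executor_mem_mb))
    return yarn_container_mb(executor_mem_mb + overhead_mb)

# The 3 GB executors from the original command:
# 3072 MB + 384 MB overhead = 3456 MB, rounded up to 4096 MB per container.
per_executor = executor_container_mb(3 * 1024)
print(per_executor, "MB per executor container")
print(3 * per_executor, "MB for 3 executors, plus the AM container")
```

So each "3g" executor actually consumes a 4 GB container, and three of them plus the application master's container account for the gap between the 9 GB you expected and the figure the UI reports.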

-Sandy

On Tue, Apr 28, 2015 at 7:12 PM, bit1...@163.com <bit1...@163.com> wrote:
Hi guys,
I ran the following computation with 3 workers:
spark-sql --master yarn --executor-memory 3g --executor-cores 2 --driver-memory 
1g -e 'select count(*) from table'

The resources used are shown below on the UI:
I don't understand why the memory used is 15GB and the vcores used is 5. I think 
the memory used should be executor-memory * numOfWorkers = 3G * 3 = 9G, and the 
vcores used should be executor-cores * numOfWorkers = 6.

Can you please explain the result? Thanks.





bit1...@163.com

[Attachment: Catch.jpg (16K)]
