Hi,

Can these points be clarified, please?


   1. Can a YARN container use more than one core, and is this
   configurable?
   2. A YARN container is constrained to 8GB by
   "yarn.scheduler.maximum-allocation-mb". If a YARN container runs a Spark
   process, does that limit also include the memory Spark is going to use?
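
For reference, this is a minimal sketch of the yarn-site.xml settings I believe govern both points; the values shown are the stock Hadoop defaults, not taken from any particular cluster:

```xml
<!-- Sketch of the relevant YARN scheduler limits (stock defaults assumed) -->
<configuration>
  <!-- Maximum memory (in MB) a single container may request; default 8192 MB = 8 GB -->
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>8192</value>
  </property>
  <!-- Maximum virtual cores a single container may request; default 4 -->
  <property>
    <name>yarn.scheduler.maximum-allocation-vcores</name>
    <value>4</value>
  </property>
</configuration>
```

My understanding (please correct me if wrong) is that a Spark executor on YARN requests spark.executor.memory plus a memory overhead, and that total request must fit under the maximum-allocation-mb cap.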

Thanks,

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com
