If this is not the expected behavior then it should be logged as an issue. On Tue, Jan 3, 2017 at 2:51 PM, Nirav Patel <npa...@xactlycorp.com> wrote:
> When enabling dynamic scheduling I see that all executors are using only 1
> core even if I specify "spark.executor.cores" to 6. If dynamic scheduling
> is disabled then each executor has 6 cores. I have tested this against
> Spark 1.5. I wonder if this is the same behavior with 2.x as well.
>
> Thanks
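
For reference, a minimal Scala sketch of the configuration being described (the app name and executor counts are illustrative, not from the original report): dynamic allocation enabled together with spark.executor.cores=6, which is the combination reported to leave each executor with only 1 core on Spark 1.5.

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical reproduction setup; values chosen only to illustrate the reported scenario.
val conf = new SparkConf()
  .setAppName("dynamic-allocation-cores-test")
  .set("spark.dynamicAllocation.enabled", "true")
  .set("spark.shuffle.service.enabled", "true")   // external shuffle service is required with dynamic allocation
  .set("spark.executor.cores", "6")               // reportedly ignored: executors come up with 1 core each
  .set("spark.dynamicAllocation.minExecutors", "1")
  .set("spark.dynamicAllocation.maxExecutors", "10")

val sc = new SparkContext(conf)
// With dynamic allocation disabled, the same spark.executor.cores setting
// reportedly gives each executor 6 cores as expected.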