Please check the document added by Andrew. I could run tasks with Spark 1.2.0.
* https://github.com/apache/spark/pull/3731/files#diff-c3cbe4cabe90562520f22d2306aa9116R86
* https://github.com/apache/spark/pull/3757/files#diff-c3cbe4cabe90562520f22d2306aa9116R101
Thanks,
- Tsuyoshi
On Sun, Jan 4
Hi Anders,
I faced the same issue you mentioned. Yes, you need to install the
Spark shuffle plugin for YARN. Please check the following PRs, which
add documentation on enabling dynamicAllocation:
https://github.com/apache/spark/pull/3731
https://github.com/apache/spark/pull/3757
I could run Spark on YARN with dyn
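(For reference, the Spark-side settings those PRs document look roughly like the following in spark-defaults.conf; the min/max executor bounds here are illustrative values I chose, not numbers from the PRs:)

```properties
# Dynamic allocation requires the external shuffle service on YARN,
# so both flags must be enabled together.
spark.dynamicAllocation.enabled        true
spark.shuffle.service.enabled          true
# Illustrative executor bounds -- tune these for your cluster.
spark.dynamicAllocation.minExecutors   1
spark.dynamicAllocation.maxExecutors   10
```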
Hi,
In addition to the options Sameer mentioned, we need to enable the
external shuffle manager, right?
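As far as I understand from those PRs, the YARN-side part is registering Spark's shuffle service as a NodeManager auxiliary service, roughly like this (assuming the spark-<version>-yarn-shuffle.jar is already on each NodeManager's classpath; the exact steps are in the PRs above):

```xml
<!-- yarn-site.xml on each NodeManager -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```

NodeManagers need a restart after this change so the auxiliary service is picked up.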
Thanks,
- Tsuyoshi
On Sat, Dec 13, 2014 at 5:27 AM, Sameer Farooqui wrote:
> Hi,
>
> FYI - There are no Worker JVMs used when Spark is launched under YARN.
> Instead the NodeManager in YARN does