>> On Mon, Oct 5, 2015 at 11:06 AM, Andreas Fritzler <
>> andreas.fritz...@gmail.com> wrote:
Hi Steve, Alex,

How do you handle the distribution and configuration of the spark-*-yarn-shuffle.jar on your NodeManagers if you want to use two different Spark versions?

Regards,
Andreas
On Mon, Oct 5, 2015 at 4:54 PM, Steve Loughran
wrote:
>
> > On 5 Oct 2015, at 16:48, Alex Rovner wrote:
> >
Hi,

I was just wondering if it is possible to register multiple versions of the aux-services with YARN, as described in the documentation:

1. In the yarn-site.xml on each node, add spark_shuffle to
yarn.nodemanager.aux-services, then set
yarn.nodemanager.aux-services.spark_shuffle.class to
org.apache.spark.network.yarn.YarnShuffleService.
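For reference, registering two shuffle services side by side would mean giving each aux-service a distinct name. A hypothetical yarn-site.xml sketch (the service names spark_shuffle_1/spark_shuffle_2 are made up for illustration, not from the thread or the Spark docs):

```xml
<!-- yarn-site.xml sketch; service names are illustrative placeholders -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle_1,spark_shuffle_2</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle_1.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle_2.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```

Note that both entries would point at the same class name, loaded from the NodeManager's single classpath, which is part of why running two Spark versions' shuffle services side by side is awkward in practice.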
> --name "My app name"
> --jars lib1.jar,lib2.jar
> --deploy-mode cluster
> app.jar
>
> Both YARN and standalone modes support client and cluster modes, and the
> spark-submit script is the common interface through which you can launch
> your application. In ot
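The fragment quoted above appears to be the spark-submit example from the Spark documentation with its leading lines cut off; a complete invocation of that shape might look like the following sketch (the main class, master URL, and jar names are illustrative placeholders):

```shell
# Sketch of a full spark-submit invocation in YARN cluster mode.
# com.example.MainClass, lib1.jar, lib2.jar, and app.jar are placeholders.
./bin/spark-submit \
  --class com.example.MainClass \
  --master yarn \
  --deploy-mode cluster \
  --name "My app name" \
  --jars lib1.jar,lib2.jar \
  app.jar
```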
Hi all,

When running the Spark cluster in standalone mode, I am able to create the Spark context from Java via the following code snippet:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf()
    .setAppName("MySparkApp")
    .setMaster("spark://SPARK_MASTER:7077")
    .setJars(jars);
JavaSparkContext sc = new JavaSparkContext(conf);