Spark only needs to be present on the machine that launches the job via
spark-submit.
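
A rough sketch of the approach described in the thread below. The paths, archive names, and application class here are hypothetical placeholders I'm assuming for illustration, not from the original thread:

```shell
# Launch with a Spark 2.0.0 build unpacked only on the client machine;
# spark.yarn.archive (Spark >= 2.0) points YARN at the matching Spark jars,
# so the cluster's pre-installed Spark version is not used.
$SPARK_HOME/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.archive=hdfs:///user/spark/spark-2.0.0-jars.zip \
  --class com.example.MyApp \
  myapp.jar

# For Spark 1.x, use spark.yarn.jar with the assembly jar instead:
#   --conf spark.yarn.jar=hdfs:///user/spark/spark-assembly-1.6.1.jar
```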

On Sat, Dec 17, 2016 at 3:59 PM, Jorge Machado <jom...@me.com> wrote:

> Hi Tiago,
>
> Thanks for the update. Last question: does the spark-submit version you
> are using need to match the Spark version on all YARN hosts?
> Regards
>
> Jorge Machado
>
> On 17 Dec 2016, at 16:46, Tiago Albineli Motta <timo...@gmail.com> wrote:
>
> Hi Jorge,
>
> Here we are using an Apache Hadoop installation, and to run multiple
> versions we just need to launch the submit from the client with the
> Spark version you need:
>
> $SPARK_HOME/bin/spark-submit
>
> and pass the correct Spark libs in the conf.
>
> For spark 2.0.0
>
> --conf spark.yarn.archive=
>
> For previous versions
>
> --conf spark.yarn.jar=
>
>
>
> Tiago
>
>
>
> Tiago Albineli Motta
> Desenvolvedor de Software - Globo.com
> ICQ: 32107100
> http://programandosemcafeina.blogspot.com
>
> On Sat, Dec 17, 2016 at 5:36 AM, Jorge Machado <jom...@me.com> wrote:
>
>> Hi Everyone,
>>
>> I have one question: is it possible, on an HDP cluster that ships Spark
>> 1.6.1, to also run Spark 2.0.0 on it? For example, by passing the Spark
>> libs with --jars? The idea behind it is to avoid depending on the default
>> HDP installation and to be able to deploy new versions of Spark quickly.
>>
>> Thx
>>
>> Jorge Machado
>>
>
>
