Hi Jorge,

Here we are using a plain Apache Hadoop installation, and to run multiple
versions we just change the submit command on the client to point at the
Spark version you need:

$SPARK_HOME/bin/spark-submit

and pass the correct Spark libs in the conf.

For Spark 2.0.0:

--conf spark.yarn.archive=

For earlier versions:

--conf spark.yarn.jar=
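Putting it together, a submit for a second Spark version could look like the
sketch below. This is only an illustration: the HDFS path, the archive name,
and the example class/jar are hypothetical placeholders, not paths from our
setup.

```shell
# Sketch: run a Spark 2.0.0 job on YARN from a client-side install,
# without touching the cluster's default Spark.
# Assumes you uploaded the Spark 2.0.0 jars as an archive to HDFS
# (path below is hypothetical).
export SPARK_HOME=/opt/spark-2.0.0

$SPARK_HOME/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.archive=hdfs:///apps/spark/spark-2.0.0-jars.zip \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_2.11-2.0.0.jar 100
```

For pre-2.0 versions you would point `spark.yarn.jar` at the single Spark
assembly jar on HDFS instead of an archive.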



Tiago



Tiago Albineli Motta
Desenvolvedor de Software - Globo.com
ICQ: 32107100
http://programandosemcafeina.blogspot.com

On Sat, Dec 17, 2016 at 5:36 AM, Jorge Machado <jom...@me.com> wrote:

> Hi Everyone,
>
> I have one question: is it possible, on an HDP cluster that ships Spark
> 1.6.1, to run Spark 2.0.0 on top of it? For example by passing the Spark
> libs with --jars? The idea behind it is to avoid depending on the default
> HDP installation and be able to deploy new versions of Spark quickly.
>
> Thx
>
> Jorge Machado
