> correct. note that you only need to install spark on the node you launch it
> from. spark doesn't need to be installed on the cluster itself.

That sounds reasonably doable to me. My guess is I will have some
trouble making that Spark version work with both Hive and HDFS as installed
on the cluster - or maybe it turns out to be plug-and-play, I don't know.
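To make that concrete, here is roughly what I picture on the launch node: a
plain Spark 2.4 download, with HADOOP_CONF_DIR (and hive-site.xml) pointing at
the existing cluster configuration, and the application opening a Hive-enabled
session. A minimal sketch in Scala - the metastore version setting and the
query are my own assumptions, not something confirmed in this thread:

    import org.apache.spark.sql.SparkSession

    object Spark24OnYarnSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("spark-2.4-app-on-spark-2.3-yarn")
          // assumption: the cluster metastore is compatible with the 1.2.1
          // Hive client bundled with Spark 2.4; adjust to the real version
          .config("spark.sql.hive.metastore.version", "1.2.1")
          .config("spark.sql.hive.metastore.jars", "builtin")
          .enableHiveSupport()
          .getOrCreate()

        // HDFS and Hive access go through the cluster services picked up
        // from HADOOP_CONF_DIR on the launch node
        spark.sql("SHOW DATABASES").show()
        spark.stop()
      }
    }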

thanks
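
PS: for the build part of my original question, the way I picture it is to
compile against the Spark 2.4 artifacts but mark them as provided, so the jar
does not bundle Spark and simply runs with whatever the launching spark-submit
ships. A rough build.sbt sketch - the project name and exact version numbers
are placeholders of mine:

    name := "my-spark24-app"    // placeholder project name
    scalaVersion := "2.11.12"   // Spark 2.4 prebuilt distributions use Scala 2.11

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-sql"  % "2.4.3" % "provided",
      "org.apache.spark" %% "spark-hive" % "2.4.3" % "provided"
    )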

On Mon, May 20, 2019 at 02:16:43PM -0400, Koert Kuipers wrote:
> correct. note that you only need to install spark on the node you launch it
> from. spark doesn't need to be installed on the cluster itself.
> 
> the shared components between spark jobs on yarn are only really the
> spark-shuffle-service in yarn and the spark-history-server. i have found
> compatibility for these to be good. it's best if these run the latest version.
> 
> On Mon, May 20, 2019 at 2:02 PM Nicolas Paris <nicolas.pa...@riseup.net> 
> wrote:
> 
>     > you will need the spark version you intend to launch with on the
>     > machine you launch from, and point to the correct spark-submit
> 
>     does this mean installing a second spark version (2.4) on the cluster?
> 
>     thanks
> 
>     On Mon, May 20, 2019 at 01:58:11PM -0400, Koert Kuipers wrote:
>     > yarn can happily run multiple spark versions side-by-side
>     > you will need the spark version you intend to launch with on the
>     > machine you launch from, and point to the correct spark-submit
>     >
>     > On Mon, May 20, 2019 at 1:50 PM Nicolas Paris <nicolas.pa...@riseup.net>
>     > wrote:
>     >
>     >     Hi
>     >
>     >     I am wondering whether it's feasible to:
>     >     - build a spark application (with sbt/maven) based on spark2.4
>     >     - deploy that jar on yarn on a spark2.3 based installation
>     >
>     >     thanks in advance,
>     >
>     >
>     >     --
>     >     nicolas
>     >
> 
>     --
>     nicolas
> 
> 
> 

-- 
nicolas

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
