> You will need the Spark version you intend to launch with on the machine you
> launch from, and point to the correct spark-submit

Does this mean installing a second Spark version (2.4) on the cluster, or only on the machine I launch from, roughly as in the sketch below?
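
To make sure I understand, here is roughly what I have in mind for the launch/edge node; the paths, the 2.4.x tarball name, and the application class are only placeholders for the example:

    # unpack a second Spark only on the machine I launch from, not on the cluster
    tar -xzf spark-2.4.3-bin-hadoop2.7.tgz -C ~/
    export SPARK_HOME=~/spark-2.4.3-bin-hadoop2.7
    export HADOOP_CONF_DIR=/etc/hadoop/conf   # the existing cluster's Hadoop/YARN config

    # launch through the 2.4 spark-submit; with spark.yarn.jars/spark.yarn.archive
    # left unset, spark-submit should upload its own 2.4 jars along with the app
    $SPARK_HOME/bin/spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --class com.example.MyApp \
      my-app-assembly.jar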

thanks

On Mon, May 20, 2019 at 01:58:11PM -0400, Koert Kuipers wrote:
> YARN can happily run multiple Spark versions side by side.
> You will need the Spark version you intend to launch with on the machine you
> launch from, and point to the correct spark-submit
> 
> On Mon, May 20, 2019 at 1:50 PM Nicolas Paris <nicolas.pa...@riseup.net> 
> wrote:
> 
>     Hi
> 
>     I am wondering whether it is feasible to:
>     - build a Spark application (with sbt/maven) based on Spark 2.4
>     - deploy that jar on YARN on a Spark 2.3-based installation
> 
>     thanks in advance,
> 
> 
>     --
>     nicolas
> 
>     ---------------------------------------------------------------------
>     To unsubscribe e-mail: user-unsubscr...@spark.apache.org
> 
> 

-- 
nicolas

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
