Cc: user@spark.apache.org
Subject: Re: [Spark] Does Spark support backward and forward compatibility?
I think/hope that it goes without saying you can't mix Spark versions within a
cluster.
Forwards compatibility is something you don't generally expect as a default.
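One practical consequence of limited forwards compatibility is that a job built against a newer Spark should guard optional features by checking the cluster's runtime version string (available as `spark.version`). A minimal sketch in plain Scala, assuming simple "major.minor.patch" version strings; the helper name and the version thresholds are illustrative, not part of any Spark API:

```scala
object VersionGate {
  // Parse the major and minor components from a version string such as "3.1.2".
  def majorMinor(version: String): (Int, Int) = {
    val parts = version.split("\\.")
    (parts(0).toInt, parts(1).toInt)
  }

  // True when the running cluster is at least the given major.minor release,
  // so a feature introduced in that release can safely be used.
  def atLeast(running: String, major: Int, minor: Int): Boolean = {
    val (maj, min) = majorMinor(running)
    maj > major || (maj == major && min >= minor)
  }
}
```

For example, `VersionGate.atLeast("3.1.2", 3, 2)` is false, so a code path relying on a 3.2-only feature would be skipped on that cluster.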
From: Sean Owen <sro...@gmail.com>
Sent: Wednesday, November 24, 2021 10:48 PM
To: Amin Borjian <borjianami...@outlook.com>
Cc: user@spark.apache.org
Subject: Re: [Spark] Does Spark support backward and forward compatibility?
> 3.1.x? Do you mean the client can work with a newer cluster version because
> it uses just the old features of the servers? (Maybe you mean this, and in
> fact my previous sentence was wrong and I misunderstood.)
>
> From: Sean Owen
> Sent: Wednesday, November 24, 2021 5:38 PM
> To: Amin Borjian
From: Sean Owen <sro...@gmail.com>
Sent: Wednesday, November 24, 2021 5:38 PM
To: Amin Borjian <borjianami...@outlook.com>
Cc: user@spark.apache.org
Subject: Re: [Spark] Does Spark support backward and forward compatibility?
Can you mix different Spark versions on driver and executor? No.
Can you compile against a different version of Spark than you run on? That
typically works within a major release, though forwards compatibility may
not work (you can't use a feature that doesn't exist in the version on the
cluster).
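The usual way to compile against one Spark version and run on another (within the same major release) is to mark Spark as a "provided" dependency, so the build compiles against your chosen version but the cluster's own jars are used at runtime. A hedged sbt sketch; the Scala and Spark version numbers here are only illustrative:

```scala
// build.sbt (fragment)
scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  // Compile against 3.2.0; the "provided" configuration keeps Spark out of
  // the assembly jar, so whatever version the cluster ships runs the job.
  "org.apache.spark" %% "spark-core" % "3.2.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "3.2.0" % "provided"
)
```

If the job then calls an API that only exists in 3.2.0, it will fail at runtime on a 3.1.x cluster, which is exactly the forwards-compatibility limit described above.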