Re: [Spark] Does Spark support backward and forward compatibility?

2021-11-24 Thread Lalwani, Jayesh
I think/hope that it goes without saying you can't mix Spark versions within a cluster. Forwards compatibility is something you don't generally expect as a default.
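Since forwards compatibility is not something to rely on, one defensive pattern (a sketch of my own, not taken from the thread; the version-check logic and threshold are assumptions) is for a job to read the cluster's Spark version at startup and fail fast if it is older than the release the job was built against:

    import org.apache.spark.sql.SparkSession

    object VersionGuard {
      // Assumed: this job was compiled and tested against Spark 3.1.x.
      val requiredMajor = 3
      val requiredMinor = 1

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("version-guard").getOrCreate()
        // spark.version returns the runtime version string of the cluster, e.g. "3.0.3".
        val Array(maj, min) = spark.version.split('.').take(2).map(_.toInt)
        require(maj > requiredMajor || (maj == requiredMajor && min >= requiredMinor),
          s"Cluster runs Spark ${spark.version}; this job needs $requiredMajor.$requiredMinor or newer")
        // ... actual job logic would go here ...
        spark.stop()
      }
    }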

RE: [Spark] Does Spark support backward and forward compatibility?

2021-11-24 Thread Amin Borjian

Re: [Spark] Does Spark support backward and forward compatibility?

2021-11-24 Thread Martin Wunderlich

Re: [Spark] Does Spark support backward and forward compatibility?

2021-11-24 Thread Sean Owen
> 3.1.x? Can the client work with a newer cluster version, because it only uses old features of the server? (Maybe you mean this, and in fact my previous sentence was wrong and I misunderstood.)

RE: [Spark] Does Spark support backward and forward compatibility?

2021-11-24 Thread Amin Borjian
Can you mix different Spark versions on driver and executor?

Re: [Spark] Does Spark support backward and forward compatibility?

2021-11-24 Thread Sean Owen
Can you mix different Spark versions on driver and executor? No. Can you compile against a different version of Spark than you run on? That typically works within a major release, though forwards compatibility may not work (you can't use a feature that doesn't exist in the version on the cluster).
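To illustrate the "compile against a different version than you run on" point, here is a minimal sbt sketch (the version numbers and project name are assumptions, not from the thread): Spark is marked as "provided", so the application is compiled against 3.1.2 while the jars actually used at runtime are whatever compatible 3.x release the cluster ships.

    // build.sbt -- illustrative sketch only
    name := "compat-example"
    scalaVersion := "2.12.15"

    libraryDependencies ++= Seq(
      // Compiled against 3.1.2, but not bundled into the application jar:
      // the cluster's own Spark jars are used when the job runs.
      "org.apache.spark" %% "spark-core" % "3.1.2" % "provided",
      "org.apache.spark" %% "spark-sql"  % "3.1.2" % "provided"
    )

Within a major release this typically works, but as noted above it is not forwards compatible: if the code calls an API added after the cluster's version, the job fails at runtime (typically with a NoSuchMethodError or ClassNotFoundException) even though it compiled cleanly.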