to
a newer version of Spark is moot. You never do that.
From: Amin Borjian
Date: Wednesday, November 24, 2021 at 2:44 PM
To: Sean Owen
Cc: "user@spark.apache.org"
Subject: RE: [EXTERNAL] [Spark] Does Spark support backward and forward compatibility?
From: Sean Owen<mailto:sro...@gmail.com>
Sent: Wednesday, November 24, 2021 10:48 PM
To: Amin Borjian<mailto:borjianami...@outlook.com>
Cc: user@spark.apache.org<mailto:user@spark.apache.org>
Subject: Re: [Spark] Does Spark support backward and forward compatibility?
I think
client can work with a newer cluster version because
> it uses only old features of the servers? (Maybe you mean this, and in fact my
> previous sentence was wrong and I misunderstood.)
From: Sean Owen<mailto:sro...@gmail.com>
Sent: Wednesday, November 24, 2021 5:38 PM
To: Amin Borjian<mailto:borjianami...@outlook.com>
Cc: user@spark.apache.org<mailto:user@spark.apache.org>
Subject: Re: [Spark] Does Spark support backward and forward compatibility?
Can you mix different Spark versions on driver and executor? no.
Can you compile against a different version of Spark than you run on? That
typically works within a major release, though forwards compatibility may
not work (you can't use a feature that doesn't exist in the version on the
cluster).
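To make the "compile against one version, run on another" point concrete, here is a minimal build.sbt sketch (the version numbers are purely illustrative, not taken from this thread): marking the Spark dependency as "provided" means you compile against that API but use the cluster's own Spark jars at runtime, which is why the cluster's version caps which features you can safely use.

```
// build.sbt (sketch, hypothetical versions): compile against Spark 3.1.x,
// but defer to whatever Spark version the cluster supplies at runtime.
scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  // "provided" scope: this jar is on the compile classpath only.
  // At runtime the cluster's Spark jars are used, so the application
  // must avoid APIs that do not exist in the cluster's (possibly older)
  // version within the same major release line.
  "org.apache.spark" %% "spark-sql" % "3.1.2" % "provided"
)
```

The same idea applies with Maven's `<scope>provided</scope>`; in both cases forwards compatibility fails only when the application calls an API that the cluster's Spark version does not have.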
I have a simple question about using Spark. Most tools usually answer this kind of question explicitly (prominently, for example in a dedicated section or on a separate page), but I did not find it documented anywhere. Maybe my search was not thorough enough, but I thought it would be good to ask this question on the