At a glance, it doesn't seem so. That is a corner case in two ways - very
old dates and the use of RDDs - at least it seems.
I also suspect that individual change is tied to a lot of other date-related
changes in 3.2, so it may not be very back-portable.
You should pursue updating to 3.2 for many reasons, but also for this one if
it affects you.
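
For anyone checking whether they are affected: below is a rough sketch of the
kind of scenario described above - a DataFrame built from an RDD that contains
very old dates, joined with AQE enabled. It is only an illustration of the
shape of the corner case, not the actual reproduction from SPARK-37898:

  import java.sql.Date
  import org.apache.spark.sql.Row
  import org.apache.spark.sql.types._

  // Turn AQE on explicitly (it is off by default in 3.1.x)
  spark.conf.set("spark.sql.adaptive.enabled", "true")

  // DataFrame backed by an RDD, with dates far before the 1582 calendar
  // switch, which is where the 3.x date handling gets tricky
  val schema = StructType(Seq(
    StructField("id", IntegerType),
    StructField("d", DateType)))
  val rdd = spark.sparkContext.parallelize(Seq(
    Row(1, Date.valueOf("0001-01-01")),
    Row(2, Date.valueOf("1500-06-15"))))
  val oldDates = spark.createDataFrame(rdd, schema)

  // Join with another DataFrame so AQE has a chance to re-plan the query
  val other = spark.range(3).selectExpr("cast(id as int) as id")
  oldDates.join(other, "id").show()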

On Tue, Feb 1, 2022 at 1:50 AM Gaspar Muñoz <gmu...@datiobd.com> wrote:

> It looks like this commit (
> https://github.com/apache/spark/commit/a85490659f45410be3588c669248dc4f534d2a71)
> does the trick.
>
> Don't you think this bug is important enough to include in the 3.1 branch?
>
> Regards
>
> On Thu, Jan 20, 2022 at 8:55, Gaspar Muñoz (<gmu...@datiobd.com>)
> wrote:
>
>> Hi guys,
>>
>> Hundreds of Spark jobs run at my company every day. We are running Spark
>> 3.1.2 and we want to enable Adaptive Query Execution (AQE) for all of them.
>> We can't upgrade to 3.2 right now, so we want to enable it explicitly with
>> the appropriate conf at spark-submit time.
>>
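>> For reference, the conf in question is spark.sql.adaptive.enabled; a minimal
>> sketch of passing it at submit time (the rest of the arguments are whatever
>> the job already uses):
>>
>>   spark-submit \
>>     --conf spark.sql.adaptive.enabled=true \
>>     ...
>>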
>> Some of them fail when AQE is enabled, but I can't figure out what is
>> happening. To give you more information, I prepared a small snippet for
>> spark-shell that fails in Spark 3.1 when AQE is enabled and works when it
>> is disabled. It also works in 3.2, but I think it may be a bug that can be
>> fixed for 3.1.3.
>>
>> The code and explanation can be found here:
>> https://issues.apache.org/jira/browse/SPARK-37898
>>
>> Regards
>> --
>> Gaspar Muñoz Soria
>>
>
>
> --
> Gaspar Muñoz Soria
>
> Vía de las dos Castillas, 33, Ática 4, 3ª Planta
> 28224 Pozuelo de Alarcón, Madrid
> Tel: +34 91 828 6473
>
