Thank you, Sean and all.
One decision was made swiftly today.
I believe that we can move forward case-by-case for the others until the
feature freeze (3.0 branch cut).
Best,
Dongjoon.
On Mon, Jul 8, 2019 at 13:03 Marco Gaido wrote:
Hi Sean,
Thanks for bringing this up. Honestly, my opinion is that Spark should be
fully ANSI SQL compliant. Where ANSI SQL compliance is not an issue, I am
fine with following any other DB. IMHO, we won't get 100% compliance with
any DB anyway - postgres in this case (e.g. for decimal operations, we
See the particular issue / question at
https://github.com/apache/spark/pull/24872#issuecomment-509108532 and
the larger umbrella at
https://issues.apache.org/jira/browse/SPARK-27764 -- Dongjoon rightly
suggests this is a broader question.