I think reverting SPARK-30098 is the right decision here if we want to unblock
3.0. We shouldn't ship features that we know do not function the way we intend,
regardless of how little exposure most users have to them. Even if it's off by
default, we should work to avoid switches that cause things to behave
unpredictably or require a flow chart to determine what will happen.
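
For anyone who hasn't followed the JIRA closely, below is a rough spark-shell
sketch of the switch in question. The config name comes from this thread; the
exact create path taken in each case is precisely what's in dispute, so treat
this as my reading of the intended behavior rather than a spec. The table
names are just examples.

  // Assumes a Hive-enabled SparkSession named `spark`, as in spark-shell.

  // Legacy (pre-3.0) behavior: CREATE TABLE without USING is intended to
  // produce a Hive SerDe table.
  spark.sql("SET spark.sql.legacy.createHiveTableByDefault.enabled=true")
  spark.sql("CREATE TABLE legacy_tbl (id INT, name STRING)")

  // With the flag off, the same statement is intended to create a native
  // data source table using spark.sql.sources.default (parquet by default).
  spark.sql("SET spark.sql.legacy.createHiveTableByDefault.enabled=false")
  spark.sql("CREATE TABLE native_tbl (id INT, name STRING)")

  // DESCRIBE shows which create path was actually taken (see the Provider row).
  spark.sql("DESCRIBE TABLE EXTENDED legacy_tbl").show(truncate = false)
  spark.sql("DESCRIBE TABLE EXTENDED native_tbl").show(truncate = false)

Whether the second case holds consistently is the open question; the test
failures Ryan links below suggest it does not.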

On Mon, May 11, 2020 at 3:07 PM Ryan Blue <rb...@netflix.com.invalid> wrote:

> I'm all for fixing behavior in master by turning this off as an
> intermediate step, but I don't think that Spark 3.0 can safely include
> SPARK-30098.
>
> The problem is that SPARK-30098 introduces strange behavior, as Jungtaek
> pointed out. And that behavior is not fully understood. While working on a
> unified CREATE TABLE syntax, I hit additional test failures
> <https://github.com/apache/spark/pull/28026#issuecomment-606967363> where
> the wrong create path was being used.
>
> Unless we plan to NOT support the behavior
> when spark.sql.legacy.createHiveTableByDefault.enabled is disabled, we
> should not ship Spark 3.0 with SPARK-30098. Otherwise, we will have to deal
> with this problem for years to come.
>
> On Mon, May 11, 2020 at 1:06 AM JackyLee <qcsd2...@163.com> wrote:
>
>> +1. Agree with Xiao Li and Jungtaek Lim.
>>
>> This seems to be controversial and cannot be resolved in a short time, so it
>> makes sense to choose option 1 to unblock Spark 3.0 and support this in 3.1.
>>
>
> --
> Ryan Blue
> Software Engineer
> Netflix
>
