I'm all for fixing the behavior in master by turning this off as an
intermediate step, but I don't think Spark 3.0 can safely include
SPARK-30098.

The problem is that SPARK-30098 introduces strange behavior, as Jungtaek
pointed out, and that behavior is not fully understood. While working on a
unified CREATE TABLE syntax, I hit additional test failures
<https://github.com/apache/spark/pull/28026#issuecomment-606967363> where
the wrong create path was being used.

Unless we plan to NOT support the behavior
when spark.sql.legacy.createHiveTableByDefault.enabled is disabled, we
should not ship Spark 3.0 with SPARK-30098. Otherwise, we will have to deal
with this problem for years to come.
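For anyone following along, here is a minimal sketch of the two create
paths the flag selects between. This is a sketch only: the flag name is
the one introduced by SPARK-30098, and the exact resolution rules are
part of what is still under discussion.

```sql
-- Sketch, assuming the SPARK-30098 flag name.
SET spark.sql.legacy.createHiveTableByDefault.enabled=false;

-- With the flag disabled, a bare CREATE TABLE resolves to the native
-- data-source path instead of creating a Hive SerDe table:
CREATE TABLE t1 (id INT);                 -- data-source table (default provider)

-- Statements with an explicit provider are unaffected by the flag:
CREATE TABLE t2 (id INT) USING parquet;   -- data-source table
CREATE TABLE t3 (id INT) STORED AS orc;   -- Hive SerDe table
```

The ambiguity is exactly in the first case: which path a bare CREATE
TABLE takes depends on the flag, which is why shipping it half-finished
locks in behavior we'd have to keep supporting.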

On Mon, May 11, 2020 at 1:06 AM JackyLee <qcsd2...@163.com> wrote:

> +1. Agree with Xiao Li and Jungtaek Lim.
>
> This seems to be controversial and cannot be done in a short time. It is
> necessary to choose option 1 to unblock Spark 3.0 and support it in 3.1.
>

-- 
Ryan Blue
Software Engineer
Netflix