I'm sorry, but I have to agree with Ryan and Russell. I chose option 1
because it is the lesser evil compared to option 2, but that doesn't mean
I fully agree with option 1.

If we really go with option 1, let's make the points below clear; otherwise,
please consider reverting it.

* Have you fully identified "all" the paths where the second CREATE TABLE
syntax is taken?
* Could you explain the "why" to end users without any confusion? Do you
think end users will understand it easily?
* Are there actual end users you would guide to turn this on? Or do you have
a plan to turn this on for your team/customers and deal with the ambiguity?
* Could you please document how the behavior changes when the flag is
turned on?
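
To make the ambiguity concrete, here is a rough sketch of the two syntaxes
and the flag in question. The exact resolution rules are the point of
contention, so treat the comments as my assumptions, not a spec:

```sql
-- First syntax: Spark-native, with an explicit USING clause; this always
-- goes through Spark's own data source create path.
CREATE TABLE t1 (id INT) USING parquet;

-- Second syntax: no USING clause. Which create path this takes is what
-- spark.sql.legacy.createHiveTableByDefault.enabled controls (assumed:
-- enabled -> a Hive SerDe table, disabled -> a native data source table
-- using the session's default format).
CREATE TABLE t2 (id INT);
```

Documenting which of these paths fires for every syntactic variant is
exactly what I'm asking for above.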

I guess the intent of option 1 is to leave the flag "undocumented" and
forget about the code path it enables, but I think that would turn the
feature into a "broken window" that we are unable to touch.

On Tue, May 12, 2020 at 6:45 AM Russell Spitzer <russell.spit...@gmail.com>
wrote:

> I think reverting 30098 is the right decision here if we want to unblock
> 3.0. We shouldn't ship with features which we know do not function in the
> way we intend, regardless of how little exposure most users have to them.
> Even if it's off by default, we should probably work to avoid switches that
> cause things to behave unpredictably or require a flow chart to actually
> determine what will happen.
>
> On Mon, May 11, 2020 at 3:07 PM Ryan Blue <rb...@netflix.com.invalid>
> wrote:
>
>> I'm all for fixing behavior in master by turning this off as an
>> intermediate step, but I don't think that Spark 3.0 can safely include
>> SPARK-30098.
>>
>> The problem is that SPARK-30098 introduces strange behavior, as Jungtaek
>> pointed out. And that behavior is not fully understood. While working on a
>> unified CREATE TABLE syntax, I hit additional test failures
>> <https://github.com/apache/spark/pull/28026#issuecomment-606967363>
>> where the wrong create path was being used.
>>
>> Unless we plan to NOT support the behavior
>> when spark.sql.legacy.createHiveTableByDefault.enabled is disabled, we
>> should not ship Spark 3.0 with SPARK-30098. Otherwise, we will have to deal
>> with this problem for years to come.
>>
>> On Mon, May 11, 2020 at 1:06 AM JackyLee <qcsd2...@163.com> wrote:
>>
>>> +1. Agree with Xiao Li and Jungtaek Lim.
>>>
>>> This seems to be controversial, and cannot be done in a short time. It
>>> is
>>> necessary to choose option 1 to unblock Spark 3.0 and support it in 3.1.
>>>
>>>
>>>
>>> --
>>> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>>>
>>>
>>>
>>
>> --
>> Ryan Blue
>> Software Engineer
>> Netflix
>>
>
