@Mich Talebzadeh <mich.talebza...@gmail.com> thanks for sharing your
concern!

Note: Spark native data source tables are usually Hive-compatible as well,
unless they use features that Hive does not support (TIMESTAMP NTZ,
ANSI INTERVAL, etc.). I think the better default in this case is to create a
Spark native table, instead of creating a Hive table and failing.
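For illustration, a minimal Spark SQL sketch of the behavior being voted on (the table name `t` is hypothetical, and the default provider is assumed to be Parquet):

```sql
-- With the legacy flag off, CREATE TABLE without a USING clause
-- creates a Spark native data source table rather than a Hive SerDe table.
SET spark.sql.legacy.createHiveTableByDefault = false;

-- Hypothetical table: TIMESTAMP_NTZ is supported by native tables
-- but has no Hive-compatible representation, so a Hive table here would fail.
CREATE TABLE t (id INT, ts TIMESTAMP_NTZ);
```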

On Sat, Apr 27, 2024 at 12:46 PM Cheng Pan <pan3...@gmail.com> wrote:

> +1 (non-binding)
>
> Thanks,
> Cheng Pan
>
> On Sat, Apr 27, 2024 at 9:29 AM Holden Karau <holden.ka...@gmail.com>
> wrote:
> >
> > +1
> >
> > Twitter: https://twitter.com/holdenkarau
> > Books (Learning Spark, High Performance Spark, etc.):
> https://amzn.to/2MaRAG9
> > YouTube Live Streams: https://www.youtube.com/user/holdenkarau
> >
> >
> > On Fri, Apr 26, 2024 at 12:06 PM L. C. Hsieh <vii...@gmail.com> wrote:
> >>
> >> +1
> >>
> >> On Fri, Apr 26, 2024 at 10:01 AM Dongjoon Hyun <dongj...@apache.org>
> wrote:
> >> >
> >> > I'll start with my +1.
> >> >
> >> > Dongjoon.
> >> >
> >> > On 2024/04/26 16:45:51 Dongjoon Hyun wrote:
> >> > > Please vote on SPARK-46122 to set
> spark.sql.legacy.createHiveTableByDefault
> >> > > to `false` by default. The technical scope is defined in the
> following PR.
> >> > >
> >> > > - DISCUSSION:
> >> > > https://lists.apache.org/thread/ylk96fg4lvn6klxhj6t6yh42lyqb8wmd
> >> > > - JIRA: https://issues.apache.org/jira/browse/SPARK-46122
> >> > > - PR: https://github.com/apache/spark/pull/46207
> >> > >
> >> > > The vote is open until April 30th 1AM (PST) and passes
> >> > > if a majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >> > >
> >> > > [ ] +1 Set spark.sql.legacy.createHiveTableByDefault to false by
> default
> >> > > [ ] -1 Do not change spark.sql.legacy.createHiveTableByDefault
> because ...
> >> > >
> >> > > Thank you in advance.
> >> > >
> >> > > Dongjoon
> >> > >
> >> >
> >> > ---------------------------------------------------------------------
> >> > To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >> >
> >>
>
>
>
