-1 (non-binding)

I agree with Jungtaek. The change to create datasource tables instead of
Hive tables by default when CREATE TABLE has no USING or STORED AS clause
has created confusing behavior and should either be rolled back or fixed
before 3.0.
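
To illustrate the ambiguity, here is a minimal sketch (assuming a
SparkSession named `spark` built with Hive support; the table names are
placeholders, and which provider the first statement ends up with depends
on the changed default being discussed and on spark.sql.sources.default):

    // CREATE TABLE with no USING/STORED AS clause: previously a Hive table,
    // now a datasource table under the changed default.
    spark.sql("CREATE TABLE t_default (id INT, data STRING)")
    // Explicit clauses are unambiguous either way.
    spark.sql("CREATE TABLE t_ds (id INT, data STRING) USING parquet")
    spark.sql("CREATE TABLE t_hive (id INT, data STRING) STORED AS parquet")
    // Check which provider was actually used for the implicit case.
    spark.sql("DESCRIBE TABLE EXTENDED t_default").show(100, truncate = false)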

On Wed, Apr 1, 2020 at 5:12 AM Sean Owen <sro...@gmail.com> wrote:

> Those are not per se release blockers. They are (perhaps important)
> improvements to functionality. I don't know who is active and able to
> review that part of the code; I'd look for authors of changes in the
> surrounding code. The question here isn't so much what one would like
> to see in this release, but evaluating whether the release is sound
> and free of show-stopper problems. There will always be potentially
> important changes and fixes to come.
>
> On Wed, Apr 1, 2020 at 5:31 AM Dr. Kent Yao <yaooq...@qq.com> wrote:
> >
> > -1
> > Do not release this package: v3.0.0 is the third major release since we
> > added Spark on Kubernetes. Can we make it more production-ready, given
> > that it has been experimental for more than two years?
> >
> > The main practical adoption of Spark on Kubernetes is to take over the
> > role of other cluster managers (mainly YARN), while the storage layer
> > (mainly HDFS) is likely to be kept anyway. But Spark on Kubernetes with
> > HDFS does not seem to work properly.
> >
> > e.g.
> > These tickets and this PR were submitted 7 months ago and have never
> > been reviewed:
> > https://issues.apache.org/jira/browse/SPARK-29974
> > https://issues.apache.org/jira/browse/SPARK-28992
> > https://github.com/apache/spark/pull/25695
> >
> > And this.
> > https://issues.apache.org/jira/browse/SPARK-28896
> > https://github.com/apache/spark/pull/25609
> >
> > In terms of how often this module is updated, it seems to be stable.
> > But in terms of how often PRs for this module are reviewed, it seems that
> > it will stay experimental for a long time.
> >
> > Thanks.
> >
> >
> >
> > --
> > Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
> >
>
>

-- 
Ryan Blue
Software Engineer
Netflix
