I just retargeted SPARK-16011 to 2.1.

On Fri, Jul 15, 2016 at 10:43 AM, Shivaram Venkataraman <shiva...@eecs.berkeley.edu> wrote:

> Hashes, sigs match. I built and ran tests with Hadoop 2.3 ("-Pyarn
> -Phadoop-2.3 -Phive -Pkinesis-asl -Phive-thriftserver"). I couldn't
> get the following tests to pass, but I think it might be something
> specific to my setup, as Jenkins on branch-2.0 seems quite stable.
>
> [error] Failed tests:
> [error] org.apache.spark.sql.hive.client.VersionsSuite
> [error] org.apache.spark.sql.hive.HiveSparkSubmitSuite
> [error] Error during tests:
> [error] org.apache.spark.sql.hive.HiveExternalCatalogSuite
>
> Regarding the open issues, I agree with Sean that most of them seem
> minor to me and not worth blocking a release for. It would be good to
> get more details on SPARK-16011, though.
>
> As for the docs, ideally we should have them in place before the RC,
> but given that this is a recurring issue, I'm wondering whether a
> separate updatable link (like the 2.0.0-rc4-updated one that Reynold
> posted yesterday) could be used. The semantics would then be that the
> docs should be ready when the vote succeeds rather than when the vote
> starts.
>
> Thanks
> Shivaram
>
> On Fri, Jul 15, 2016 at 6:59 AM, Sean Owen <so...@cloudera.com> wrote:
> > Signatures and hashes are OK. I built and ran tests successfully on
> > Ubuntu 16 + Java 8 with "-Phive -Phadoop-2.7 -Pyarn". Although I
> > encountered a few test failures, none were repeatable.
> >
> > Regarding other issues brought up so far:
> >
> > SPARK-16522
> > Does not seem quite enough to be a blocker if it's just an error at
> > shutdown that does not affect the result. If there's another RC, worth
> > fixing.
> > SPARK-15899
> > Not a blocker. It only affects Windows, and possibly only affects
> > tests. Not a regression.
> > SPARK-16515
> > Not sure, but Cheng, please mark it as a Blocker if you're pretty
> > confident it must be fixed.
> >
> > Davies marked SPARK-16011 as a Blocker, though he should confirm
> > that it's for 2.0.0. That's the only one officially open now.
> >
> > So I suppose that's provisionally a -1 from me, as it's not yet
> > clear that there are no blocking issues. It's close, and this should
> > be tested by everyone.
> >
> >
> > Remaining Critical issues are below. I'm still uncomfortable with
> > the 2.0 documentation issues not being finished before the 2.0
> > release. If anyone's intent is to release and then finish the docs a
> > few days later, I'd vote against that. There's simply no rush that
> > would justify it.
> >
> > However, it's entirely possible that the remaining work is not
> > essential for 2.0; I don't know. If so, those issues should be
> > retitled. Either way, one or the other needs to happen. "Audit"
> > JIRAs are in a similar position, especially before a major release.
> >
> >
> > SPARK-13393 Column mismatch issue in left_outer join using Spark DataFrame
> > SPARK-13753 Column nullable is derived incorrectly
> > SPARK-13959 Audit MiMa excludes added in SPARK-13948 to make sure none
> > are unintended incompatibilities
> > SPARK-14808 Spark MLlib, GraphX, SparkR 2.0 QA umbrella
> > SPARK-14816 Update MLlib, GraphX, SparkR websites for 2.0
> > SPARK-14817 ML, Graph, R 2.0 QA: Programming guide update and migration guide
> > SPARK-14823 Fix all references to HiveContext in comments and docs
> > SPARK-15340 Limit the size of the map used to cache JobConfs to avoid OOM
> > SPARK-15393 Writing empty Dataframes doesn't save any _metadata files
> > SPARK-15703 Spark UI doesn't show all tasks as completed when it should
> > SPARK-15944 Make spark.ml package backward compatible with spark.mllib vectors
> > SPARK-16032 Audit semantics of various insertion operations related to
> > partitioned tables
> > SPARK-16090 Improve method grouping in SparkR generated docs
> > SPARK-16301 Analyzer rule for resolving using joins should respect
> > case sensitivity setting
> >
> > On Thu, Jul 14, 2016 at 7:59 PM, Reynold Xin <r...@databricks.com> wrote:
> >> Please vote on releasing the following candidate as Apache Spark version
> >> 2.0.0. The vote is open until Sunday, July 17, 2016 at 12:00 PDT and passes
> >> if a majority of at least 3 +1 PMC votes are cast.
> >>
> >> [ ] +1 Release this package as Apache Spark 2.0.0
> >> [ ] -1 Do not release this package because ...
> >>
> >>
> >> The tag to be voted on is v2.0.0-rc4
> >> (e5f8c1117e0c48499f54d62b556bc693435afae0).
> >>
> >> This release candidate resolves ~2500 issues:
> >> https://s.apache.org/spark-2.0.0-jira
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc4-bin/
> >>
> >> Release artifacts are signed with the following key:
> >> https://people.apache.org/keys/committer/pwendell.asc
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1192/
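> >>
> >> If you want to exercise the staged artifacts from an sbt project, a
> >> minimal build.sbt sketch along these lines should work (spark-core is
> >> shown as an example; swap in whichever modules your workload uses):
> >>
> >>   // Hypothetical test-project config, not part of the release itself.
> >>   resolvers += "Apache Spark 2.0.0 RC4 staging" at
> >>     "https://repository.apache.org/content/repositories/orgapachespark-1192/"
> >>   // The version under vote, resolved from the staging repo above.
> >>   libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"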
> >>
> >> The documentation corresponding to this release can be found at:
> >> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc4-docs/
> >>
> >>
> >> =================================
> >> How can I help test this release?
> >> =================================
> >> If you are a Spark user, you can help us test this release by taking
> >> an existing Spark workload, running it on this release candidate, and
> >> reporting any regressions from 1.x.
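> >>
> >> For example, a quick smoke test in spark-shell could look like the
> >> following (just a sketch, not an official checklist; "spark" is the
> >> SparkSession the 2.0 shell creates for you):
> >>
> >>   import spark.implicits._  // the shell imports this automatically
> >>   // Typed Dataset API: count the even numbers in a small range.
> >>   val ds = spark.range(0L, 1000000L)
> >>   assert(ds.filter(_ % 2 == 0).count() == 500000L)
> >>   // DataFrame API: build a tiny DataFrame and filter it.
> >>   val df = Seq((1, "a"), (2, "b")).toDF("id", "label")
> >>   assert(df.where($"id" > 1).count() == 1)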
> >>
> >> ==========================================
> >> What justifies a -1 vote for this release?
> >> ==========================================
> >> Critical bugs impacting major functionality.
> >>
> >> Bugs already present in 1.x, missing features, or bugs related to new
> >> features will not necessarily block this release. Note that historically
> >> Spark documentation has been published on the website separately from
> >> the main release, so we do not need to block the release due to
> >> documentation errors either.
> >>
> >>
> >> Note: A mistake was made during "rc3" preparation, and as a result
> >> there is no "rc3", only "rc4".
> >>
> >
