FYI: I've been working on stabilizing the streaming join and Kafka
continuous mode tests for branch-2.3 (they're somewhat coupled - Kafka
continuous mode fails after back-porting the streaming join commit),
and I think it's done: https://github.com/apache/spark/pull/23757
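
In case anyone wants to verify locally, the affected suites can be run
individually with something like the following (the suite and sbt
project names here are from memory and may need adjusting per branch):

build/sbt "sql/testOnly *Streaming*JoinSuite*"
build/sbt "sql-kafka-0-10/testOnly *KafkaContinuous*"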

On Mon, Feb 11, 2019 at 5:08 AM Sean Owen <sro...@gmail.com> wrote:

> The HiveExternalCatalogVersionsSuite is hard to make robust, as it
> downloads several huge Spark archives. It does try several mirrors,
> falling back to archive.apache.org, but there is still plenty of
> scope for occasional errors. We need to keep this restricted to
> testing only a few recent Spark versions.
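>
> (Not the suite's actual code, but a minimal sketch of that fallback
> pattern, with a hypothetical mirror URL:)
>
> import scala.sys.process._
>
> def tryDownloadSpark(version: String): Boolean = {
>   val mirrors = Seq(
>     "https://www.apache.org/dist",        // hypothetical mirror
>     "https://archive.apache.org/dist")    // fallback of last resort
>   val file = s"spark-$version-bin-hadoop2.7.tgz"
>   // Stop at the first base URL that serves the archive successfully.
>   mirrors.exists { base =>
>     val url = s"$base/spark/spark-$version/$file"
>     Seq("wget", "-q", url).! == 0         // exit code 0 => success
>   }
> }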
>
> On Sun, Feb 10, 2019 at 2:01 PM Felix Cheung <felixcheun...@hotmail.com>
> wrote:
> >
> > +1
> > See notes below.
> >
> > Tested building from source and running tests.
> > Also tested SparkR basics - I ran more tests on RC1 and verified
> > there has been no change in R since, so I'm OK with that.
> >
> > Note:
> > 1. Opened https://issues.apache.org/jira/browse/SPARK-26855 on the
> > SparkSubmitSuite failure (thanks to Sean's tip) - I don't think it's
> > a blocker.
> >
> > 2. Ran into a failure in HiveExternalCatalogVersionsSuite, but it
> > passed on the 2nd run. (How reliable is archive.apache.org? It has
> > failed for me before.) A command to re-run just this suite is
> > sketched after these notes.
> > WARN org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite:
> > Failed to download Spark 2.3.2 from
> > https://archive.apache.org/dist/spark/spark-2.3.2/spark-2.3.2-bin-hadoop2.7.tgz:
> > Socket closed
> > org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite ***
> > ABORTED ***
> > Exception encountered when invoking run on a nested suite - Unable
> > to download Spark 2.3.2 (HiveExternalCatalogVersionsSuite.scala:97)
> >
> > 3. There are quite a few changes in Python and SQL - someone should
> > test those.
> >
> > 4. The k8s integration tests were broken last time, and it wasn't
> > caught because they aren't built by default. Could someone test with
> > -Pkubernetes -Pkubernetes-integration-tests? (A sketch of the
> > invocation follows these notes.)
> >
> > SPARK-26482 broke the integration tests.
> >
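> > Re note 2 - to re-run just that suite, something like this should
> > work (the sbt project/suite names are from memory):
> >
> > build/sbt "hive/testOnly *HiveExternalCatalogVersionsSuite"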
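> >
> > Re note 4 - a sketch of the invocation, assuming the Maven profiles
> > above; the exact goals and module path may differ, and there is also
> > a helper script under
> > resource-managers/kubernetes/integration-tests/dev/:
> >
> > build/mvn test -Pkubernetes -Pkubernetes-integration-tests \
> >   -pl resource-managers/kubernetes/integration-tests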
>
