Re: Unit test failure in spark-core

2020-10-12 Thread Stephen Coy
Sorry, I forgot:

    [scoy@Steves-Core-i9-2 core]$ java -version
    openjdk version "1.8.0_262"
    OpenJDK Runtime Environment (AdoptOpenJDK) (build 1.8.0_262-b10)
    OpenJDK 64-Bit Server VM (AdoptOpenJDK) (build 25.262-b10, mixed mode)

which is on macOS 10.15.7.

On 13 Oct 2020, at 12:47 pm, Stephen Coy

Unit test failure in spark-core

2020-10-12 Thread Stephen Coy
Hi all,

When trying to build current master with a simple:

    mvn clean install

I get a consistent unit test failure in core:

[ERROR] Tests run: 6, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 5.403 s <<< FAILURE! - in org.apache.spark.launcher.SparkLauncherSuite
[ERROR]
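One way to iterate on just this failure is to run the suite in isolation. A minimal sketch, assuming a stock checkout of master; -Dtest is the standard Maven Surefire filter, and -DwildcardSuites=none skips the ScalaTest runs so that only this Java suite executes:

    # re-run only SparkLauncherSuite in the core module
    build/mvn test -pl core -DwildcardSuites=none -Dtest=SparkLauncherSuite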

Re: [UPDATE] Apache Spark 3.1.0 Release Window

2020-10-12 Thread Xiao Li
Thank you, Dongjoon

Xiao

On Mon, Oct 12, 2020 at 4:19 PM Dongjoon Hyun wrote:
> Hi, All.
>
> The Apache Spark 3.1.0 release window was adjusted today as follows.
> Please check the latest information on the official website.
>
> -
>

[UPDATE] Apache Spark 3.1.0 Release Window

2020-10-12 Thread Dongjoon Hyun
Hi, All.

The Apache Spark 3.1.0 release window was adjusted today as follows. Please check the latest information on the official website.

- https://github.com/apache/spark-website/commit/0cd0bdc80503882b4737db7e77cc8f9d17ec12ca
- https://spark.apache.org/versioning-policy.html

Re: -Phadoop-provided still includes hadoop jars

2020-10-12 Thread Sean Owen
I don't have a good answer; Steve may know more. From looking at dependency:tree, it looks like hadoop-common is mostly what's at issue. Without -Phive it remains 'provided' in the assembly/ module, but -Phive causes it to come back in. Either there's some good reason for that, or maybe we
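To see the scope flip Sean describes, dependency:tree can be filtered down to the jar in question. A sketch, assuming a stock checkout; -Dincludes is a standard maven-dependency-plugin option:

    # with -Phadoop-provided alone, hadoop-common should show scope 'provided'
    build/mvn -pl assembly -Phadoop-provided dependency:tree -Dincludes=org.apache.hadoop:hadoop-common

    # adding -Phive pulls it back in at 'compile' scope
    build/mvn -pl assembly -Phadoop-provided -Phive dependency:tree -Dincludes=org.apache.hadoop:hadoop-common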

-Phadoop-provided still includes hadoop jars

2020-10-12 Thread Kimahriman
When I try to build a distribution with either -Phive or -Phadoop-cloud along with -Phadoop-provided, I still end up with hadoop jars in the distribution. Specifically, with -Phive and -Phadoop-provided, you end up with hadoop-annotations, hadoop-auth, and hadoop-common included in the Spark
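For reference, a sketch of a build that reproduces the report, using Spark's stock dev/make-distribution.sh script (the --name value here is arbitrary):

    ./dev/make-distribution.sh --name hadoop-provided --tgz -Phadoop-provided -Phive
    # the hadoop-* jars show up in the result despite -Phadoop-provided
    ls dist/jars | grep hadoop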

Re: Official support of CREATE EXTERNAL TABLE

2020-10-12 Thread 大啊
Personally, I think EXTERNAL is a special feature supported by Hive. If Spark SQL wants to support it, it should be considered only for Hive. We would only unify `CREATE EXTERNAL TABLE` in the parser and check for unsupported data sources.

At 2020-10-06 22:06:28, "Wenchen Fan" wrote:
Hi all, I'd like to start
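For context, a sketch of the Hive-style syntax under discussion, run through the spark-sql shell; this assumes a build with -Phive, and the table name and location are made up for illustration:

    spark-sql -e "CREATE EXTERNAL TABLE ext_logs (id INT, msg STRING) STORED AS PARQUET LOCATION '/tmp/ext_logs'"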