Re: [DISCUSS] Spark Columnar Processing

2019-04-05 Thread Bobby Evans
I just filed SPARK-27396 as the SPIP for this proposal. Please use that JIRA for further discussions. Thanks for all of the feedback,

Bobby

On Wed, Apr 3, 2019 at 7:15 PM Bobby Evans wrote:
> I am still working on the SPIP and should get it up in the next few days.
> I have the basic text

Re: Spark 2.4.0 tests fail with hadoop-3.1 profile: NoClassDefFoundError org.apache.hadoop.hive.conf.HiveConf

2019-04-05 Thread Sean Owen
Hadoop 3 isn't supported yet, not quite even in master. I think the profile there exists for testing at the moment. Others may know a way that it can work, but I don't think it would out of the box.

On Fri, Apr 5, 2019 at 12:53 PM akirillov wrote:
>
> Hi there! I'm trying to run Spark unit tests

Re: Spark 2.4.0 tests fail with hadoop-3.1 profile: NoClassDefFoundError org.apache.hadoop.hive.conf.HiveConf

2019-04-05 Thread Marcelo Vanzin
You can always try. But Hadoop 3 is not yet supported by Spark.

On Fri, Apr 5, 2019 at 11:13 AM Anton Kirillov wrote:
>
> Marcelo, Sean, thanks for the clarification. So in order to support Hadoop 3+
> the preferred way would be to use Hadoop-free builds and provide Hadoop
> dependencies in

Re: Spark 2.4.0 tests fail with hadoop-3.1 profile: NoClassDefFoundError org.apache.hadoop.hive.conf.HiveConf

2019-04-05 Thread Sean Owen
Yes, you can try it, though I doubt that will 100% work. Have a look at the "hadoop 3" JIRAs and PRs still in progress on master.

On Fri, Apr 5, 2019 at 1:14 PM Anton Kirillov wrote:
>
> Marcelo, Sean, thanks for the clarification. So in order to support Hadoop 3+
> the preferred way would be

Re: Spark 2.4.0 tests fail with hadoop-3.1 profile: NoClassDefFoundError org.apache.hadoop.hive.conf.HiveConf

2019-04-05 Thread Anton Kirillov
Marcelo, Sean, thanks for the clarification. So in order to support Hadoop 3+, the preferred way would be to use Hadoop-free builds and provide Hadoop dependencies in the classpath, is that correct?

On Fri, Apr 5, 2019 at 10:57 AM Marcelo Vanzin wrote:
> The hadoop-3 profile doesn't really work
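For reference, the documented "Hadoop free" setup roughly amounts to the following. This is a minimal sketch, assuming a "without Hadoop" Spark build and a separately installed Hadoop 3.x whose hadoop launcher is on the PATH:

# In conf/spark-env.sh of a "Hadoop free" Spark build, point Spark at the
# external Hadoop jars instead of bundled ones:
export SPARK_DIST_CLASSPATH=$(hadoop classpath)

With that set, Spark picks up whatever Hadoop client jars the external installation provides; whether everything in Spark 2.4 actually works against Hadoop 3 libraries is a separate question, per the replies above.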

Re: Spark 2.4.0 tests fail with hadoop-3.1 profile: NoClassDefFoundError org.apache.hadoop.hive.conf.HiveConf

2019-04-05 Thread Marcelo Vanzin
The hadoop-3 profile doesn't really work yet, not even on master. That's still being worked on.

On Fri, Apr 5, 2019 at 10:53 AM akirillov wrote:
>
> Hi there! I'm trying to run Spark unit tests with the following profiles:
>
> And 'core' module fails with the following test failing with

Re: Spark 2.4.0 tests fail with hadoop-3.1 profile: NoClassDefFoundError org.apache.hadoop.hive.conf.HiveConf

2019-04-05 Thread Anton Kirillov
Really sorry for the formatting. Here's the original message:

Hi there! I'm trying to run Spark unit tests with the following profiles:

./build/mvn test -Pmesos "-Phadoop-3.1" -Pnetlib-lgpl -Psparkr -Phive -Phive-thriftserver

And 'core' module fails with the following test failing with
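For comparison, here is a sketch of the equivalent invocation against the Hadoop 2.7 profile, which the original report below says runs without problems. The -Phadoop-2.7 profile name is an assumption based on Spark 2.4's standard build profiles:

# Same test run, but against the Hadoop 2.7 line instead of hadoop-3.1
./build/mvn test -Pmesos -Phadoop-2.7 -Pnetlib-lgpl -Psparkr -Phive -Phive-thriftserver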

Spark 2.4.0 tests fail with hadoop-3.1 profile: NoClassDefFoundError org.apache.hadoop.hive.conf.HiveConf

2019-04-05 Thread akirillov
Hi there! I'm trying to run Spark unit tests with the following profiles:

./build/mvn test -Pmesos "-Phadoop-3.1" -Pnetlib-lgpl -Psparkr -Phive -Phive-thriftserver

And the 'core' module fails with the following test failing with NoClassDefFoundError:

In the meantime, building a distribution works fine when running:

Also, there are no problems with running tests using Hadoop 2.7

Re: [ANNOUNCE] Announcing Apache Spark 2.4.1

2019-04-05 Thread Andrew Melo
On Fri, Apr 5, 2019 at 9:41 AM Jungtaek Lim wrote:
>
> Thanks Andrew for reporting this. I just submitted the fix.
> https://github.com/apache/spark/pull/24304

Thanks!

> On Fri, Apr 5, 2019 at 3:21 PM Andrew Melo wrote:
>>
>> Hello,
>>
>> I'm not sure if this is the proper place to report

Re: [ANNOUNCE] Announcing Apache Spark 2.4.1

2019-04-05 Thread Jungtaek Lim
Thanks Andrew for reporting this. I just submitted the fix.
https://github.com/apache/spark/pull/24304

On Fri, Apr 5, 2019 at 3:21 PM Andrew Melo wrote:
> Hello,
>
> I'm not sure if this is the proper place to report it, but the 2.4.1
> version of the config docs apparently didn't render right

Re: [ANNOUNCE] Announcing Apache Spark 2.4.1

2019-04-05 Thread Andrew Melo
Hello,

I'm not sure if this is the proper place to report it, but the 2.4.1 version of the config docs apparently didn't render correctly into HTML (scroll down to "Compression and Serialization"):

https://spark.apache.org/docs/2.4.1/configuration.html#available-properties

By comparison, the 2.4.0

[ANNOUNCE] Announcing Apache Spark 2.4.1

2019-04-05 Thread DB Tsai
+user list

We are happy to announce the availability of Spark 2.4.1!

Apache Spark 2.4.1 is a maintenance release, based on the branch-2.4 maintenance branch of Spark. We strongly recommend that all 2.4.0 users upgrade to this stable release.

In Apache Spark 2.4.1, Scala 2.12 support is GA, and
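As a minimal sketch of building the release against Scala 2.12 from source, assuming the standard Spark 2.4 source layout (the dev/change-scala-version.sh script and the -Pscala-2.12 profile, not anything specific to this announcement):

# Switch the build's Scala version, then build with the 2.12 profile
./dev/change-scala-version.sh 2.12
./build/mvn -Pscala-2.12 -DskipTests clean package

Pre-built 2.12 convenience binaries, where published, can be used instead of building from source.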