Re: build spark 1.4.1 with JDK 1.6

2015-08-25 Thread Rick Moritz
My local build using rc-4 and Java 7 also produces binaries that differ (for one file only) from the 1.4.0 release artifact available on Central. These binaries decompile to identical instructions, but this may be due to different versions of javac (within the 7 family) producing di…
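One way to check a discrepancy like this is to pull the same class out of both jars and diff the decompiled instructions. A minimal sketch; the jar paths and class name are assumptions, not the actual files compared in this thread:

    # Extract the same class from the local build and the Central artifact
    unzip -o local/spark-core_2.10-1.4.0.jar org/apache/spark/SparkContext.class -d local-cls
    unzip -o central/spark-core_2.10-1.4.0.jar org/apache/spark/SparkContext.class -d central-cls
    # Decompile to bytecode and diff; identical instructions mean the binary
    # difference lies in metadata (e.g. constant-pool layout), not in the code
    javap -c local-cls/org/apache/spark/SparkContext.class > local.txt
    javap -c central-cls/org/apache/spark/SparkContext.class > central.txt
    diff local.txt central.txt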

Re: build spark 1.4.1 with JDK 1.6

2015-08-25 Thread Sean Owen
Hm... off the cuff I wonder if this is because somehow the build process ran Maven with Java 6 but forked the Java/Scala compilers, and those used JDK 7. Or some later repackaging process ran on the artifacts and used Java 6. I do see "Build-Jdk: 1.6.0_45" in the manifest, but I don't think 1.4.x ca…
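The manifest can be read straight out of the published jar to confirm what Sean is seeing; a one-liner sketch, with the artifact name as an assumption:

    # Build-Jdk is stamped by Maven from the JVM running Maven itself,
    # which can differ from the JDK used by a forked Java/Scala compiler
    unzip -p spark-core_2.10-1.4.1.jar META-INF/MANIFEST.MF | grep Build-Jdk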

Re: build spark 1.4.1 with JDK 1.6

2015-08-25 Thread Rick Moritz
A quick question regarding this: how come the artifacts (spark-core in particular) on Maven Central are built with JDK 1.6 (according to the manifest), if Java 7 is required? On Aug 21, 2015 5:32 PM, "Sean Owen" wrote: > Spark 1.4 requires Java 7. > > On Fri, Aug 21, 2015, 3:12 PM Chen Song wrote: …
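Build-Jdk only records the JVM that ran Maven; the class files carry the compiler's target version separately, and that can be checked directly. A sketch, with jar and class names as assumptions:

    # Class-file major version: 50 = Java 6 target, 51 = Java 7
    unzip -o spark-core_2.10-1.4.1.jar org/apache/spark/SparkContext.class -d tmp
    javap -verbose tmp/org/apache/spark/SparkContext.class | grep "major version"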

Re: build spark 1.4.1 with JDK 1.6

2015-08-25 Thread Eric Friedman
Well, this is very strange. My only change is to add -X to make-distribution and it succeeds: % git diff (spark/spark) diff --git a/make-distribution.sh b/make-distribution.sh index a2b0c43..351fac2 100755 --- a/make-distribution.sh +++ b/make-dist…
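The diff preview above is cut off; a hypothetical reconstruction of the kind of change described, since the actual contents of make-distribution.sh at that commit may differ:

    --- a/make-distribution.sh
    +++ b/make-distribution.sh
    -BUILD_COMMAND=("$MVN" clean package -DskipTests $@)
    +BUILD_COMMAND=("$MVN" -X clean package -DskipTests $@)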

Re: build spark 1.4.1 with JDK 1.6

2015-08-24 Thread Sean Owen
-cdh-user This suggests that Maven is still using Java 6. I think this is indeed controlled by JAVA_HOME. Use 'mvn -X ...' to see a lot more about what is being used and why. I still suspect JAVA_HOME is not visible to the Maven process. Or maybe you have JRE 7 installed but not JDK 7 and it's som…
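A quick way to confirm which JDK the Maven process itself sees, and that a full JDK rather than a bare JRE is installed (the JDK path below is an assumption):

    mvn -version                                  # prints "Java version: ..." and "Java home: ..."
    export JAVA_HOME=/usr/lib/jvm/java-7-oracle   # adjust for your system
    "$JAVA_HOME/bin/javac" -version               # present in a JDK, missing from a plain JRE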

Re: build spark 1.4.1 with JDK 1.6

2015-08-24 Thread Eric Friedman
I'm trying to build Spark 1.4 with Java 7 and, despite having that as my JAVA_HOME, I get [INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-launcher_2.10 --- [INFO] Using zinc server for incremental compilation [info] Compiling 8 Java sources to /Users/eric/spark/spark/lau…
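One thing worth ruling out: the zinc incremental-compilation server outlives individual builds, so if it was started under an older JDK it keeps compiling with that one. Shutting it down forces the next build to relaunch it under the current JAVA_HOME (the versioned path under build/ varies):

    build/zinc-*/bin/zinc -shutdown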

Re: build spark 1.4.1 with JDK 1.6

2015-08-21 Thread Marcelo Vanzin
That was only true until Spark 1.3. Spark 1.4 can be built with JDK 7 and pyspark will still work. On Fri, Aug 21, 2015 at 8:29 AM, Chen Song wrote: > Thanks Sean. > > So how is PySpark supported? I thought PySpark needs JDK 1.6. > > Chen > > On Fri, Aug 21, 2015 at 11:16 AM, Sean Owen wrote: >> …

Re: build spark 1.4.1 with JDK 1.6

2015-08-21 Thread Sean Owen
Spark 1.4 requires Java 7. On Fri, Aug 21, 2015, 3:12 PM Chen Song wrote: > I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support > PySpark, I used JDK 1.6. > > I got the following error, > > [INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first) > @ spark-streaming_2.10 …

Re: build spark 1.4.1 with JDK 1.6

2015-08-21 Thread Chen Song
Thanks Sean. So how is PySpark supported? I thought PySpark needs JDK 1.6. Chen On Fri, Aug 21, 2015 at 11:16 AM, Sean Owen wrote: > Spark 1.4 requires Java 7. > > On Fri, Aug 21, 2015, 3:12 PM Chen Song wrote: > >> I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support >> PySpark…

build spark 1.4.1 with JDK 1.6

2015-08-21 Thread Chen Song
I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support PySpark, I used JDK 1.6. I got the following error: [INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first) @ spark-streaming_2.10 --- java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable…
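The UnsupportedClassVersionError here means the CDH 5.4 Hadoop classes were compiled for a newer JVM than the JDK 1.6 running the build. Building with JDK 7 instead would look roughly like this; the JDK path and Maven flags are illustrative, not the exact command from this thread:

    export JAVA_HOME=/usr/java/jdk1.7.0_75   # point at a JDK 7 install
    mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.4.0 -DskipTests clean package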