My local build using rc-4 and Java 7 does actually also produce different
binaries (for one file only) than the 1.4.0 release artifact available on
Central. These binaries also decompile to identical instructions, but this
may be due to different versions of javac (within the 7 family) producing
slightly different class files.
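For reference, one way to compare the two artifacts at the byte and
bytecode level; the jar names and the class path below are placeholders
for the actual files:

  # Unpack both jars and list the class files whose bytes differ.
  unzip -oq spark-core-central.jar -d central/
  unzip -oq spark-core-local.jar -d local/
  diff -rq central/ local/
  # Decompile the one differing class from each copy and compare.
  diff <(javap -c -p central/org/apache/spark/Foo.class) \
       <(javap -c -p local/org/apache/spark/Foo.class)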
Hm... off the cuff I wonder if this is because somehow the build
process ran Maven with Java 6 but forked the Java/Scala compilers and
those used JDK 7. Or some later repackaging process ran on the
artifacts and used Java 6. I do see "Build-Jdk: 1.6.0_45" in the
manifest, but I don't think 1.4.x can be built with Java 6.
A quick question regarding this: how come the artifacts (spark-core in
particular) on Maven Central are built with JDK 1.6 (according to the
manifest), if Java 7 is required?
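For anyone who wants to check this themselves, the manifest can be read
straight out of the artifact downloaded from Central, e.g.

  # Print the Build-Jdk entry from the jar's manifest.
  unzip -p spark-core_2.10-1.4.0.jar META-INF/MANIFEST.MF | grep Build-Jdk
  # Build-Jdk: 1.6.0_45

which is the manifest value quoted above.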
On Aug 21, 2015 5:32 PM, "Sean Owen" wrote:
> Spark 1.4 requires Java 7.
>
> On Fri, Aug 21, 2015, 3:12 PM Chen Song wrote:
Well, this is very strange. My only change is to add -X to
make-distribution and it succeeds:
% git diff
(spark/spark)
diff --git a/make-distribution.sh b/make-distribution.sh
index a2b0c43..351fac2 100755
--- a/make-distribution.sh
+++ b/make-distribution.sh
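Side note: as far as I can tell, make-distribution.sh forwards flags it
doesn't recognize straight to Maven, so -X should also work without
editing the script. The profiles here are only an example:

  ./make-distribution.sh --tgz -Phadoop-2.4 -Pyarn -X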
This suggests that Maven is still using Java 6. I think this is indeed
controlled by JAVA_HOME. Use 'mvn -X ...' to see a lot more about what
is being used and why. I still suspect JAVA_HOME is not visible to the
Maven process. Or maybe you have JRE 7 installed but not JDK 7 and
it's somehow falling back to a JDK 6 compiler.
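A quick way to check is to ask Maven itself which Java it runs on, and to
export JAVA_HOME explicitly before building; the path below is just an
example, use your JDK 7 install location:

  mvn -version                                # shows the Java version Maven uses
  export JAVA_HOME=/usr/lib/jvm/java-7-oracle # example path
  mvn -version                                # should now report 1.7.x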
I'm trying to build Spark 1.4 with Java 7 and despite having that as my
JAVA_HOME, I get
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @
spark-launcher_2.10 ---
[INFO] Using zinc server for incremental compilation
[info] Compiling 8 Java sources to
/Users/eric/spark/spark/lau
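The /Users path suggests OS X; if so, something like this before the
build should pin Maven to JDK 7 (assuming a JDK 7 is actually installed):

  export JAVA_HOME=$(/usr/libexec/java_home -v 1.7)
  mvn -version   # confirm Maven now reports 1.7.x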
That was only true until Spark 1.3. Spark 1.4 can be built with JDK 7
and PySpark will still work.
On Fri, Aug 21, 2015 at 8:29 AM, Chen Song wrote:
> Thanks Sean.
>
> So how is PySpark supported? I thought PySpark needed JDK 1.6.
>
> Chen
>
> On Fri, Aug 21, 2015 at 11:16 AM, Sean Owen wrote:
>>
Spark 1.4 requires Java 7.
On Fri, Aug 21, 2015, 3:12 PM Chen Song wrote:
> I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
> PySpark, I used JDK 1.6.
>
> I got the following error,
>
> [INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first)
> @ spark-streaming_2.10 ---
Thanks Sean.
So how is PySpark supported? I thought PySpark needed JDK 1.6.
Chen
On Fri, Aug 21, 2015 at 11:16 AM, Sean Owen wrote:
> Spark 1.4 requires Java 7.
>
> On Fri, Aug 21, 2015, 3:12 PM Chen Song wrote:
>
>> I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
>> PySpark, I used JDK 1.6.
I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
PySpark, I used JDK 1.6.
I got the following error,
[INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first)
@ spark-streaming_2.10 ---
java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable
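That error means a Java 6 JVM tried to load a class compiled for Java 7
(class file major version 51). One way to confirm which version a class
targets; the Hadoop jar name here is only illustrative:

  # Extract the class and inspect its class file format version.
  unzip -p hadoop-common.jar org/apache/hadoop/io/LongWritable.class > LW.class
  javap -verbose LW.class | grep 'major version'
  # major version: 50 = Java 6, 51 = Java 7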