Hm... off the cuff, I wonder if this is because the build process
somehow ran Maven with Java 6 but forked the Java/Scala compilers, and
those used JDK 7. Or some later repackaging step ran on the artifacts
with Java 6. I do see "Build-Jdk: 1.6.0_45" in the manifest, but I
don't think 1.4.x can compile with Java 6.
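
One way to settle it would be to read the class-file version out of the
compiled classes themselves rather than trusting the Build-Jdk manifest
entry. A rough sketch along these lines (the jar path and class entry
below are just placeholders for whatever artifact you pull down):

import java.io.DataInputStream;
import java.util.jar.JarFile;

// Prints the class-file version of one entry in a jar.
// Major version 50 means the class targets Java 6; 51 means Java 7.
public class CheckClassVersion {
    public static void main(String[] args) throws Exception {
        JarFile jar = new JarFile("spark-core_2.10-1.4.1.jar");    // placeholder path
        DataInputStream in = new DataInputStream(jar.getInputStream(
            jar.getEntry("org/apache/spark/SparkContext.class"))); // placeholder entry
        in.readInt();                          // skip magic 0xCAFEBABE
        int minor = in.readUnsignedShort();
        int major = in.readUnsignedShort();
        System.out.println("major=" + major + ", minor=" + minor);
        in.close();
        jar.close();
    }
}

If that prints major=50, the published bytecode really does target
Java 6 regardless of what the rest of the build did; 51 would mean the
classes need Java 7 at runtime, the same mismatch behind the
UnsupportedClassVersionError quoted below.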

On Tue, Aug 25, 2015 at 9:59 PM, Rick Moritz <rah...@gmail.com> wrote:
> A quick question regarding this: how come the artifacts (spark-core in
> particular) on Maven Central are built with JDK 1.6 (according to the
> manifest), if Java 7 is required?
>
> On Aug 21, 2015 5:32 PM, "Sean Owen" <so...@cloudera.com> wrote:
>>
>> Spark 1.4 requires Java 7.
>>
>>
>> On Fri, Aug 21, 2015, 3:12 PM Chen Song <chen.song...@gmail.com> wrote:
>>>
>>> I tried to build Spark 1.4.1 on CDH 5.4.0. Because we need to support
>>> PySpark, I used JDK 1.6.
>>>
>>> I got the following error:
>>>
>>> [INFO] --- scala-maven-plugin:3.2.0:testCompile
>>> (scala-test-compile-first) @ spark-streaming_2.10 ---
>>>
>>> java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable
>>> : Unsupported major.minor version 51.0
>>> at java.lang.ClassLoader.defineClass1(Native Method)
>>> at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
>>> at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
>>> at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>
>>> I know this is because the Hadoop jar for CDH 5.4.0 is built with JDK 7.
>>> Has anyone done this before?
>>>
>>> Thanks,
>>>
>>> --
>>> Chen Song
>>>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
