A quick question regarding this: how come the artifacts (spark-core in
particular) on Maven Central are built with JDK 1.6 (according to the
manifest), if Java 7 is required?
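
For reference, that manifest attribute can be checked programmatically; here is a
minimal sketch (Build-Jdk is the attribute the Maven archiver writes into
META-INF/MANIFEST.MF, while the class name and the convention of passing the jar path
as the first argument are only illustrative):

    import java.util.jar.JarFile;

    // Prints the Build-Jdk attribute recorded in a jar's META-INF/MANIFEST.MF.
    // Pass the path to a downloaded spark-core jar as the first argument.
    public class PrintBuildJdk {
        public static void main(String[] args) throws Exception {
            JarFile jar = new JarFile(args[0]);
            try {
                String buildJdk = jar.getManifest().getMainAttributes().getValue("Build-Jdk");
                System.out.println("Build-Jdk: " + buildJdk);
            } finally {
                jar.close();
            }
        }
    }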
On Aug 21, 2015 5:32 PM, "Sean Owen" <so...@cloudera.com> wrote:

> Spark 1.4 requires Java 7.
>
> On Fri, Aug 21, 2015, 3:12 PM Chen Song <chen.song...@gmail.com> wrote:
>
>> I tried to build Spark 1.4.1 on CDH 5.4.0. Because we need to support
>> PySpark, I used JDK 1.6.
>>
>> I got the following error:
>>
>> [INFO] --- scala-maven-plugin:3.2.0:testCompile
>> (scala-test-compile-first) @ spark-streaming_2.10 ---
>>
>> java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable
>> : Unsupported major.minor version 51.0
>> at java.lang.ClassLoader.defineClass1(Native Method)
>> at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
>> at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
>> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>
>> I know this is because the Hadoop jar for CDH 5.4.0 is built with JDK 7.
>> Has anyone done this before?
>>
>> Thanks,
>>
>> --
>> Chen Song
>>
>>
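
For reference, the "major.minor version 51.0" in that trace is the class file format
version: major 50 corresponds to Java 6 and 51 to Java 7, so a 1.6 JVM cannot load the
CDH-built Hadoop classes. Below is a minimal sketch for checking what a particular
class inside a jar was compiled against (the class name and argument conventions are
only illustrative):

    import java.io.DataInputStream;
    import java.io.InputStream;
    import java.util.jar.JarFile;

    // Reads the class file version of one entry inside a jar.
    // args[0]: jar path, e.g. a Hadoop jar shipped with CDH 5.4.0
    // args[1]: entry name, e.g. org/apache/hadoop/io/LongWritable.class
    public class PrintClassVersion {
        public static void main(String[] args) throws Exception {
            JarFile jar = new JarFile(args[0]);
            try {
                InputStream in = jar.getInputStream(jar.getEntry(args[1]));
                DataInputStream data = new DataInputStream(in);
                int magic = data.readInt();           // 0xCAFEBABE
                int minor = data.readUnsignedShort();
                int major = data.readUnsignedShort(); // 50 = Java 6, 51 = Java 7
                System.out.printf("magic=%x major.minor=%d.%d%n", magic, major, minor);
            } finally {
                jar.close();
            }
        }
    }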
