
This suggests that Maven is still using Java 6. I think this is indeed
controlled by JAVA_HOME. Use 'mvn -X ...' to see a lot more about what
is being used and why. I still suspect JAVA_HOME is not visible to the
Maven process. Or maybe you have JRE 7 installed but not JDK 7 and
it's somehow still finding the Java 6 javac.
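
For example, something like this shows what the build will actually use
(paths are illustrative):

  echo $JAVA_HOME
  $JAVA_HOME/bin/javac -version   # should print javac 1.7.x
  mvn -version                    # shows the Java version Maven itself runs with

Also, if a zinc server is still running from an earlier Java 6 build, it
can keep invoking the old javac; shutting it down before rebuilding is
worth a try (assuming the zinc launcher the Spark build downloads):

  build/zinc-*/bin/zinc -shutdown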

On Tue, Aug 25, 2015 at 3:45 AM, Eric Friedman
<eric.d.fried...@gmail.com> wrote:
> I'm trying to build Spark 1.4 with Java 7 and despite having that as my
> JAVA_HOME, I get
>
> [INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-launcher_2.10 ---
>
> [INFO] Using zinc server for incremental compilation
>
> [info] Compiling 8 Java sources to /Users/eric/spark/spark/launcher/target/scala-2.10/classes...
>
> [error] javac: invalid source release: 1.7
>
> [error] Usage: javac <options> <source files>
>
> [error] use -help for a list of possible options
>
> [error] Compile failed at Aug 24, 2015 7:44:40 PM [0.020s]
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO] Reactor Summary:
>
> [INFO]
>
> [INFO] Spark Project Parent POM ........................... SUCCESS [  3.109 s]
>
> [INFO] Spark Project Launcher ............................. FAILURE [  4.493 s]
>
>
> On Fri, Aug 21, 2015 at 9:43 AM, Marcelo Vanzin <van...@cloudera.com> wrote:
>>
>> That was only true until Spark 1.3. Spark 1.4 can be built with JDK 7
>> and PySpark will still work.
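>>
>> For example, with JAVA_HOME pointing at a JDK 7 install (the path and
>> profiles below are illustrative; use whatever matches your setup):
>>
>>   export JAVA_HOME=/path/to/jdk1.7.0
>>   build/mvn -Pyarn -Phadoop-2.4 -DskipTests clean package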
>>
>> On Fri, Aug 21, 2015 at 8:29 AM, Chen Song <chen.song...@gmail.com> wrote:
>> > Thanks Sean.
>> >
>> > So how is PySpark supported? I thought PySpark needed JDK 1.6.
>> >
>> > Chen
>> >
>> > On Fri, Aug 21, 2015 at 11:16 AM, Sean Owen <so...@cloudera.com> wrote:
>> >>
>> >> Spark 1.4 requires Java 7.
>> >>
>> >>
>> >> On Fri, Aug 21, 2015, 3:12 PM Chen Song <chen.song...@gmail.com> wrote:
>> >>>
>> >>> I tried to build Spark 1.4.1 on CDH 5.4.0. Because we need to support
>> >>> PySpark, I used JDK 1.6.
>> >>>
>> >>> I got the following error:
>> >>>
>> >>> [INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first) @ spark-streaming_2.10 ---
>> >>>
>> >>> java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
>> >>> at java.lang.ClassLoader.defineClass1(Native Method)
>> >>> at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
>> >>> at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
>> >>> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> >>>
>> >>> I know that is because the Hadoop jar for CDH 5.4.0 is built with JDK 7.
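>> >>>
>> >>> (To confirm, you can check the class file version directly; the jar
>> >>> name here is just an example for CDH 5.4.0:
>> >>>
>> >>>   javap -verbose -cp hadoop-common-2.6.0-cdh5.4.0.jar \
>> >>>       org.apache.hadoop.io.LongWritable | grep "major version"
>> >>>
>> >>> Major version 51 corresponds to Java 7.)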
>> >>> Has anyone done this before?
>> >>>
>> >>> Thanks,
>> >>>
>> >>> --
>> >>> Chen Song
>> >>>
>> >
>> >
>> >
>> > --
>> > Chen Song
>> >
>>
>>
>>
>> --
>> Marcelo
>>
>
