Re: build spark 1.4.1 with JDK 1.6

2015-08-25 Thread Eric Friedman
Well, this is very strange.  My only change is to add -X to
make-distribution and it succeeds:

% git diff
  (spark/spark)

diff --git a/make-distribution.sh b/make-distribution.sh
index a2b0c43..351fac2 100755
--- a/make-distribution.sh
+++ b/make-distribution.sh
@@ -183,7 +183,7 @@ export MAVEN_OPTS=-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
 # Store the command as an array because $MVN variable might have spaces in it.
 # Normal quoting tricks don't work.
 # See: http://mywiki.wooledge.org/BashFAQ/050
-BUILD_COMMAND=("$MVN" clean package -DskipTests $@)
+BUILD_COMMAND=("$MVN" -X clean package -DskipTests $@)

 # Actually build the jar
 echo -e "\nBuilding with..."

export JAVA_HOME=/Library/Java/Home

% which javac
/usr/bin/javac

% javac -version
javac 1.7.0_79




On Mon, Aug 24, 2015 at 11:30 PM, Sean Owen so...@cloudera.com wrote:

 -cdh-user

 This suggests that Maven is still using Java 6. I think this is indeed
 controlled by JAVA_HOME. Use 'mvn -X ...' to see a lot more about what
 is being used and why. I still suspect JAVA_HOME is not visible to the
 Maven process. Or maybe you have JRE 7 installed but not JDK 7 and
 it's somehow still finding the Java 6 javac.

 On Tue, Aug 25, 2015 at 3:45 AM, Eric Friedman
 eric.d.fried...@gmail.com wrote:
  I'm trying to build Spark 1.4 with Java 7 and despite having that as my
  JAVA_HOME, I get
 
  [INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @
  spark-launcher_2.10 ---
 
  [INFO] Using zinc server for incremental compilation
 
  [info] Compiling 8 Java sources to
  /Users/eric/spark/spark/launcher/target/scala-2.10/classes...
 
  [error] javac: invalid source release: 1.7
 
  [error] Usage: javac <options> <source files>
 
  [error] use -help for a list of possible options
 
  [error] Compile failed at Aug 24, 2015 7:44:40 PM [0.020s]
 
  [INFO]
  
 
  [INFO] Reactor Summary:
 
  [INFO]
 
  [INFO] Spark Project Parent POM ... SUCCESS [  3.109 s]

  [INFO] Spark Project Launcher . FAILURE [  4.493 s]
 
 
  On Fri, Aug 21, 2015 at 9:43 AM, Marcelo Vanzin van...@cloudera.com
 wrote:
 
  That was only true until Spark 1.3. Spark 1.4 can be built with JDK7
  and pyspark will still work.
 
  On Fri, Aug 21, 2015 at 8:29 AM, Chen Song chen.song...@gmail.com
 wrote:
   Thanks Sean.
  
   So how is PySpark supported? I thought PySpark needs JDK 1.6.
  
   Chen
  
   On Fri, Aug 21, 2015 at 11:16 AM, Sean Owen so...@cloudera.com
 wrote:
  
   Spark 1.4 requires Java 7.
  
  
   On Fri, Aug 21, 2015, 3:12 PM Chen Song chen.song...@gmail.com
 wrote:
  
   I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to
 support
   PySpark, I used JDK 1.6.
  
   I got the following error,
  
   [INFO] --- scala-maven-plugin:3.2.0:testCompile
   (scala-test-compile-first) @ spark-streaming_2.10 ---
  
    java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
   at java.lang.ClassLoader.defineClass1(Native Method)
   at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
   at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
  
    I know this is because the hadoop jar for cdh5.4.0 is built with JDK 7.
    Has anyone done this before?
  
   Thanks,
  
   --
   Chen Song
  
  
  
  
   --
   Chen Song
  
 
 
 
  --
  Marcelo
 
 
 



Re: build spark 1.4.1 with JDK 1.6

2015-08-25 Thread Rick Moritz
A quick question regarding this: how come the artifacts (spark-core in
particular) on Maven Central are built with JDK 1.6 (according to the
manifest), if Java 7 is required?
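
For reference, the Build-Jdk entry can be read straight out of the published jar's manifest. A minimal sketch, assuming the artifact is already in the local Maven repository (the path and version below are only examples):

  cd ~/.m2/repository/org/apache/spark/spark-core_2.10/1.4.1
  unzip -p spark-core_2.10-1.4.1.jar META-INF/MANIFEST.MF | grep -iE 'build-jdk|created-by'
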
On Aug 21, 2015 5:32 PM, Sean Owen so...@cloudera.com wrote:

 Spark 1.4 requires Java 7.

 On Fri, Aug 21, 2015, 3:12 PM Chen Song chen.song...@gmail.com wrote:

 I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
 PySpark, I used JDK 1.6.

 I got the following error,

 [INFO] --- scala-maven-plugin:3.2.0:testCompile
 (scala-test-compile-first) @ spark-streaming_2.10 ---

 java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
 at java.lang.ClassLoader.defineClass1(Native Method)
 at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
 at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
 at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)

 I know this is because the hadoop jar for cdh5.4.0 is built with JDK 7.
 Has anyone done this before?

 Thanks,

 --
 Chen Song




Re: build spark 1.4.1 with JDK 1.6

2015-08-25 Thread Sean Owen
-cdh-user

This suggests that Maven is still using Java 6. I think this is indeed
controlled by JAVA_HOME. Use 'mvn -X ...' to see a lot more about what
is being used and why. I still suspect JAVA_HOME is not visible to the
Maven process. Or maybe you have JRE 7 installed but not JDK 7 and
it's somehow still finding the Java 6 javac.
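
A minimal sketch of that check, assuming an OS X environment like Eric's (the java_home call is Mac-specific; on Linux just point JAVA_HOME at the JDK 7 install directory):

  export JAVA_HOME=$(/usr/libexec/java_home -v 1.7)
  echo $JAVA_HOME
  mvn -version                    # the "Java home:" line shows the JDK the Maven process runs on
  $JAVA_HOME/bin/javac -version   # confirms a full JDK 7 (javac present), not just a JRE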

On Tue, Aug 25, 2015 at 3:45 AM, Eric Friedman
eric.d.fried...@gmail.com wrote:
 I'm trying to build Spark 1.4 with Java 7 and despite having that as my
 JAVA_HOME, I get

 [INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @
 spark-launcher_2.10 ---

 [INFO] Using zinc server for incremental compilation

 [info] Compiling 8 Java sources to
 /Users/eric/spark/spark/launcher/target/scala-2.10/classes...

 [error] javac: invalid source release: 1.7

 [error] Usage: javac <options> <source files>

 [error] use -help for a list of possible options

 [error] Compile failed at Aug 24, 2015 7:44:40 PM [0.020s]

 [INFO]
 

 [INFO] Reactor Summary:

 [INFO]

 [INFO] Spark Project Parent POM ... SUCCESS [  3.109 s]

 [INFO] Spark Project Launcher . FAILURE [  4.493 s]


 On Fri, Aug 21, 2015 at 9:43 AM, Marcelo Vanzin van...@cloudera.com wrote:

 That was only true until Spark 1.3. Spark 1.4 can be built with JDK7
 and pyspark will still work.

 On Fri, Aug 21, 2015 at 8:29 AM, Chen Song chen.song...@gmail.com wrote:
  Thanks Sean.
 
  So how is PySpark supported? I thought PySpark needs JDK 1.6.
 
  Chen
 
  On Fri, Aug 21, 2015 at 11:16 AM, Sean Owen so...@cloudera.com wrote:
 
  Spark 1.4 requires Java 7.
 
 
  On Fri, Aug 21, 2015, 3:12 PM Chen Song chen.song...@gmail.com wrote:
 
  I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
  PySpark, I used JDK 1.6.
 
  I got the following error,
 
  [INFO] --- scala-maven-plugin:3.2.0:testCompile
  (scala-test-compile-first) @ spark-streaming_2.10 ---
 
   java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
  at java.lang.ClassLoader.defineClass1(Native Method)
  at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
  at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
   at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
 
   I know this is because the hadoop jar for cdh5.4.0 is built with JDK 7.
   Has anyone done this before?
 
  Thanks,
 
  --
  Chen Song
 
 
 
 
  --
  Chen Song
 



 --
 Marcelo







Re: build spark 1.4.1 with JDK 1.6

2015-08-25 Thread Sean Owen
Hm... off the cuff I wonder if this is because somehow the build
process ran Maven with Java 6 but forked the Java/Scala compilers and
those used JDK 7. Or some later repackaging process ran on the
artifacts and used Java 6. I do see Build-Jdk: 1.6.0_45 in the
manifest, but I don't think 1.4.x can compile with Java 6.
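
As far as I know, Build-Jdk only records the java.version of the JVM that ran Maven, not the compiler that produced the classes; the bytecode version inside the jar tells the latter. A rough sketch of checking both (the jar path and chosen class are just examples):

  unzip -p spark-core_2.10-1.4.1.jar META-INF/MANIFEST.MF | grep Build-Jdk
  javap -verbose -classpath spark-core_2.10-1.4.1.jar org.apache.spark.SparkContext | grep 'major version'
  # major version: 50 = Java 6 bytecode, 51 = Java 7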

On Tue, Aug 25, 2015 at 9:59 PM, Rick Moritz rah...@gmail.com wrote:
 A quick question regarding this: how come the artifacts (spark-core in
 particular) on Maven Central are built with JDK 1.6 (according to the
 manifest), if Java 7 is required?

 On Aug 21, 2015 5:32 PM, Sean Owen so...@cloudera.com wrote:

 Spark 1.4 requires Java 7.


 On Fri, Aug 21, 2015, 3:12 PM Chen Song chen.song...@gmail.com wrote:

 I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
 PySpark, I used JDK 1.6.

 I got the following error,

 [INFO] --- scala-maven-plugin:3.2.0:testCompile
 (scala-test-compile-first) @ spark-streaming_2.10 ---

  java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
 at java.lang.ClassLoader.defineClass1(Native Method)
 at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
 at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
  at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)

  I know this is because the hadoop jar for cdh5.4.0 is built with JDK 7.
  Has anyone done this before?

 Thanks,

 --
 Chen Song






Re: build spark 1.4.1 with JDK 1.6

2015-08-25 Thread Rick Moritz
My local build using rc-4 and Java 7 does actually produce different binaries
(for one file only) from the 1.4.0 release artifact available on Central. Those
binaries decompile to identical instructions, though, so the difference may be
due to different versions of javac (within the 7 family) producing different
output, rather than a major-version difference between build environments.
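
One way to do that kind of comparison is to disassemble the same class from both jars and diff the output. A sketch only, with placeholder paths and a placeholder class name rather than the actual file that differed:

  javap -c -p -classpath local-build/spark-core_2.10-1.4.0.jar org.apache.spark.SomeClass > local.txt
  javap -c -p -classpath central/spark-core_2.10-1.4.0.jar org.apache.spark.SomeClass > central.txt
  diff local.txt central.txt   # an empty diff means identical instructions, even if the .class bytes differ
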
On Aug 25, 2015 11:10 PM, Sean Owen so...@cloudera.com wrote:

 Hm... off the cuff I wonder if this is because somehow the build
 process ran Maven with Java 6 but forked the Java/Scala compilers and
 those used JDK 7. Or some later repackaging process ran on the
 artifacts and used Java 6. I do see Build-Jdk: 1.6.0_45 in the
 manifest, but I don't think 1.4.x can compile with Java 6.

 On Tue, Aug 25, 2015 at 9:59 PM, Rick Moritz rah...@gmail.com wrote:
  A quick question regarding this: how come the artifacts (spark-core in
  particular) on Maven Central are built with JDK 1.6 (according to the
  manifest), if Java 7 is required?
 
  On Aug 21, 2015 5:32 PM, Sean Owen so...@cloudera.com wrote:
 
  Spark 1.4 requires Java 7.
 
 
  On Fri, Aug 21, 2015, 3:12 PM Chen Song chen.song...@gmail.com wrote:
 
  I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
  PySpark, I used JDK 1.6.
 
  I got the following error,
 
  [INFO] --- scala-maven-plugin:3.2.0:testCompile
  (scala-test-compile-first) @ spark-streaming_2.10 ---
 
  java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
  at java.lang.ClassLoader.defineClass1(Native Method)
  at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
  at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
  at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
 
  I know this is because the hadoop jar for cdh5.4.0 is built with JDK 7.
  Has anyone done this before?
 
  Thanks,
 
  --
  Chen Song
 
 



Re: build spark 1.4.1 with JDK 1.6

2015-08-24 Thread Eric Friedman
I'm trying to build Spark 1.4 with Java 7 and despite having that as my
JAVA_HOME, I get

[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @
spark-launcher_2.10 ---

[INFO] Using zinc server for incremental compilation

[info] Compiling 8 Java sources to
/Users/eric/spark/spark/launcher/target/scala-2.10/classes...

[error] javac: invalid source release: 1.7

[error] Usage: javac <options> <source files>

[error] use -help for a list of possible options

[error] Compile failed at Aug 24, 2015 7:44:40 PM [0.020s]

[INFO]


[INFO] Reactor Summary:

[INFO]

[INFO] Spark Project Parent POM ... SUCCESS [  3.109 s]

[INFO] Spark Project Launcher . FAILURE [  4.493 s]

On Fri, Aug 21, 2015 at 9:43 AM, Marcelo Vanzin van...@cloudera.com wrote:

 That was only true until Spark 1.3. Spark 1.4 can be built with JDK7
 and pyspark will still work.

 On Fri, Aug 21, 2015 at 8:29 AM, Chen Song chen.song...@gmail.com wrote:
  Thanks Sean.
 
  So how is PySpark supported? I thought PySpark needs JDK 1.6.
 
  Chen
 
  On Fri, Aug 21, 2015 at 11:16 AM, Sean Owen so...@cloudera.com wrote:
 
  Spark 1.4 requires Java 7.
 
 
  On Fri, Aug 21, 2015, 3:12 PM Chen Song chen.song...@gmail.com wrote:
 
  I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
  PySpark, I used JDK 1.6.
 
  I got the following error,
 
  [INFO] --- scala-maven-plugin:3.2.0:testCompile
  (scala-test-compile-first) @ spark-streaming_2.10 ---
 
  java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
  at java.lang.ClassLoader.defineClass1(Native Method)
  at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
  at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
  at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
 
  I know this is because the hadoop jar for cdh5.4.0 is built with JDK 7.
  Has anyone done this before?
 
  Thanks,
 
  --
  Chen Song
 
 
 
 
  --
  Chen Song
 



 --
 Marcelo





Re: build spark 1.4.1 with JDK 1.6

2015-08-21 Thread Chen Song
Thanks Sean.

So how is PySpark supported? I thought PySpark needs JDK 1.6.

Chen

On Fri, Aug 21, 2015 at 11:16 AM, Sean Owen so...@cloudera.com wrote:

 Spark 1.4 requires Java 7.

 On Fri, Aug 21, 2015, 3:12 PM Chen Song chen.song...@gmail.com wrote:

 I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
 PySpark, I used JDK 1.6.

 I got the following error,

 [INFO] --- scala-maven-plugin:3.2.0:testCompile
 (scala-test-compile-first) @ spark-streaming_2.10 ---

 java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
 at java.lang.ClassLoader.defineClass1(Native Method)
 at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
 at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
 at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)

 I know this is because the hadoop jar for cdh5.4.0 is built with JDK 7.
 Has anyone done this before?

 Thanks,

 --
 Chen Song




-- 
Chen Song


Re: build spark 1.4.1 with JDK 1.6

2015-08-21 Thread Marcelo Vanzin
That was only true until Spark 1.3. Spark 1.4 can be built with JDK7
and pyspark will still work.

On Fri, Aug 21, 2015 at 8:29 AM, Chen Song chen.song...@gmail.com wrote:
 Thanks Sean.

 So how is PySpark supported? I thought PySpark needs JDK 1.6.

 Chen

 On Fri, Aug 21, 2015 at 11:16 AM, Sean Owen so...@cloudera.com wrote:

 Spark 1.4 requires Java 7.


 On Fri, Aug 21, 2015, 3:12 PM Chen Song chen.song...@gmail.com wrote:

 I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
 PySpark, I used JDK 1.6.

 I got the following error,

 [INFO] --- scala-maven-plugin:3.2.0:testCompile
 (scala-test-compile-first) @ spark-streaming_2.10 ---

 java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
 at java.lang.ClassLoader.defineClass1(Native Method)
 at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
 at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
 at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)

 I know this is because the hadoop jar for cdh5.4.0 is built with JDK 7.
 Has anyone done this before?

 Thanks,

 --
 Chen Song




 --
 Chen Song




-- 
Marcelo




build spark 1.4.1 with JDK 1.6

2015-08-21 Thread Chen Song
I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
PySpark, I used JDK 1.6.

I got the following error,

[INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first)
@ spark-streaming_2.10 ---

java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)

I know this is because the hadoop jar for cdh5.4.0 is built with JDK 7.
Has anyone done this before?
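
For what it's worth, the class version the error complains about can be confirmed directly from the jar. A sketch only; the jar name and location below are assumptions and depend on where the CDH jars live:

  unzip -o hadoop-common-2.6.0-cdh5.4.0.jar org/apache/hadoop/io/LongWritable.class -d /tmp/cdh-classes
  file /tmp/cdh-classes/org/apache/hadoop/io/LongWritable.class
  # "compiled Java class data, version 51.0" = Java 7 bytecode, which a JDK 1.6 build cannot load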

Thanks,

-- 
Chen Song


Re: build spark 1.4.1 with JDK 1.6

2015-08-21 Thread Sean Owen
Spark 1.4 requires Java 7.

On Fri, Aug 21, 2015, 3:12 PM Chen Song chen.song...@gmail.com wrote:

 I tried to build Spark 1.4.1 on cdh 5.4.0. Because we need to support
 PySpark, I used JDK 1.6.

 I got the following error,

 [INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first)
 @ spark-streaming_2.10 ---

 java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
 at java.lang.ClassLoader.defineClass1(Native Method)
 at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
 at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
 at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)

 I know this is because the hadoop jar for cdh5.4.0 is built with JDK 7.
 Has anyone done this before?

 Thanks,

 --
 Chen Song