Anyone know how to build and run Spark on JDK9?

2017-10-26 Thread Zhang, Liyun
Hi all:
I want to build Spark on JDK9 and test it with Hadoop in a JDK9 environment. I 
searched for JIRAs related to JDK9 and only found 
SPARK-13278<https://issues.apache.org/jira/browse/SPARK-13278>. Does this mean 
Spark can now build and run successfully on JDK9?


Best Regards
Kelly Zhang/Zhang,Liyun



RE: Anyone know how to build and run Spark on JDK9?

2017-10-26 Thread Zhang, Liyun
Thanks for your suggestion; it seems that Scala 2.12.4 supports JDK9.


Scala 2.12.4<https://github.com/scala/scala/releases/tag/v2.12.4> is now 
available.

Our 
benchmarks<https://scala-ci.typesafe.com/grafana/dashboard/db/scala-benchmark?var-branch=2.12.x&from=1501580691158&to=1507711932006>
 show a further reduction in compile times since 2.12.3 of 5-10%.

Improved Java 9 friendliness, with more to come!

Best Regards
Kelly Zhang/Zhang,Liyun





From: Reynold Xin [mailto:r...@databricks.com]
Sent: Friday, October 27, 2017 10:26 AM
To: Zhang, Liyun ; dev@spark.apache.org; 
u...@spark.apache.org
Subject: Re: Anyone know how to build and run Spark on JDK9?

It probably depends on the Scala version we use in Spark supporting Java 9 
first.

On Thu, Oct 26, 2017 at 7:22 PM Zhang, Liyun <liyun.zh...@intel.com> wrote:
Hi all:
I want to build Spark on JDK9 and test it with Hadoop in a JDK9 environment. I 
searched for JIRAs related to JDK9 and only found 
SPARK-13278<https://issues.apache.org/jira/browse/SPARK-13278>. Does this mean 
Spark can now build and run successfully on JDK9?


Best Regards
Kelly Zhang/Zhang,Liyun



Does anyone know how to build Spark with Scala 2.12.4?

2017-11-27 Thread Zhang, Liyun
Hi all:
  Does anyone know how to build Spark with Scala 2.12.4? I want to test whether 
Spark can work on JDK9 or not; Scala 2.12.4 supports JDK9. Has anyone tried to 
build Spark with Scala 2.12.4, or compiled it successfully with JDK9? I would 
appreciate any feedback.


Best Regards
Kelly Zhang/Zhang,Liyun



RE: Does anyone know how to build Spark with Scala 2.12.4?

2017-11-29 Thread Zhang, Liyun
Hi Sean:
  I have tried to use the following script to build a package but hit a problem 
(I am building a Spark package for Hive on Spark, so I use the 
hadoop2-without-hive name):

./dev/make-distribution.sh --name hadoop2-without-hive --tgz -Pscala-2.12 \
  -Phadoop-2.7 -Pyarn -Pparquet-provided -Dhadoop.version=2.7.3


The problem is:

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal 
on project spark-sketch_2.11: Could not resolve dependencies for project 
org.apache.spark:spark-sketch_2.11:jar:2.3.0-SNAPSHOT: The following 
artifacts could not be resolved: 
org.apache.spark:spark-tags_2.12:jar:2.3.0-SNAPSHOT, 
org.apache.spark:spark-tags_2.12:jar:tests:2.3.0-SNAPSHOT: Failure to find 
org.apache.spark:spark-tags_2.12:jar:2.3.0-SNAPSHOT in 
https://repository.apache.org/snapshots was cached in the local repository,



I then found that the artifactId in $SPARK_SOURCE/common/tags/pom.xml is 
spark-tags_2.11. After I changed $SPARK_SOURCE/common/tags/pom.xml as follows, 
the problem with org.apache.spark:spark-tags_2.12:jar:2.3.0-SNAPSHOT no longer 
appeared:

git diff common/tags/pom.xml
diff --git a/common/tags/pom.xml b/common/tags/pom.xml
index f7e586e..5f48105 100644
--- a/common/tags/pom.xml
+++ b/common/tags/pom.xml
@@ -26,7 +26,8 @@
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
-  <artifactId>spark-tags_2.11</artifactId>
+
+  <artifactId>spark-tags_2.12</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project Tags</name>
   <url>http://spark.apache.org/</url>


My question is:

1.   Should I change $SPARK_SOURCE/common/tags/pom.xml to spark-tags_2.12 
manually? If so, I guess I would also need to change the other modules, e.g. 
spark-streaming (see the sketch below).
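
For reference, a sketch of what I plan to try instead of hand-editing each 
pom.xml (assuming dev/change-scala-version.sh in the source tree does what its 
name suggests; untested on my side):

# presumably rewrites the _2.11 artifactIds to _2.12 across all modules
./dev/change-scala-version.sh 2.12

# then build with the 2.12 profile as before
./dev/make-distribution.sh --name hadoop2-without-hive --tgz -Pscala-2.12 \
  -Phadoop-2.7 -Pyarn -Pparquet-provided -Dhadoop.version=2.7.3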





I would appreciate any feedback.




Best Regards
Kelly Zhang/Zhang,Liyun







From: Sean Owen [mailto:so...@cloudera.com]
Sent: Tuesday, November 28, 2017 9:52 PM
To: Ofir Manor 
Cc: Zhang, Liyun ; dev 
Subject: Re: Does anyone know how to build Spark with Scala 2.12.4?

The Scala 2.12 profile mostly works, but not all tests pass. Use -Pscala-2.12 
on the command line to build.

On Tue, Nov 28, 2017 at 5:36 AM Ofir Manor <ofir.ma...@equalum.io> wrote:
Hi,
as far as I know, Spark does not support Scala 2.12.
There is ongoing work to refactor / fix the Spark source code to support Scala 
2.12 - look for multiple emails on this list in the last few months from Sean 
Owen on his progress.
Once Spark supports Scala 2.12, I think the next target would be JDK 9 support.


Ofir Manor

Co-Founder & CTO | Equalum

Mobile: +972-54-7801286 | Email: ofir.ma...@equalum.io

On Tue, Nov 28, 2017 at 9:20 AM, Zhang, Liyun <liyun.zh...@intel.com> wrote:
Hi all:
  Does anyone know how to build Spark with Scala 2.12.4? I want to test whether 
Spark can work on JDK9 or not; Scala 2.12.4 supports JDK9. Has anyone tried to 
build Spark with Scala 2.12.4, or compiled it successfully with JDK9? I would 
appreciate any feedback.


Best Regards
Kelly Zhang/Zhang,Liyun




Anyone know how to bypass the tools.jar problem in JDK9 when running mvn clean install on the Spark code?

2017-12-21 Thread Zhang, Liyun
Hi all:
  Now I am using JDK9 to compile Spark (mvn clean install -DskipTests), but an 
exception was thrown:

[root@bdpe41 spark_source]# java -version
java version "9.0.1"
Java(TM) SE Runtime Environment (build 9.0.1+11)
Java HotSpot(TM) 64-Bit Server VM (build 9.0.1+11, mixed mode)

# mvn clean install -Pscala-2.12 -Pyarn -Pparquet-provided -DskipTests -X > log.mvn.clean.installed 2>&1

[INFO] Spark Project Hive ......................... SUCCESS [ 30.708 s]
[INFO] Spark Project REPL ......................... SUCCESS [  2.795 s]
[INFO] Spark Project YARN Shuffle Service ......... SUCCESS [  6.411 s]
[INFO] Spark Project YARN ......................... FAILURE [  0.047 s]
[INFO] Spark Project Assembly ..................... SKIPPED
[INFO] Spark Integration for Kafka 0.10 ........... SKIPPED
[INFO] Kafka 0.10 Source for Structured Streaming . SKIPPED
[INFO] Spark Project Examples ..................... SKIPPED
[INFO] Spark Integration for Kafka 0.10 Assembly .. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 07:04 min
[INFO] Finished at: 2017-12-21T03:38:40+08:00
[INFO] Final Memory: 85M/280M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project spark-yarn_2.12: Could not resolve 
dependencies for project org.apache.spark:spark-yarn_2.12:jar:2.3.0-SNAPSHOT: 
Could not find artifact jdk.tools:jdk.tools:jar:1.6 at specified path 
/home/zly/prj/oss/jdk-9.0.1/../lib/tools.jar -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal 
on project spark-yarn_2.12: Could not resolve dependencies for project 
org.apache.spark:spark-yarn_2.12:jar:2.3.0-SNAPSHOT: Could not find artifact 
jdk.tools:jdk.tools:jar:1.6 at specified path 
/home/zly/prj/oss/jdk-9.0.1/../lib/tools.jar


There is no tools.jar in 
JDK9<http://blog.codefx.org/java/dev/how-java-9-and-project-jigsaw-may-break-your-code/#Internal-JARs-Become-Unavailable>. 
I need to install a Spark 2.3.0-SNAPSHOT in my local Maven repository to build 
another component (Hive on Spark). Does anyone know how to bypass this problem?
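
One workaround I am considering, sketched below (hypothetical and untested): the 
jdk.tools:jdk.tools artifact is a system-scope dependency that the Hadoop 2.7 
poms declare (reportedly via hadoop-annotations), so excluding it from the 
Hadoop dependencies in the YARN module's pom might let the build proceed 
without tools.jar. The hadoop-yarn-api dependency below is only an example of 
where such an exclusion could go:

<!-- Hypothetical, untested edit to resource-managers/yarn/pom.xml:
     exclude the tools.jar system dependency pulled in via Hadoop 2.7,
     since JDK9 no longer ships lib/tools.jar. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-yarn-api</artifactId>
  <exclusions>
    <exclusion>
      <groupId>jdk.tools</groupId>
      <artifactId>jdk.tools</artifactId>
    </exclusion>
  </exclusions>
</dependency>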




Best Regards
Kelly Zhang/Zhang,Liyun



FW: Anyone know how to bypass the tools.jar problem in JDK9 when running mvn clean install on the Spark code?

2017-12-21 Thread Zhang, Liyun
Hi Sean:

Thanks for your reply. You mentioned that “You need to run 
./dev/change-scala-version.sh 2.12 first”; I have done that but still hit this 
problem. Currently the issue is that tools.jar does not exist in JDK9, but some 
dependencies still reference it when running “mvn clean install -DskipTests”.



Best Regards
Kelly Zhang/Zhang,Liyun


From: Sean Owen [mailto:so...@cloudera.com]
Sent: Friday, December 22, 2017 6:56 AM
To: Zhang, Liyun 
Cc: dev@spark.apache.org
Subject: Re: Anyone know how to bypass the tools.jar problem in JDK9 when 
running mvn clean install on the Spark code?

You need to run ./dev/change-scala-version.sh 2.12 first

On Thu, Dec 21, 2017 at 4:38 PM Zhang, Liyun <liyun.zh...@intel.com> wrote:
Hi all:
  Now I am using JDK9 to compile Spark (mvn clean install -DskipTests), but an 
exception was thrown:

[root@bdpe41 spark_source]# java -version
java version "9.0.1"
Java(TM) SE Runtime Environment (build 9.0.1+11)
Java HotSpot(TM) 64-Bit Server VM (build 9.0.1+11, mixed mode)

# mvn clean install -Pscala-2.12 -Pyarn -Pparquet-provided -DskipTests -X > log.mvn.clean.installed 2>&1

[INFO] Spark Project Hive ......................... SUCCESS [ 30.708 s]
[INFO] Spark Project REPL ......................... SUCCESS [  2.795 s]
[INFO] Spark Project YARN Shuffle Service ......... SUCCESS [  6.411 s]
[INFO] Spark Project YARN ......................... FAILURE [  0.047 s]
[INFO] Spark Project Assembly ..................... SKIPPED
[INFO] Spark Integration for Kafka 0.10 ........... SKIPPED
[INFO] Kafka 0.10 Source for Structured Streaming . SKIPPED
[INFO] Spark Project Examples ..................... SKIPPED
[INFO] Spark Integration for Kafka 0.10 Assembly .. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 07:04 min
[INFO] Finished at: 2017-12-21T03:38:40+08:00
[INFO] Final Memory: 85M/280M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project spark-yarn_2.12: Could not resolve 
dependencies for project org.apache.spark:spark-yarn_2.12:jar:2.3.0-SNAPSHOT: 
Could not find artifact jdk.tools:jdk.tools:jar:1.6 at specified path 
/home/zly/prj/oss/jdk-9.0.1/../lib/tools.jar -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal 
on project spark-yarn_2.12: Could not resolve dependencies for project 
org.apache.spark:spark-yarn_2.12:jar:2.3.0-SNAPSHOT: Could not find artifact 
jdk.tools:jdk.tools:jar:1.6 at specified path 
/home/zly/prj/oss/jdk-9.0.1/../lib/tools.jar

There is no tools.jar in 
JDK9<http://blog.codefx.org/java/dev/how-java-9-and-project-jigsaw-may-break-your-code/#Internal-JARs-Become-Unavailable>. 
I need to install a Spark 2.3.0-SNAPSHOT in my local Maven repository to build 
another component (Hive on Spark). Does anyone know how to bypass this problem?




Best Regards
Kelly Zhang/Zhang,Liyun



Is there any API in Spark like getInstance(className: String): AnyRef?

2015-03-11 Thread Zhang, Liyun
Hi all:
  I'm a newbie to Spark and Scala, and I am now working on 
SPARK-5682<https://issues.apache.org/jira/browse/SPARK-5682> (add encrypted 
shuffle in Spark). I ran into a question: is there any API in Spark like 
getInstance(className: String): AnyRef? I looked at 
org.apache.spark.sql.hive.thriftserver.ReflectionUtils, but it does not provide 
a getInstance function.

At the moment I can only implement this function with the following code:
object ReflectionUtils1 {
  import scala.reflect.runtime.universe

  // Plain abstract class: inheriting from a case class is deprecated.
  abstract class CryptoCodec

  class JceAesCtrCryptoCodec extends CryptoCodec

  class OpensslAesCtrCryptoCodec extends CryptoCodec

  def main(args: Array[String]): Unit = {
    val className: String = "JceAesCtrCryptoCodec"
    val obj = getInstance(className)
    val codec: CryptoCodec = obj.asInstanceOf[CryptoCodec]
    println(codec)
  }

  def getInstance(className: String): AnyRef = {
    val m = universe.runtimeMirror(getClass.getClassLoader)
    var c: CryptoCodec = null
    if (className.equals("JceAesCtrCryptoCodec")) {
      // Reflect the statically known type and invoke its no-arg constructor.
      val classCryptoCodec = universe.typeOf[JceAesCtrCryptoCodec].typeSymbol.asClass
      val cm = m.reflectClass(classCryptoCodec)
      val ctor = universe.typeOf[JceAesCtrCryptoCodec].declaration(
        universe.nme.CONSTRUCTOR).asMethod
      val ctorm = cm.reflectConstructor(ctor)
      val p = ctorm()
      c = p.asInstanceOf[CryptoCodec]
    } else {
      val classCryptoCodec = universe.typeOf[OpensslAesCtrCryptoCodec].typeSymbol.asClass
      val cm = m.reflectClass(classCryptoCodec)
      val ctor = universe.typeOf[OpensslAesCtrCryptoCodec].declaration(
        universe.nme.CONSTRUCTOR).asMethod
      val ctorm = cm.reflectConstructor(ctor)
      val p = ctorm()
      c = p.asInstanceOf[CryptoCodec]
    }
    c
  }
}


In my getInstance(className: String) I compare the class name against 
"JceAesCtrCryptoCodec" and "OpensslAesCtrCryptoCodec", and if the name matches 
I create the instance through the scala.reflect.runtime.universe API. The code 
could be written more generically, along the following lines, but I do not 
know how to complete it:
def getInstance1(className: String): AnyRef = {
  val m = universe.runtimeMirror(getClass.getClassLoader)
  val classLoader: ClassLoader = Thread.currentThread.getContextClassLoader
  val aClass: Class[_] = Class.forName(className, true, classLoader)
  val aType: scala.reflect.api.TypeTags.TypeTag = // how to write this line?
  val classCryptoCodec = universe.typeOf[aType].typeSymbol.asClass
  val cm = m.reflectClass(classCryptoCodec)
  val ctor = universe.typeOf[aType].declaration(universe.nme.CONSTRUCTOR).asMethod
  val ctorm = cm.reflectConstructor(ctor)
  val p = ctorm()
  p
}
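
For reference, a minimal sketch of one possible answer (my own untested guess, 
not an existing Spark API; the names getInstance2 and getInstance3 are 
hypothetical): a TypeTag is materialized by the compiler at a call site, so a 
runtime String cannot produce one directly. Instead, the runtime mirror can 
turn the loaded java.lang.Class into a ClassSymbol, or plain Java reflection 
can be used when a public no-arg constructor is enough:

import scala.reflect.runtime.universe

// Untested sketch: instantiate a class from its fully qualified name without
// a TypeTag, by asking the runtime mirror for the ClassSymbol of the loaded
// java.lang.Class. (Classes nested in an object use JVM names such as
// "ReflectionUtils1$JceAesCtrCryptoCodec".)
def getInstance2(className: String): AnyRef = {
  val m = universe.runtimeMirror(getClass.getClassLoader)
  val classLoader = Thread.currentThread.getContextClassLoader
  val aClass: Class[_] = Class.forName(className, true, classLoader)
  val classSym = m.classSymbol(aClass)  // runtime Class -> ClassSymbol
  val cm = m.reflectClass(classSym)
  // Assumes a single (no-arg) constructor; overloaded constructors would need
  // to be disambiguated before calling asMethod.
  val ctor = classSym.toType.declaration(universe.nme.CONSTRUCTOR).asMethod
  cm.reflectConstructor(ctor)().asInstanceOf[AnyRef]
}

// Plain Java reflection, when a public no-arg constructor exists:
def getInstance3(className: String): AnyRef =
  Class.forName(className, true, Thread.currentThread.getContextClassLoader)
    .newInstance().asInstanceOf[AnyRef]

With either sketch, the result could then be cast with asInstanceOf[CryptoCodec] 
as in main above.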

Guidance/advice appreciated!



Best regards
Kelly Zhang/Zhang,Liyun