Github user astroshim commented on the issue:

    https://github.com/apache/zeppelin/pull/1318
  
    I got the following error when I tried to run Zeppelin with Spark 2.0 & Hadoop 2.7.
    ```
    ERROR [2016-08-16 16:43:33,121] ({pool-1-thread-3} Utils.java[invokeMethod]:40) -
    java.lang.reflect.InvocationTargetException
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
            at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
            at org.apache.zeppelin.spark.SparkInterpreter.createSparkSession(SparkInterpreter.java:345)
            at org.apache.zeppelin.spark.SparkInterpreter.getSparkSession(SparkInterpreter.java:218)
            at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:743)
            at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
            at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getProgress(LazyOpenInterpreter.java:110)
            at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer.getProgress(RemoteInterpreterServer.java:447)
            at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:1701)
            at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:1686)
            at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
            at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
            at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
            at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.conf.YarnConfiguration
            at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil.newConfiguration(YarnSparkHadoopUtil.scala:71)
            at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:54)
            at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil.<init>(YarnSparkHadoopUtil.scala:56)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
            at java.lang.Class.newInstance(Class.java:383)
            at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:414)
            at org.apache.spark.deploy.SparkHadoopUtil$.yarn$lzycompute(SparkHadoopUtil.scala:412)
            at org.apache.spark.deploy.SparkHadoopUtil$.yarn(SparkHadoopUtil.scala:412)
            at org.apache.spark.deploy.SparkHadoopUtil$.get(SparkHadoopUtil.scala:437)
            at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:2203)
            at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:104)
            at org.apache.spark.SparkEnv$.create(SparkEnv.scala:320)
            at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:165)
            at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:259)
            at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
            at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
            at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
            at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
            at scala.Option.getOrElse(Option.scala:121)
            at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
            ... 20 more
    ```
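    For what it's worth, a `NoClassDefFoundError` of the "Could not initialize class" form usually means the class was found but its static initializer failed, which fits a classpath with mismatched Hadoop versions. A quick sketch (paths assume a local zeppelin checkout) of checking which jars actually ship `YarnConfiguration`:
    ```
    # List the hadoop-yarn jars under the interpreter's lib directory
    # that contain the YarnConfiguration class:
    for jar in ./spark/target/lib/hadoop-yarn-*.jar; do
      unzip -l "$jar" | grep -q 'org/apache/hadoop/yarn/conf/YarnConfiguration.class' \
        && echo "$jar"
    done
    ```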
    
    
    My build command is:
    ```
    mvn clean package -Pspark-2.0 -Phadoop-2.7 -Dhadoop.version=2.7.2 -Pyarn -Ppyspark -Pscala-2.11 -DskipTests
    ```
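    As a sanity check, Maven can report which org.apache.hadoop artifacts it actually resolves under these profiles. A sketch, assuming the interpreter module is the `spark` directory shown in the listing below:
    ```
    # Print the resolved Hadoop dependency tree for the spark module,
    # using the same profiles and hadoop.version as the build above:
    mvn -pl spark dependency:tree -Dincludes=org.apache.hadoop \
        -Pspark-2.0 -Phadoop-2.7 -Dhadoop.version=2.7.2 -Pyarn -Ppyspark -Pscala-2.11
    ```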
    
    
    but the Hadoop libraries packaged for the Spark interpreter are:
    ```
    ~/zeppelin$ ls -al ./spark/target/lib/hadoop-*
    -rw-rw-r-- 1 nflabs nflabs   17385 Aug 16 23:52 ./spark/target/lib/hadoop-annotations-2.7.2.jar
    -rw-rw-r-- 1 nflabs nflabs   49750 Aug 16 23:52 ./spark/target/lib/hadoop-auth-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs    2559 Aug 16 23:52 ./spark/target/lib/hadoop-client-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs 2735584 Aug 16 23:52 ./spark/target/lib/hadoop-common-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs 5242252 Aug 16 23:52 ./spark/target/lib/hadoop-hdfs-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs  482042 Aug 16 23:52 ./spark/target/lib/hadoop-mapreduce-client-app-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs  656365 Aug 16 23:52 ./spark/target/lib/hadoop-mapreduce-client-common-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs 1455001 Aug 16 23:52 ./spark/target/lib/hadoop-mapreduce-client-core-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs   35216 Aug 16 23:52 ./spark/target/lib/hadoop-mapreduce-client-jobclient-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs   21537 Aug 16 23:52 ./spark/target/lib/hadoop-mapreduce-client-shuffle-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs 2015575 Aug 16 23:52 ./spark/target/lib/hadoop-yarn-api-2.7.2.jar
    -rw-rw-r-- 1 nflabs nflabs   94728 Aug 16 23:52 ./spark/target/lib/hadoop-yarn-client-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs 1301627 Aug 16 23:52 ./spark/target/lib/hadoop-yarn-common-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs  175554 Aug 16 23:52 ./spark/target/lib/hadoop-yarn-server-common-2.2.0.jar
    -rw-rw-r-- 1 nflabs nflabs   25710 Aug 16 23:52 ./spark/target/lib/hadoop-yarn-server-web-proxy-2.2.0.jar
    ```
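    A one-liner makes the version mix in the listing above obvious at a glance:
    ```
    # Extract the distinct version suffixes from the jar names:
    ls ./spark/target/lib/hadoop-*.jar \
      | sed -E 's/.*-([0-9]+\.[0-9]+\.[0-9]+)\.jar$/\1/' | sort -u
    # prints 2.2.0 and 2.7.2 given the listing above
    ```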
    
    Maybe the error occurs because of the mixed versions of the Hadoop libraries: despite `-Dhadoop.version=2.7.2`, most of the jars are still 2.2.0, while `hadoop-annotations` and `hadoop-yarn-api` are 2.7.2.
    I will make a PR for this.

