Re: Error starting spark interpreter with 0.9.0

2020-06-30 Thread Jeff Zhang
Which Spark version do you use? And could you check the Spark interpreter
log file? It is in ZEPPELIN_HOME/logs/zeppelin-interpreter-spark-*.log.
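A minimal sketch of pulling that log up from a shell (assuming
ZEPPELIN_HOME points at your install; the exact file name depends on the
interpreter group, user, and host, so the wildcard is only illustrative):

    # list the spark interpreter logs, newest first, then tail the latest one
    ls -lt "$ZEPPELIN_HOME"/logs/zeppelin-interpreter-spark-*.log
    tail -n 200 "$(ls -t "$ZEPPELIN_HOME"/logs/zeppelin-interpreter-spark-*.log | head -n 1)"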

On Tue, Jun 30, 2020 at 11:11 PM David Boyd wrote:

> All:
>
> Just trying to get 0.9.0 to work and running into all sorts of issues.
> Previously I had set SPARK_MASTER to be yarn-client   so it would use my
> existing yarn cluster.
> That threw an error about yarn-client being deprecated in 2.0.
> So I switched it to local.
> I now get the error about the interpreter not starting and the following
> output in the note:
>
> org.apache.zeppelin.interpreter.InterpreterException: java.io.IOException:
> Fail to launch interpreter process: ...

Error starting spark interpreter with 0.9.0

2020-06-30 Thread David Boyd

All:

Just trying to get 0.9.0 to work and running into all sorts of issues.
Previously I had set SPARK_MASTER to yarn-client so it would use my
existing YARN cluster. That threw an error about yarn-client being
deprecated in Spark 2.0, so I switched it to local (a sketch of the
equivalent Spark 2.x yarn settings follows the error output below).
I now get an error about the interpreter not starting, with the following
output in the note:


org.apache.zeppelin.interpreter.InterpreterException: java.io.IOException: Fail to launch interpreter process:
Interpreter launch command: /opt/spark/spark-current/bin/spark-submit --class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer --driver-class-path ":/opt/zeppelin/zeppelin-current/interpreter/spark/*::/opt/hadoop/hadoop-current/share/hadoop/common/sources/:/opt/hadoop/hadoop-current/share/hadoop/common/sources/:/opt/zeppelin/zeppelin-current/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT-shaded.jar /opt/zeppelin/zeppelin-current/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar:/opt/zeppelin/zeppelin-current/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar:/opt/hadoop/hadoop-current/etc/hadoop" --driver-java-options " -Dfile.encoding=UTF-8 -Dlog4j.configuration='file:///opt/zeppelin/zeppelin-current/conf/log4j.properties' -Dlog4j.configurationFile='file:///opt/zeppelin/zeppelin-current/conf/log4j2.properties' -Dzeppelin.log.file='/opt/zeppelin/zeppelin-current/logs/zeppelin-interpreter-spark-dspc_demo-zeppelin-dspcnode11.dspc.incadencecorp.com.log'" --driver-memory 4G --executor-memory 6G --conf spark\.serializer\=org\.apache\.spark\.serializer\.KryoSerializer --conf spark\.executor\.memory\=1G --conf spark\.app\.name\=Zeppelin --conf spark\.executor\.instances\=5 --conf spark\.master\=local\[\*\] --conf spark\.sql\.crossJoin\.enabled\=true --conf spark\.cores\.max\=10 /opt/zeppelin/zeppelin-current/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar 10.1.50.111 33591 "spark-dspc_demo" :
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/zeppelin/zeppelin-0.9.0-SNAPSHOT/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/spark/spark-2.4.3.bdp-1-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.open(RemoteInterpreter.java:134)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getFormType(RemoteInterpreter.java:281)
        at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:412)
        at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:72)
        at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
        at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
        at org.apache.zeppelin.scheduler.RemoteScheduler$JobRunner.run(RemoteScheduler.java:180)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Fail to launch interpreter process:
Interpreter launch command: /opt/spark/spark-current/bin/spark-submit --class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer --driver-class-path ":/opt/zeppelin/zeppelin-current/interpreter/spark/*::/opt/hadoop/hadoop-current/share/hadoop/common/sources/:/opt/hadoop/hadoop-current/share/hadoop/common/sources/:/opt/zeppelin/zeppelin-current/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT-shaded.jar /opt/zeppelin/zeppelin-current/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar:/opt/zeppelin/zeppelin-current/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar:/opt/hadoop/hadoop-current/etc/hadoop" --driver-java-options " -Dfile.encoding=UTF-8 -Dlog4j.configuration='file:///opt/zeppelin/zeppelin-current/conf/log4j.properties' -Dlog4j.configurationFile='file:///opt/zeppelin/zeppelin-current/conf/log4j2.properties' -Dzeppelin.log.file='/opt/zeppelin/zeppelin-current/logs/zeppelin-interpreter-spark-dspc_demo-zeppelin-dspcnode11.dspc.incadencecorp.com.log'" --driver-memory 4G --executor-memory 6G --conf spark\.serializer\=org\.apache\.spark\.serializer\.KryoSerializer --conf spark\.executor\.memory\=1G --conf spark\.app\.name\=Zeppelin --conf spark\.executor\.instances\=5 --conf spark\.master\=local\[\*\] --conf
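
Regarding the yarn-client deprecation mentioned at the top of this message:
on Spark 2.x the usual replacement is master=yarn together with client
deploy mode, rather than the old yarn-client master string. A minimal
sketch, assuming the stock zeppelin-env.sh variables and Spark property
names (whether these match this particular install is an assumption):

    # zeppelin-env.sh
    export MASTER=yarn                                  # instead of yarn-client
    export SPARK_SUBMIT_OPTIONS="--deploy-mode client"  # keep the driver in client mode

    # or, equivalently, set these in the spark interpreter properties:
    #   spark.master             yarn
    #   spark.submit.deployMode  client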