Hmm, possibly an issue with the classpath. These might be Windows-specific 
problems; we probably need to debug further to fix them.
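
One quick check worth doing (a rough sketch, assuming the default layout of the 
binary distribution under c:\zepp, and assuming interpreter.cmd picks its 
classpath up from the interpreter\spark and lib directories - worth verifying 
in the script itself): the scala/Option error below means the interpreter JVM 
cannot find the Scala runtime (scala-library), so listing the Scala jars in 
those directories may show what is missing, e.g.

    dir /b c:\zepp\interpreter\spark\*.jar | findstr /i scala
    dir /b c:\zepp\lib\*.jar | findstr /i scala

If no scala-library 2.11.x jar shows up in either location, that would explain 
the NoClassDefFoundError.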


________________________________
From: Jan Botorek <jan.boto...@infor.com>
Sent: Tuesday, November 29, 2016 4:01:43 AM
To: users@zeppelin.apache.org
Subject: RE: Unable to connect with Spark Interpreter

Your last advice helped me to progress a little bit:

-          I started the Spark interpreter manually

o   c:\zepp\\bin\interpreter.cmd, -d, c:\zepp\interpreter\spark\, -p, 61176, 
-l, c:\zepp\/local-repo/2C2ZNEH5W

o   I needed to add a '\' to the -d attribute and make the path shorter, so I 
moved Zeppelin to c:\zepp

-          Then, in the Zeppelin web UI, I set the Spark interpreter to 
"connect to existing process" (host localhost, port 61176)

-          After that, when I execute any command, this exception appears in 
the interpreter cmd window:

o   Exception in thread "pool-1-thread-2" java.lang.NoClassDefFoundError: scala/Option
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer.createInterpreter(RemoteInterpreterServer.java:148)
        at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$createInterpreter.getResult(RemoteInterpreterService.java:1409)
        at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$createInterpreter.getResult(RemoteInterpreterService.java:1394)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.ClassNotFoundException: scala.Option
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 11 more

Is this of any help?

Regards,
Jan



From: Jan Botorek [mailto:jan.boto...@infor.com]
Sent: Tuesday, November 29, 2016 12:13 PM
To: users@zeppelin.apache.org
Subject: RE: Unable to connect with Spark Interpreter

I am sorry, but the local-repo directory is not present in the Zeppelin folder. 
I use the newest binary version from https://zeppelin.apache.org/download.html.

Unfortunately, the local-repo folder doesn't exist in the 0.6 version 
downloaded and built from GitHub either.


From: Jeff Zhang [mailto:zjf...@gmail.com]
Sent: Tuesday, November 29, 2016 10:45 AM
To: users@zeppelin.apache.org<mailto:users@zeppelin.apache.org>
Subject: Re: Unable to connect with Spark Interpreter

I still don't see much useful info. Could you try running the following 
interpreter command directly?

c:\_libs\zeppelin-0.6.2-bin-all\\bin\interpreter.cmd  -d 
c:\_libs\zeppelin-0.6.2-bin-all\interpreter\spark -p 53099 -l 
c:\_libs\zeppelin-0.6.2-bin-all\/local-repo/2C2ZNEH5W
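
If it fails, it may help to capture the console output, e.g. by appending a 
plain cmd redirection such as

    > interpreter-out.txt 2>&1

to the command above (interpreter-out.txt is just a placeholder file name) and 
attaching that file here.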


Jan Botorek <jan.boto...@infor.com<mailto:jan.boto...@infor.com>> wrote on 
Tuesday, November 29, 2016 at 5:26 PM:
I attach the log file after turning debugging on.

From: Jeff Zhang [mailto:zjf...@gmail.com<mailto:zjf...@gmail.com>]
Sent: Tuesday, November 29, 2016 10:04 AM

To: users@zeppelin.apache.org<mailto:users@zeppelin.apache.org>
Subject: Re: Unable to connect with Spark Interpreter

Then I guess the Spark process failed to start, so there are no logs for the 
Spark interpreter.

Can you use the following log4j.properties? This log4j properties file prints 
more error info for further diagnosis.

log4j.rootLogger = INFO, dailyfile

log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%5p [%d] ({%t} %F[%M]:%L) - %m%n

log4j.appender.dailyfile.DatePattern=.yyyy-MM-dd
log4j.appender.dailyfile.Threshold = DEBUG
log4j.appender.dailyfile = org.apache.log4j.DailyRollingFileAppender
log4j.appender.dailyfile.File = ${zeppelin.log.file}
log4j.appender.dailyfile.layout = org.apache.log4j.PatternLayout
log4j.appender.dailyfile.layout.ConversionPattern=%5p [%d] ({%t} %F[%M]:%L) - %m%n


log4j.logger.org.apache.zeppelin.notebook.Paragraph=DEBUG
log4j.logger.org.apache.zeppelin.scheduler=DEBUG
log4j.logger.org.apache.zeppelin.livy=DEBUG
log4j.logger.org.apache.zeppelin.flink=DEBUG
log4j.logger.org.apache.zeppelin.interpreter.remote=DEBUG
log4j.logger.org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer=DEBUG
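
(Assuming the standard layout of the binary distribution, this would replace 
the log4j.properties under the conf directory of your Zeppelin home, for 
example

    c:\_libs\zeppelin-0.6.2-bin-all\conf\log4j.properties

and Zeppelin would need to be restarted for the new settings to take effect.)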



Jan Botorek <jan.boto...@infor.com<mailto:jan.boto...@infor.com>> wrote on 
Tuesday, November 29, 2016 at 4:57 PM:
If I start Zeppelin via zeppelin.cmd, only the Zeppelin log appears.
The interpreter log is created only when I start the interpreter manually, but 
it contains only the information that the interpreter was started (see my 
preceding mail with the attachment).

-          INFO [2016-11-29 08:43:59,757] ({Thread-0} RemoteInterpreterServer.java[run]:81) - Starting remote interpreter server on port 55492


From: Jeff Zhang [mailto:zjf...@gmail.com<mailto:zjf...@gmail.com>]
Sent: Tuesday, November 29, 2016 9:48 AM

To: users@zeppelin.apache.org<mailto:users@zeppelin.apache.org>
Subject: Re: Unable to connect with Spark Interpreter

According to your log, the Spark interpreter fails to start. Do you see any 
Spark interpreter log?



Jan Botorek <jan.boto...@infor.com<mailto:jan.boto...@infor.com>> wrote on 
Tuesday, November 29, 2016 at 4:08 PM:
Hello,
Thanks for the advice, but nothing seems to be wrong when I start the 
interpreter manually. I attach the logs from the interpreter and from Zeppelin.
This is the cmd output from the interpreter launched manually:

D:\zeppelin-0.6.2\bin> interpreter.cmd -d D:\zeppelin-0.6.2\interpreter\spark 
-p 55492
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; 
support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/zeppelin-0.6.2/interpreter/spark/zeppelin-spark_2.11-0.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/zeppelin-0.6.2/lib/zeppelin-interpreter-0.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

Could you please think of any possible next steps?

Best regards,
Jan

From: moon soo Lee [mailto:m...@apache.org<mailto:m...@apache.org>]
Sent: Monday, November 28, 2016 5:36 PM

To: users@zeppelin.apache.org<mailto:users@zeppelin.apache.org>
Subject: Re: Unable to connect with Spark Interpreter

According to your log, your interpreter process seems to have failed to start.
Check the following lines in your log.
You can try running the interpreter process manually and see why it is failing,
i.e. run

D:\zeppelin-0.6.2\bin\interpreter.cmd -d D:\zeppelin-0.6.2\interpreter\spark -p 
55492

-------

 INFO [2016-11-28 10:34:02,837] ({pool-1-thread-2} RemoteInterpreterProcess.java[reference]:148) - Run interpreter process [D:\zeppelin-0.6.2\bin\interpreter.cmd, -d, D:\zeppelin-0.6.2\interpreter\spark, -p, 55492, -l, D:\zeppelin-0.6.2/local-repo/2C36NT8YK]

 INFO [2016-11-28 10:34:03,491] ({Exec Default Executor} RemoteInterpreterProcess.java[onProcessFailed]:288) - Interpreter process failed {}

org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1)


On Mon, Nov 28, 2016 at 1:42 AM Jan Botorek 
<jan.boto...@infor.com<mailto:jan.boto...@infor.com>> wrote:
Hello again,
I am sorry, but has really nobody here run into the same issue?

I have now tried the new version (0.6.2), both the binary and the 
build-from-source versions, but the issue remains the same. I have tried it on 
several laptops and servers, always with the same result.

Do you have any idea what to check or repair, please?

Best regards,
Jan
From: Jan Botorek [mailto:jan.boto...@infor.com<mailto:jan.boto...@infor.com>]
Sent: Wednesday, November 16, 2016 12:54 PM
To: users@zeppelin.apache.org<mailto:users@zeppelin.apache.org>
Subject: RE: Unable to connect with Spark Interpreter

Hello Alexander,
Thank you for the quick response. Please see the server log attached. 
Unfortunately, I don't have any zeppelin-interpreter-spark*.log in the logs 
folder.

Questions:

-          It happens every time, even if I try to run several paragraphs

-          Yes, it keeps happening even if the interpreter is restarted
--
Jan

From: Alexander Bezzubov [mailto:b...@apache.org]
Sent: Wednesday, November 16, 2016 12:47 PM
To: users@zeppelin.apache.org<mailto:users@zeppelin.apache.org>
Subject: Re: Unable to connect with Spark Interpreter

Hi Jan,

this is a rather generic error saying that ZeppelinServer somehow could not 
connect to the interpreter process on your machine.

Could you please share more from logs/*, in particular the .out and .log of the 
Zeppelin server AND zeppelin-interpreter-spark*.log - usually this is enough to 
identify the reason.

Two more questions:
- does this happen on every paragraph run, e.g. if you click Run multiple times 
in a row?
- does it still happen if you restart the Spark interpreter manually from the 
GUI? ("Anonymous"->Interpreters->Spark->restart)

--
Alex

On Wed, Nov 16, 2016, 12:37 Jan Botorek 
<jan.boto...@infor.com<mailto:jan.boto...@infor.com>> wrote:
Hello,
I am not able to run any Spark code in Zeppelin. I tried the compiled versions 
of Zeppelin as well as compiling the source code on my own based on the steps 
at https://github.com/apache/zeppelin.
My configuration is Scala 2.11 and Spark 2.0.1. I also tried different versions 
of Zeppelin available on GitHub (master, 0.6, 0.5.6).

The result is always the same: Zeppelin starts, but when any code is run 
(e.g. “2 + 1”, “sc.version”), the following exception is thrown.

java.net.ConnectException: Connection refused: connect
        at java.net.DualStackPlainSocketImpl.connect0(Native Method)
        at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:79)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
        at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:589)
        at org.apache.thrift.transport.TSocket.open(TSocket.java:182)
        at org.apache.zeppelin.interpreter.remote.ClientFactory.create(ClientFactory.java:51)
        at org.apache.zeppelin.interpreter.remote.ClientFactory.create(ClientFactory.java:37)
        at org.apache.commons.pool2.BasePooledObjectFactory.makeObject(BasePooledObjectFactory.java:60)
        at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:861)
        at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:435)
        at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterProcess.getClient(RemoteInterpreterProcess.java:189)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.init(RemoteInterpreter.java:163)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getFormType(RemoteInterpreter.java:328)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getFormType(LazyOpenInterpreter.java:105)
        at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:260)
        at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
        at org.apache.zeppelin.scheduler.RemoteScheduler$JobRunner.run(RemoteScheduler.java:328)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

Based on googling and my own assumptions, there is something wrong with the 
Spark interpreter in relation to Zeppelin.
I also tried to connect the Spark interpreter to a Spark instance running 
externally (in Zeppelin's interpreter settings), but it didn't work.

Do you have any ideas about what could possibly be wrong?
Thank you for any help; any insights would be appreciated.

Best regards,
Jan Botorek
