Github user bzz commented on the issue:
https://github.com/apache/zeppelin/pull/928
Let's focus on fixing CI profiles one by one.
The [1st
profile](https://s3.amazonaws.com/archive.travis-ci.org/jobs/141500057/log.txt)
failure is now consistent with the previous run (which is a good sign - it's
reproducible!) and is due to failing tests in `ZeppelinSparkClusterTest`:
```
01:34:08,427 INFO org.apache.zeppelin.server.ZeppelinServer:133 - Bye
Results :
Failed tests:
ZeppelinSparkClusterTest.pySparkDepLoaderTest:231 expected:<FINISHED> but
was:<ERROR>
ZeppelinSparkClusterTest.pySparkAutoConvertOptionTest:152
expected:<FINISHED> but was:<ERROR>
ZeppelinSparkClusterTest.pySparkTest:127 expected:<FINISHED> but
was:<ERROR>
Tests run: 64, Failures: 3, Errors: 0, Skipped: 0
```
If we go up the log to the origin of the failure, mahout is mentioned as
well, which is a sign that this failure is related to the PR.
And then the exception happens:
```
Running org.apache.zeppelin.rest.ZeppelinSparkClusterTest
01:32:21,442 INFO org.apache.zeppelin.rest.AbstractTestRestApi:116 - Test
Zeppelin stared.
SPARK HOME detected
/home/travis/build/apache/zeppelin/spark-1.6.1-bin-hadoop2.3
.....
01:32:50,555 INFO org.apache.zeppelin.notebook.Paragraph:252 - run
paragraph 20160701-013250_988735914 using dep
org.apache.zeppelin.interpreter.LazyOpenInterpreter@1c84bbdb
01:32:50,556 INFO
org.apache.zeppelin.interpreter.remote.RemoteInterpreterProcess:143 - Run
interpreter process [..//bin/interpreter.sh, -d, ../interpreter/mahout, -p,
35986, -l, ..//local-repo/2BRE4D28P]
..//bin/interpreter.sh: line 159:
/home/travis/build/apache/zeppelin/run/zeppelin-interpreter-mahout-travis-testing-worker-linux-docker-726dc48a-3376-linux-16.pid:
No such file or directory
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/home/travis/build/apache/zeppelin/interpreter/mahout/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/home/travis/build/apache/zeppelin/interpreter/mahout/zeppelin-spark-dependencies-0.6.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/home/travis/build/apache/zeppelin/interpreter/mahout/zeppelin-spark-0.6.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/home/travis/build/apache/zeppelin/zeppelin-interpreter/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
01:32:51,068 INFO
org.apache.zeppelin.interpreter.remote.RemoteInterpreter:170 - Create remote
interpreter org.apache.zeppelin.spark.DepInterpreter
01:32:51,110 ERROR org.apache.zeppelin.scheduler.RemoteScheduler:276 -
Can't get status information
org.apache.thrift.transport.TTransportException
at
org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at
org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
at
org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
at
org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
at
org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Client.recv_getStatus(RemoteInterpreterService.java:389)
at
org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Client.getStatus(RemoteInterpreterService.java:375)
at
org.apache.zeppelin.scheduler.RemoteScheduler$JobStatusPoller.getStatus(RemoteScheduler.java:262)
at
org.apache.zeppelin.scheduler.RemoteScheduler$JobStatusPoller.run(RemoteScheduler.java:211)
```
The most suspicious part here is the missing `.pid` file right as the
interpreter process starts (the later `TTransportException` most likely just
means the remote interpreter process is not reachable):
```
01:32:50,554 INFO org.apache.zeppelin.interpreter.InterpreterFactory:608 -
Interpreter org.apache.zeppelin.spark.DepInterpreter 478460891 created
01:32:50,555 INFO org.apache.zeppelin.notebook.Paragraph:252 - run
paragraph 20160701-013250_988735914 using dep
org.apache.zeppelin.interpreter.LazyOpenInterpreter@1c84bbdb
01:32:50,556 INFO
org.apache.zeppelin.interpreter.remote.RemoteInterpreterProcess:143 - Run
interpreter process [..//bin/interpreter.sh, -d, ../interpreter/mahout, -p,
35986, -l, ..//local-repo/2BRE4D28P]
..//bin/interpreter.sh: line 159:
/home/travis/build/apache/zeppelin/run/zeppelin-interpreter-mahout-travis-testing-worker-linux-docker-726dc48a-3376-linux-16.pid:
No such file or directory
...
01:32:51,068 INFO
org.apache.zeppelin.interpreter.remote.RemoteInterpreter:170 - Create remote
interpreter org.apache.zeppelin.spark.DepInterpreter
```
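The `No such file or directory` error for the `.pid` file is what a shell
redirection produces when the parent directory of the target file does not
exist. A minimal illustration of that failure mode (paths are hypothetical,
this is not Zeppelin's actual `interpreter.sh`):

```shell
# Hypothetical demo directory - only illustrates the failure mode.
rm -rf /tmp/zeppelin-pid-demo

# Redirecting into a file whose parent directory is missing fails with
# "No such file or directory", like interpreter.sh line 159 in the log:
echo $$ > /tmp/zeppelin-pid-demo/run/app.pid 2>/dev/null || echo "pid write failed"

# Creating the directory first avoids the error:
mkdir -p /tmp/zeppelin-pid-demo/run
echo $$ > /tmp/zeppelin-pid-demo/run/app.pid
cat /tmp/zeppelin-pid-demo/run/app.pid
```

So it may be worth checking whether the `run/` directory exists (or is created) before the interpreter process writes its pid file.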
Could you confirm that scenarios similar to the one in
[ZeppelinSparkClusterTest.pySparkDepLoaderTest](https://github.com/apache/zeppelin/blob/master/zeppelin-server/src/test/java/org/apache/zeppelin/rest/ZeppelinSparkClusterTest.java#L218)
work if you run them manually in a new notebook on your local machine?
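Besides the notebook check, re-running just this suite locally should make the failure faster to iterate on. A sketch, assuming a local clone of apache/zeppelin with Spark set up as in CI (the module and test names are taken from the log; the standard Maven Surefire `-Dtest` property is used):

```
# from the root of your zeppelin clone, after a full build:
mvn test -pl zeppelin-server -Dtest=ZeppelinSparkClusterTest
```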