GitHub user bzz commented on the issue:
https://github.com/apache/zeppelin/pull/928
### CI failure debug approach
First thing to do in order to debug such issues:
- link all raw build logs
- extract and systematize the failure reasons from the logs (usually it's the last
'exited with 1' message + the first exception); a rough sketch of this step is shown
right after this list
- cross-check them with list of [flaky-tests in
JIRA](https://issues.apache.org/jira/browse/ZEPPELIN-862?jql=project%20%3D%20ZEPPELIN%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20labels%20%3D%20flaky-test)
Usually, after all this is done and posted, it is much easier for other
people to jump in and help in drive-by reviews.
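A minimal Python sketch of the log-extraction step above, assuming the raw-log
URL pattern used in this thread and two ad-hoc markers ('exited with' and
'Exception'); the job ids and markers are just examples, not an established tool:
```python
#!/usr/bin/env python
"""Crude helper for the 'extract failure reasons' step: print the last
'exited with' line and the first exception from each raw Travis log."""
import re
import urllib.request

# Example job ids taken from the links in this comment
JOB_IDS = ["141222530", "141222531", "141222532", "141222533"]


def summarize(job_id):
    url = "https://api.travis-ci.org/jobs/%s/log.txt?deansi=true" % job_id
    log = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    lines = log.splitlines()

    # the last 'exited with ...' message is usually closest to the real cause
    exits = [l for l in lines if "exited with" in l]
    # the first exception, printed with a few lines of its stack trace
    first_exc = next((i for i, l in enumerate(lines)
                      if re.search(r"(Exception|Error)( in thread|:)", l)), None)

    print("=== job %s ===" % job_id)
    if exits:
        print("last exit message:", exits[-1].strip())
    if first_exc is not None:
        print("\n".join(lines[first_exc:first_exc + 5]))


if __name__ == "__main__":
    for job in JOB_IDS:
        summarize(job)
```
Posting the output of something like this next to the raw-log links usually
gives reviewers enough to start with.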
### CI failures in this PR
1. `ZeppelinSparkClusterTest` failing in Zeppelin Server
Profiles #
[1](https://api.travis-ci.org/jobs/141222530/log.txt?deansi=true),
[2](https://api.travis-ci.org/jobs/141222531/log.txt?deansi=true),
[3](https://api.travis-ci.org/jobs/141222532/log.txt?deansi=true):
```
Failed tests:
ZeppelinSparkClusterTest.pySparkDepLoaderTest:231 expected:<FINISHED> but
was:<ERROR>
ZeppelinSparkClusterTest.pySparkAutoConvertOptionTest:152
expected:<FINISHED> but was:<ERROR>
ZeppelinSparkClusterTest.pySparkTest:127 expected:<FINISHED> but
was:<ERROR>
Tests run: 63, Failures: 3, Errors: 0, Skipped: 0
```
Profile [4](https://api.travis-ci.org/jobs/141222533/log.txt?deansi=true)
has only a single test failure:
```
Results :
Failed tests:
ZeppelinSparkClusterTest.pySparkTest:127 expected:<FINISHED> but
was:<ERROR>
Tests run: 40, Failures: 1, Errors: 0, Skipped: 0
```
Profile [6](https://api.travis-ci.org/jobs/141222535/log.txt?deansi=true)
has even more test failures:
```
Running org.apache.zeppelin.rest.ZeppelinSparkClusterTest
....
5:06:37,836 INFO org.apache.zeppelin.notebook.Paragraph:252 - run
paragraph 20160630-050637_1289828172 using spark
org.apache.zeppelin.interpreter.LazyOpenInterpreter@62b97ff1
Exception in thread "pool-1-thread-3" java.lang.IllegalAccessError: tried
to access method
com.google.common.collect.MapMaker.softValues()Lcom/google/common/collect/MapMaker;
from class org.apache.spark.SparkEnv
at org.apache.spark.SparkEnv.<init>(SparkEnv.scala:75)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:272)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:204)
at
org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:338)
at
org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:122)
at
org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:513)
at
org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at
org.apache.zeppelin.interpreter.LazyOpenInterpreter.getProgress(LazyOpenInterpreter.java:110)
at
org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer.getProgress(RemoteInterpreterServer.java:404)
at
org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:1509)
at
org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:1494)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at
org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
.....
Failed tests:
ZeppelinSparkClusterTest.pySparkDepLoaderTest:195->getSparkVersionNumber:250
expected:<FINISHED> but was:<ERROR>
ZeppelinSparkClusterTest.pySparkAutoConvertOptionTest:139->getSparkVersionNumber:250
expected:<FINISHED> but was:<ERROR>
ZeppelinSparkClusterTest.basicRDDTransformationAndActionTest:81
expected:<FINISHED> but was:<ERROR>
ZeppelinSparkClusterTest.pySparkTest:115->getSparkVersionNumber:250
expected:<FINISHED> but was:<ERROR>
ZeppelinSparkClusterTest.zRunTest:180 expected:<FINISHED> but was:<ERROR>
ZeppelinSparkClusterTest.sparkRTest:90->getSparkVersionNumber:250
expected:<FINISHED> but was:<ERROR>
```
1. `ZeppelinSparkClusterTest` hangs AKA
[ZEPPELIN-862](https://issues.apache.org/jira/browse/ZEPPELIN-862)
Profile [5](https://api.travis-ci.org/jobs/141222534/log.txt?deansi=true)
```
Running org.apache.zeppelin.rest.ZeppelinSparkClusterTest
...
No output has been received in the last 10 minutes, this potentially
indicates a stalled build or something wrong with the build itself.
The build has been terminated
```
This one is known to be flaky, so it's better to focus on the remaining issues
for now.
1. `SparkParagraphIT.testPySpark` failure - never seen before; it might
deserve a separate JIRA issue with the `flaky-test` label.
Profile [7](https://api.travis-ci.org/jobs/141222536/log.txt?deansi=true)
fails in a somewhat new way:
```
Results :
Failed tests:
SparkParagraphIT.testPySpark:132 Paragraph from SparkParagraphIT of
testPySpark status:
Expected: "FINISHED"
but: was "ERROR"
SparkParagraphIT.testPySpark:139 Paragraph from SparkParagraphIT of
testPySpark result:
Expected: "test loop 0\ntest loop 1\ntest loop 2"
but: was "Traceback (most recent call last):\n File
\"/tmp/zeppelin_pyspark-8754372370659284789.py\", line 20, in <module>\n
from py4j.java_gateway import java_import, JavaGateway,
GatewayClient\nImportError: No module named py4j.java_gateway\npyspark is not
responding Traceback (most recent call last):\n File
\"/tmp/zeppelin_pyspark-8754372370659284789.py\", line 20, in <module>\n
from py4j.java_gateway import java_import, JavaGateway,
GatewayClient\nImportError: No module named py4j.java_gateway"
Tests run: 16, Failures: 2, Errors: 0, Skipped: 0
```
Please feel free to add to this, in case I'm missing something here.
---------------------------
At first glance, `java.lang.IllegalAccessError: tried to access
method` means that a different version of some class is present at run time
than the one you expect/have at compile time.
This happens a lot when transitive dependencies bring in a new version of a
library you already have.
I would start by looking at the Guava dependency versions reported by `mvn
dependency:tree` to determine the "offender"; a rough sketch of such a check
is below.
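A minimal sketch of that check, assuming a standard Maven setup; the parsing
regex and the printed summary are illustrative assumptions, not an established
tool:
```python
#!/usr/bin/env python
"""Crude helper: list the Guava versions that `mvn dependency:tree` reports.
The regex assumes the usual groupId:artifactId:type:version:scope format of
the tree output."""
import re
import subprocess

# -Dincludes filters the tree down to Guava artifacts only
out = subprocess.run(
    ["mvn", "dependency:tree", "-Dincludes=com.google.guava"],
    capture_output=True, text=True, check=False,
).stdout

versions = set()
for line in out.splitlines():
    # e.g. "[INFO] +- com.google.guava:guava:jar:15.0:compile"
    match = re.search(r"com\.google\.guava:guava:jar:([\w.-]+)", line)
    if match:
        versions.add(match.group(1))
        print(line.strip())

print("distinct Guava versions on the classpath:", sorted(versions))
```
Whichever module pulls in an unexpected version is the first candidate for an
`<exclusion>` or an explicit version pin in the pom.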
Hope this helps @rawkintrevo !