Github user GezimSejdiu commented on the issue:

    https://github.com/apache/zeppelin/pull/3253
  
    Hi @HyukjinKwon,
    any news on this PR? Does it already support the Spark 2.4.0 (Scala 2.11.x) interpreter? I just created a new branch that builds a [customized Zeppelin Docker image](https://github.com/big-data-europe/docker-zeppelin/tree/0.0.1-zeppelin-0.8.0-hadoop-2.8.0-spark-2.4.0) based on the [BDE Spark Docker image](https://github.com/big-data-europe/docker-spark), and while testing it on [SANSA](https://github.com/SANSA-Stack) via [SANSA-Notebooks](https://github.com/SANSA-Stack/SANSA-Notebooks) I found that it does not work with Spark 2.4.0 :( (see the stack trace below):
    ```shell
    java.lang.NoSuchMethodException: scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$loopPostInit()
        at java.lang.Class.getMethod(Class.java:1786)
        at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.callMethod(BaseSparkScalaInterpreter.scala:268)
        at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.callMethod(BaseSparkScalaInterpreter.scala:262)
        at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:84)
        at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:102)
        at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:62)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:617)
        at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
        at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:748)
    ```
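    
    For context, the exception comes from a reflective lookup: the interpreter calls `ILoop`'s qualified-private `loopPostInit()` via `Class.getMethod`, and the Scala 2.11.12 REPL shipped with Spark 2.4.0 apparently no longer exposes that method. Below is a minimal sketch of that failing lookup (my own probe, not Zeppelin's code; the object name is made up, and it only assumes `scala-compiler` is on the classpath):
    
    ```scala
    import scala.tools.nsc.interpreter.ILoop
    
    // Hypothetical probe illustrating the reflective lookup that
    // BaseSparkScalaInterpreter.callMethod performs on the REPL.
    object LoopPostInitProbe {
      def main(args: Array[String]): Unit = {
        // Mangled name of the qualified-private ILoop.loopPostInit() method,
        // exactly as it appears in the stack trace above.
        val name = "scala$tools$nsc$interpreter$ILoop$$loopPostInit"
        try {
          val m = classOf[ILoop].getMethod(name)
          println(s"Found $m -- this Scala REPL still exposes loopPostInit()")
        } catch {
          case _: NoSuchMethodException =>
            // Same failure mode as the stack trace when running against
            // the Scala 2.11.12 REPL bundled with Spark 2.4.0.
            println(s"$name is missing from this scala-compiler version")
        }
      }
    }
    ```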
    
    Is there a plan to support Spark 2.4.0 in the upcoming release, or do we have to downgrade and use Spark 2.3.x instead?
    
    Looking forward to hearing from you.
    
    Best regards,

