Github user smusevic commented on the pull request:

https://github.com/apache/incubator-zeppelin/pull/270#issuecomment-190727451

Hello,

I'm testing out `zeppelin-0.5.6-incubating-bin-all.tgz`. I might be wrong, but it seems to me that this change causes the following error whenever `conf/zeppelin-env.sh` contains `export SPARK_HOME=...`:

```
SPARK_CLASSPATH was detected (set to ':/etc/hbase/conf').
This is deprecated in Spark 1.0+.
Please instead use:
 - ./spark-submit with --driver-class-path to augment the driver classpath
 - spark.executor.extraClassPath to augment the executor classpath

16/03/01 08:11:50 WARN spark.SparkConf: Setting 'spark.executor.extraClassPath' to ':/etc/hbase/conf' as a work-around.
16/03/01 08:11:50 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:473)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:471)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:471)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:459)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:459)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:391)
	at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:339)
	at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:145)
	at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:465)
	at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74)
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:92)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:300)
	at org.apache.zeppelin.scheduler.Job.run(Job.java:169)
	at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:134)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
```

Removing the following text from the added line `138` in `bin/interpreter.sh`:

```
--driver-class-path "${ZEPPELIN_CLASSPATH_OVERRIDES}:${CLASSPATH}"
```

resolves the issue, as suggested by this [email](https://mail-archives.apache.org/mod_mbox/incubator-zeppelin-users/201509.mbox/%3CCADRmTZK5giPRhwjFqaQP_1PL08QWyRMTA=XhDg=z59nwpw3...@mail.gmail.com%3E), but then something else breaks:

```
| z
<console>:22: error: not found: value z
             z
             ^
```

which sadly blocks me from using `z.load("path/to/jar")`, which is what I really need to do.

Please note that I do not have access to change any of the files inside `SPARK_HOME`, including any `conf` files residing therein.

Is there a workaround for this? Am I doing something wrong?

Thanks in advance!
S.
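For completeness, the only workaround I could think of on my side is to steer the classpath from `conf/zeppelin-env.sh` instead of patching `bin/interpreter.sh`. This is just an untested sketch: it assumes Zeppelin passes `SPARK_SUBMIT_OPTIONS` through to `spark-submit` when `SPARK_HOME` is set, and that `/etc/hbase/conf` is the only thing I actually need on the classpath; it may still collide with the deprecated `SPARK_CLASSPATH` if that variable is exported from a `spark-env.sh` under `SPARK_HOME` that I cannot edit:

```shell
# conf/zeppelin-env.sh -- sketch of a possible workaround (untested)

# Path below is a placeholder for my actual Spark installation:
export SPARK_HOME=/path/to/spark

# Put the HBase conf dir on the driver/executor classpath explicitly,
# instead of relying on the deprecated SPARK_CLASSPATH mechanism:
export SPARK_SUBMIT_OPTIONS="--driver-class-path /etc/hbase/conf --conf spark.executor.extraClassPath=/etc/hbase/conf"
```

I have no idea whether this avoids the "Found both spark.driver.extraClassPath and SPARK_CLASSPATH" check, since `spark-submit` sources `spark-env.sh` after this file; if someone knows the ordering for 0.5.6, please correct me.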