For some more context, I started the Zeppelin daemon using:
$ZEPPELIN_HOME/bin/zeppelin-daemon.sh --config $HOME/zeppelin_conf start

I can run Spark code in the notebook, but the dependency loader throws
these errors.
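
For reference, the dynamic dependency loading example in the Zeppelin docs has roughly this shape (a minimal sketch; the z.reset() call comes from the docs' example, not from my notebook):

    %dep
    // clean up any previously added artifacts and repositories
    z.reset()
    // load an artifact by its group:artifact:version coordinates
    z.load("com.databricks:spark-csv_2.10:1.0.3")

If I read the docs right, the %dep paragraph has to run before the Spark interpreter has started, so an interpreter restart may be needed for the load to take effect.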

On Wed, Aug 19, 2015 at 2:18 PM, Udit Mehta <ume...@groupon.com> wrote:

> Hi,
>
> I am trying to load a spark dependency using the following in my zeppelin
> notebook:
>
> %dep
> z.load("com.databricks:spark-csv_2.10:1.0.3")
>
> Doing this I get the following error:
>
> ERROR [2015-08-19 21:11:52,511] ({pool-2-thread-2} Job.java[run]:183) -
> Job failed
> org.apache.zeppelin.interpreter.InterpreterException:
> java.lang.NullPointerException
>         at
> org.apache.zeppelin.interpreter.ClassloaderInterpreter.interpret(ClassloaderInterpreter.java:60)
>         at
> org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
>         at
> org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:277)
>         at org.apache.zeppelin.scheduler.Job.run(Job.java:170)
>         at
> org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:118)
>         at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>         at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
>         at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
>         at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.NullPointerException
>         at
> org.sonatype.aether.impl.internal.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:352)
>         at
> org.apache.zeppelin.spark.dep.DependencyContext.fetchArtifactWithDep(DependencyContext.java:141)
>         at
> org.apache.zeppelin.spark.dep.DependencyContext.fetch(DependencyContext.java:98)
>         at
> org.apache.zeppelin.spark.DepInterpreter.interpret(DepInterpreter.java:189)
>         at
> org.apache.zeppelin.interpreter.ClassloaderInterpreter.interpret(ClassloaderInterpreter.java:57)
>         ... 11 more
>
> What am I doing wrong? Do I need to do anything before loading the deps?
>
> Udit
>
