How to call synchronous REST API to run notebook

2018-02-22 Thread danoomistmatiste
Hi,  I am having issues calling the REST API to run a notebook
synchronously.  It keeps returning a 500 error along with a long Java stack
trace.  I am trying to invoke it from the command line with curl and the
notebook ID.

Any examples of how to run this from the command line and from Java would be
much appreciated.  I am using Zeppelin v0.7.3.
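
A minimal sketch of both a curl call and a Java call, assuming Zeppelin is
listening on localhost:8080 and using the synchronous paragraph-run endpoint
POST /api/notebook/run/{noteId}/{paragraphId} from the 0.7 REST API. The note
and paragraph IDs below are placeholders; the note ID is the short identifier
shown in the notebook's URL, not the notebook's name.

    // Equivalent curl command (IDs are placeholders):
    //   curl -X POST http://localhost:8080/api/notebook/run/<noteId>/<paragraphId>
    import java.io.BufferedReader;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RunParagraphSync {
        public static void main(String[] args) throws Exception {
            String noteId = "2C3XYZABC";                       // placeholder note ID
            String paragraphId = "20180222-120000_1234567890"; // placeholder paragraph ID
            URL url = new URL("http://localhost:8080/api/notebook/run/"
                    + noteId + "/" + paragraphId);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");       // the endpoint takes an empty POST body
            int status = conn.getResponseCode(); // blocks until the paragraph run returns
            System.out.println("HTTP " + status);
            InputStream body = status < 400 ? conn.getInputStream() : conn.getErrorStream();
            if (body != null) {
                try (BufferedReader in = new BufferedReader(new InputStreamReader(body))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        System.out.println(line); // JSON result (or error detail) of the run
                    }
                }
            }
        }
    }

There is also a whole-note endpoint, POST /api/notebook/job/{noteId}, but
whether it blocks until every paragraph has finished varies by version, so
running paragraphs one at a time through the endpoint above is the safer
route when a synchronous result is needed.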


Jar dependencies are not reloaded when Spark interpreter is restarted?

2018-02-22 Thread Partridge, Lucas (GE Aviation)
I only change the content of the jar, not the name or version of the jar 
(otherwise I’d have to re-add it as a dependency anyway).  Or do you mean 
something else by ’version’?

This dependency is a local file. Zeppelin and Spark are both running on the same 
machine. So I’m just specifying the file system path of the jar; it’s not even 
prefixed with file:///.

From: Jhon Anderson Cardenas Diaz [mailto:jhonderson2...@gmail.com]
Sent: 22 February 2018 12:18
To: users@zeppelin.apache.org
Subject: EXT: Re: Jar dependencies are not reloaded when Spark interpreter is 
restarted?

When you say you change the dependency, do you mean only its content, or the 
content and the version? I think the dependency should be reloaded only if its 
version changes.

I do not think it's optimal to re-download the dependencies every time the 
interpreter reboots.

On 22 Feb 2018 05:22, "Partridge, Lucas (GE Aviation)" wrote:
I’m using Zeppelin 0.7.3 against a local standalone Spark ‘cluster’. I’ve added 
a Scala jar dependency to my Spark interpreter using Zeppelin’s UI. I thought 
if I changed my Scala code and updated the jar (using sbt outside of Zeppelin) 
then all I’d have to do is restart the interpreter for the new code to be 
picked up in Zeppelin in a regular Scala paragraph.  However, restarting the 
interpreter appears to have no effect – the new code is not detected. Is that 
expected behaviour or a bug?

The workaround I’m using at the moment is to edit the spark interpreter, remove 
the jar, re-add it, save the changes and then restart the interpreter. Clumsy 
but that’s better than restarting Zeppelin altogether.

Also, if anyone knows of a better way to reload code without restarting the 
interpreter then I’m open to suggestions :). Having to re-run lots of paragraphs 
after a restart is pretty tedious.

Thanks, Lucas.
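
One alternative worth trying, instead of the remove/re-add workaround above:
Zeppelin's dynamic dependency loader can add a local jar when the interpreter
starts. A minimal sketch, with a hypothetical jar path; the %dep paragraph has
to be the first thing run after each interpreter restart, before any Spark
code:

    %dep
    z.reset()                             // clear previously loaded artifacts
    z.load("/path/to/my-scala-code.jar")  // local path to the rebuilt jar

Whether z.load() re-reads a changed local jar more reliably than the UI
dependency list does is not verified here, but it at least keeps the reload
step inside the notebook rather than in the interpreter settings.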



Re: Jar dependencies are not reloaded when Spark interpreter is restarted?

2018-02-22 Thread Jhon Anderson Cardenas Diaz
When you say you change the dependency, do you mean only its content, or
the content and the version? I think the dependency should be reloaded only
if its version changes.

I do not think it's optimal to re-download the dependencies every time the
interpreter reboots.

On 22 Feb 2018 05:22, "Partridge, Lucas (GE Aviation)" <lucas.partri...@ge.com> wrote:

> I’m using Zeppelin 0.7.3 against a local standalone Spark ‘cluster’. I’ve
> added a Scala jar dependency to my Spark interpreter using Zeppelin’s UI. I
> thought if I changed my Scala code and updated the jar (using sbt outside
> of Zeppelin) then all I’d have to do is restart the interpreter for the new
> code to be picked up in Zeppelin in a regular Scala paragraph.  However,
> restarting the interpreter appears to have no effect – the new code is not
> detected. Is that expected behaviour or a bug?
>
> The workaround I’m using at the moment is to edit the spark interpreter,
> remove the jar, re-add it, save the changes and then restart the
> interpreter. Clumsy but that’s better than restarting Zeppelin altogether.
>
> Also, if anyone knows of a better way to reload code without restarting
> the interpreter then I’m open to suggestions :). Having to re-run lots of
> paragraphs after a restart is pretty tedious.
>
> Thanks, Lucas.


Jar dependencies are not reloaded when Spark interpreter is restarted?

2018-02-22 Thread Partridge, Lucas (GE Aviation)
I'm using Zeppelin 0.7.3 against a local standalone Spark 'cluster'. I've added 
a Scala jar dependency to my Spark interpreter using Zeppelin's UI. I thought 
if I changed my Scala code and updated the jar (using sbt outside of Zeppelin) 
then all I'd have to do is restart the interpreter for the new code to be 
picked up in Zeppelin in a regular Scala paragraph.  However, restarting the 
interpreter appears to have no effect - the new code is not detected. Is that 
expected behaviour or a bug?

The workaround I'm using at the moment is to edit the spark interpreter, remove 
the jar, re-add it, save the changes and then restart the interpreter. Clumsy 
but that's better than restarting Zeppelin altogether.

Also, if anyone knows of a better way to reload code without restarting the 
interpreter then I'm open to suggestions :). Having to re-run lots of paragraphs 
after a restart is pretty tedious.

Thanks, Lucas.