Hi Piotr,

One possible approach without much effort, I think, is to let your application receive a SparkContext as an argument, and then instantiate your application inside a Zeppelin notebook with the SparkContext that Zeppelin provides.
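A minimal sketch of that first approach, with hypothetical names: the application takes its context from outside instead of creating one. `Object` stands in for `org.apache.spark.SparkContext` here only so the sketch compiles without Spark on the classpath; in a real app the field would be typed `SparkContext`.

```java
// Sketch (hypothetical class name DebugApp): the application does not
// call `new SparkContext(...)` itself; it accepts whatever context the
// caller hands it. Object is a stand-in for
// org.apache.spark.SparkContext so the pattern runs without Spark.
public class DebugApp {
    private final Object sc;

    public DebugApp(Object sparkContext) {
        this.sc = sparkContext;
    }

    // Real code would use sc to build RDDs/DataFrames, run jobs, etc.
    public String run() {
        return "DebugApp started with context: " + sc;
    }
}
```

In a Zeppelin notebook paragraph the Spark interpreter already binds the variable `sc` to its own SparkContext, so starting the app from the notebook becomes a one-liner along the lines of `val app = new DebugApp(sc); app.run()`.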
Alternatively, your application might be able to embed Zeppelin, with the following considerations:
 - include zeppelin-interpreter, zeppelin-zengine, zeppelin-server, zeppelin-web, spark, and spark-dependencies as dependencies
 - start ZeppelinServer, NotebookServer, and everything else necessary to run Zeppelin
 - create an interpreter setting programmatically so that the Spark interpreter runs in the same JVM
 - programmatically open (initialize) the interpreter and get the SparkContext from it
 - initialize your application with that SparkContext
 - whenever you need an interactive notebook, create a new notebook and use it

In both approaches, the key is sharing the same SparkContext instance between Zeppelin and your application.

Hope this helps.

Best,
moon

On Wed, Oct 7, 2015 at 5:08 PM Reszke Piotr <pres...@gmail.com> wrote:
> Hi,
>
> I'm trying to add a "debug" mode to my existing Spark application.
> If a certain condition occurs in my app running on Spark, I would like the
> app to wait for commands from a Zeppelin notebook.
> The Zeppelin notebook should work interactively at this stage (i.e. allow me
> to view the data frames generated so far, do calculations, etc.).
> Could I reuse some of the Zeppelin components - how big is the effort to
> achieve such functionality?
>
> Thank you
> Piotr
>