You could consider using Zeppelin - 
https://zeppelin.incubator.apache.org/docs/latest/interpreter/spark.html 

https://zeppelin.incubator.apache.org/

ZeppelinContext
Zeppelin automatically injects ZeppelinContext as the variable 'z' in your
Scala/Python environment. ZeppelinContext provides additional functions
and utilities.



Object exchange

ZeppelinContext extends Map and is shared between the Scala and Python
environments, so you can put an object from Scala and read it from Python,
and vice versa.

Put object from Scala:

%spark
val myObject = ...
z.put("objName", myObject)

Get object from Python:

%python
myObject = z.get("objName")
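
As a minimal sketch of a round trip (the paragraph contents below are made
up, and assume a notebook with both the Spark and Python interpreters bound):

%spark
// Scala paragraph: compute a value and share it via ZeppelinContext
val total = sc.parallelize(1 to 100).sum()  // 5050.0
z.put("total", total)

%python
# Python paragraph: read the shared value back
total = z.get("total")
print(total)  # 5050.0

Simple values and serializable objects transfer cleanly this way; for
DataFrames, one common pattern is to register a temp table on the Scala side
and query it from Python, since both paragraphs share the same SparkContext
and SQLContext.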


> On Feb 15, 2016, at 12:10 PM, Leonid Blokhin <lblok...@provectus.com> wrote:
> 
> Hello
> 
>  I want to work with a single Spark context from Python and Scala. Is it
> possible?
> 
> Is it possible to do this between an already-started ./bin/pyspark and
> ./bin/spark-shell, as a dramatic example?
> 
> 
> 
> Cheers,
> 
> Leonid
> 
