Hi, 

I'm working on a Spark package that involves both Scala and Python development,
and I'm trying to figure out the right dev setup. This will be a private
internal package, at least for now, so I'm not concerned about publishing.

I've been using the sbt-spark-package plugin on the Scala side, which works
great for managing the Scala Spark dependencies, but I'm not clear on how to
get something similar set up for Python. Ideally I'd like an easy way to get a
Python shell and run Python tests against Spark with the jars built from the
Scala side. We plan to implement some non-trivial code in Python, so it's
important that it's part of the dev/test loop.
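
To make this concrete, what I'm imagining on the Python side is roughly the
sketch below. Everything in it is illustrative: the assembly jar path and the
Scala class are just placeholders for whatever our own sbt build produces.

# Rough sketch of the Python test setup I have in mind.
# All paths and names below are placeholders, not our real ones.
import unittest

from pyspark import SparkConf, SparkContext


class OurPackageTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        conf = (
            SparkConf()
            .setMaster("local[2]")
            .setAppName("ourpackage-tests")
            # Ship the Scala-side code to the JVM; the jar would come
            # from running sbt assembly first.
            .set("spark.jars", "target/scala-2.10/ourpackage-assembly-0.1.0.jar")
        )
        cls.sc = SparkContext(conf=conf)

    @classmethod
    def tearDownClass(cls):
        cls.sc.stop()

    def test_scala_side_is_reachable(self):
        # Call into the Scala code through the Py4J gateway.
        result = self.sc._jvm.com.example.ourpackage.SomeScalaApi.version()
        self.assertIsNotNone(result)


if __name__ == "__main__":
    unittest.main()

The open question is where pyspark and py4j themselves should come from in
that picture.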

We started out having developers install Spark separately on their machines
(e.g. using Homebrew) and use pyspark from there, but it doesn't seem ideal
to have two different sets of Spark jars (one from the build.sbt dependencies
and one from the system install). Is there any way to get this to work with
the dependencies from sbt-spark-package?
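
(To illustrate the duplication, the kind of thing that loop requires is shown
below, again with placeholder paths: pyspark and py4j come from the
Homebrew-installed Spark on the PYTHONPATH, while the assembly jar comes from
the sbt build.)

import os

# The --jars argument has to be in place before pyspark launches its
# Py4J gateway, hence the environment variable.
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--jars target/scala-2.10/ourpackage-assembly-0.1.0.jar pyspark-shell"
)

from pyspark import SparkContext  # resolved from the system Spark install

sc = SparkContext("local[2]", "ourpackage-tests")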

Is there a better setup or workflow we should be using for multi-lingual
packages? 

Thanks,
Ben 



