Put these lines in your ~/.bash_profile:

export SPARK_PREPEND_CLASSES=true
export SPARK_HOME=path_to_spark
export PYTHONPATH="${SPARK_HOME}/python/lib/py4j-0.8.2.1-src.zip:${SPARK_HOME}/python:${PYTHONPATH}"
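
(The py4j zip name above matches a particular checkout; if your tree ships a different py4j version, use whatever is in ${SPARK_HOME}/python/lib. And of course replace path_to_spark with the path to your Spark checkout.)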

$ source ~/.bash_profile
$ build/sbt assembly
$ build/sbt ~compile  # leave this running; it recompiles on every source change
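
The key is SPARK_PREPEND_CLASSES: with it set, the launch scripts put the freshly compiled classes from the sbt target directories ahead of the assembly jar on the classpath, so once ~compile has picked up a Scala change you can re-run your Python test immediately, with no new assembly.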

Then, in another terminal, you can run the Python tests:
$ cd python/pyspark/
$ python rdd.py
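
Running rdd.py directly executes that module's doctests (python/run-tests runs the whole suite). For a quicker check that a Scala-side change is actually visible from Python, a tiny throwaway script works too; this is just a sketch, the file name and job are illustrative:

    # quick_check.py -- run any small job that exercises the code path you changed
    from pyspark import SparkContext

    sc = SparkContext("local", "quick-check")
    print(sc.parallelize(list(range(10))).sum())  # expect 45
    sc.stop()

$ python quick_check.py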


cc to dev list


On Fri, Mar 27, 2015 at 10:15 AM, Stephen Boesch <java...@gmail.com> wrote:
> Which aspect of that page are you suggesting provides a more optimized
> alternative?
>
> 2015-03-27 10:13 GMT-07:00 Davies Liu <dav...@databricks.com>:
>
>> see
>> https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools
>>
>> On Fri, Mar 27, 2015 at 10:02 AM, Stephen Boesch <java...@gmail.com>
>> wrote:
>> > I am iteratively making changes to the Scala side of some new PySpark
>> > code and re-testing from the python/pyspark side.
>> >
>> > Presently my only solution is to rebuild completely
>> >
>> >       sbt assembly
>> >
>> > after any Scala-side change - no matter how small.
>> >
>> > Any better / expedited way for PySpark to see small Scala-side updates?
>
>
