Another gotcha to watch out for is the SPARK_* environment variables.
Have you exported SPARK_HOME? If so, 'spark-shell' will use the Spark
installation that the variable points to, regardless of where you call
the script from.
I.e. if SPARK_HOME points to a release version of Spark, your code
changes will never be picked up by simply running 'spark-shell'.
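For example (a rough sketch, assuming a bash-like shell, run from the
root of your Spark checkout; the module rebuild is the same command as
in your earlier mail):

    # See whether SPARK_HOME is set and where it points.
    echo "SPARK_HOME=${SPARK_HOME:-<not set>}"

    # Unset it (or point it at this checkout) so the dev scripts are used.
    unset SPARK_HOME

    # Rebuild the module you changed, then launch the shell from this checkout.
    mvn -pl :spark-core_2.10 -DskipTests clean install
    ./bin/spark-shell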

On Sun, Mar 20, 2016 at 11:23 PM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
> Have a look at the intellij setup
> https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IntelliJ
> Once you have the setup ready, you don't have to recompile the whole stuff
> every time.
>
> Thanks
> Best Regards
>
> On Mon, Mar 21, 2016 at 8:14 AM, Tenghuan He <tenghua...@gmail.com> wrote:
>>
>> Hi everyone,
>>
>>     I am trying to add a new method to spark RDD. After changing the code
>> of RDD.scala and running the following command
>>     mvn -pl :spark-core_2.10 -DskipTests clean install
>> it builds successfully (BUILD SUCCESS); however, when starting
>> bin\spark-shell, my method cannot be found.
>>     Do I have to rebuild the whole Spark project instead of just the
>> spark-core submodule to make the changes work?
>>     Rebuilding the whole project is too time consuming; is there a better
>> choice?
>>
>>
>> Thanks & Best Regards
>>
>> Tenghuan He
>>
>
