Re: Building spark submodule source code

2016-03-21 Thread Jakob Odersky
Another gotcha to watch out for is the SPARK_* environment variables. Have you exported SPARK_HOME? If so, 'spark-shell' will use the Spark installation the variable points to, regardless of where the script is called from. I.e. if SPARK_HOME points to a release version of Spark, your code changes will ...
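
A minimal sketch of the check described above, assuming spark-shell is launched from the root of a Spark source checkout:

    # Verify that a stale SPARK_HOME isn't shadowing the freshly built Spark.
    echo "$SPARK_HOME"   # if this prints a release install, that's what spark-shell will run
    unset SPARK_HOME     # clear it so the script resolves Spark relative to itself
    ./bin/spark-shell    # run from the root of the source checkout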

Re: Building spark submodule source code

2016-03-21 Thread Akhil Das
Have a look at the IntelliJ setup: https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IntelliJ Once you have the setup ready, you don't have to recompile everything every time. Thanks, Best Regards. On Mon, Mar 21, 2016 at 8:14 AM, Tenghuan He ...
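
For quick iteration outside IntelliJ, the sbt launcher that ships with the Spark source tree can also recompile a single module incrementally. A sketch, assuming the build/sbt wrapper and the 'core' project name from the 1.x build:

    # Recompile only spark-core, re-running the compile on every source change.
    ./build/sbt ~core/compile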

Re: Building spark submodule source code

2016-03-20 Thread Ted Yu
To speed up the build process, take a look at install_zinc() in build/mvn, around line 83, and the following around line 137:

    # Now that zinc is ensured to be installed, check its status and, if its
    # not running or just installed, start it

FYI. On Sun, Mar 20, 2016 at 7:44 PM, Tenghuan He ...
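
If you want to inspect the zinc server that build/mvn manages, something like the following should work; the installed version, and hence the path under build/, is an assumption:

    # build/mvn installs zinc under build/ and starts it when it isn't running.
    ./build/zinc-0.3.9/bin/zinc -status    # is the incremental-compile server up?
    ./build/zinc-0.3.9/bin/zinc -shutdown  # stop it, e.g. before switching JDKs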

Building spark submodule source code

2016-03-20 Thread Tenghuan He
Hi everyone, I am trying to add a new method to Spark's RDD. After changing the code of RDD.scala and running the following command

    mvn -pl :spark-core_2.10 -DskipTests clean install

the build succeeds (BUILD SUCCESS); however, when starting bin/spark-shell, my method cannot be found. Do I have ...
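
A likely explanation, sketched below: in the Spark 1.x layout, bin/spark-shell runs against the assembly jar, so rebuilding spark-core alone does not refresh the classes the shell loads. The assembly module name and rebuild sequence here are assumptions for a 1.x source tree:

    # Rebuild core, then the assembly that bin/spark-shell actually puts on the classpath.
    ./build/mvn -pl :spark-core_2.10 -DskipTests clean install
    ./build/mvn -pl :spark-assembly_2.10 -DskipTests package   # assembly module name assumed
    ./bin/spark-shell                                          # should now see the new method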