I've cloned the GitHub repo and I'm building Spark on a fairly beefy machine
(24 CPUs, 78 GB of RAM), and it takes a pretty long time.
For instance, today I did a 'git pull' for the first time in a week or two, and
then running 'sbt/sbt assembly' took 43 minutes of wallclock time (88 minutes of
CPU time). Is that typical?
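For reference, the wallclock vs. CPU split can be measured with the shell's `time` builtin; a minimal sketch (assuming you run it from the Spark source tree, where the bundled `sbt/sbt` launcher lives):

```shell
# Compare wallclock and CPU time for the assembly build:
time sbt/sbt assembly
# 'real' is wallclock time; 'user' + 'sys' together are CPU time,
# which can exceed 'real' on a multi-core machine.
```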
-Ken
*From:* Josh Rosen [mailto:rosenvi...@gmail.com]
*Sent:* Friday, April 25, 2014 3:27 PM
*To:* user@spark.apache.org
*Subject:* Re: Build times for Spark
Did you configure SBT to use the extra memory?
On Fri, Apr 25, 2014 at 12:53 PM, Williams, Ken <ken.willi...@windlogics.com> wrote:
No, I haven't done any configuration for SBT. Is there somewhere you could
point me for how to do that?
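For what it's worth, the stock sbt launcher reads JVM flags from the `SBT_OPTS` environment variable, so one common way to raise the build's heap looks like the sketch below. The values are illustrative, `-XX:MaxPermSize` only applies on Java 7 and earlier, and note that Spark's bundled `sbt/sbt` script may hardcode its own `-Xmx`, in which case you would edit the script instead:

```shell
# Give the SBT JVM more memory before building (values are illustrative):
export SBT_OPTS="-Xmx4g -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=256m"
sbt/sbt assembly
```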
-Ken
-server-2.2.0.jar
-Ken
*From:* Shivaram Venkataraman [mailto:shiva...@eecs.berkeley.edu]
*Sent:* Friday, April 25, 2014 4:31 PM
*To:* user@spark.apache.org
*Subject:* Re: Build times for Spark
Are you by any chance building this on NFS? As far as I know, the build is
severely