A while back I created a JIRA for a script similar to "sbt/sbt" that
transparently downloads Zinc, Scala, and Maven into a subdirectory of
Spark and sets them up correctly, i.e. "build/mvn".

Outside of Homebrew on OS X there aren't good Zinc packages, and it's
a pain to figure out how to set it up.

https://issues.apache.org/jira/browse/SPARK-4501
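For anyone picking this up, a rough sketch of the shape such a "build/mvn"
wrapper could take is below. It only echoes the steps rather than running
them, and the version numbers and the layout under build/ are assumptions
for illustration, not settled choices:

```shell
# Hypothetical sketch of a "build/mvn" bootstrap wrapper.
# ZINC_VERSION, SCALA_VERSION, and the build/ layout are assumptions.
ZINC_VERSION="0.3.5.3"
SCALA_VERSION="2.10.4"
BUILD_DIR="build"
ZINC_HOME="${BUILD_DIR}/zinc-${ZINC_VERSION}"
SCALA_HOME="${BUILD_DIR}/scala-${SCALA_VERSION}"

# Fetch Zinc into build/ on first use only (echoed here, not run).
if [ ! -d "${ZINC_HOME}" ]; then
  echo "would fetch zinc ${ZINC_VERSION} into ${ZINC_HOME}"
fi

# Start the long-lived nailgun server, pointing it at the local Scala
# rather than Zinc's bundled 2.9.2, then delegate to Maven with
# whatever arguments the caller passed.
echo "would run: ${ZINC_HOME}/bin/zinc -scala-home ${SCALA_HOME} -nailed -start"
echo "would run: mvn $*"
```

A real version would replace the echoes with an actual download
(e.g. curl piped into tar) and end with exec'ing mvn, and it should
check whether the nailgun server is already running before starting
another one.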

Prashant Sharma looked at this for a bit but I don't think he's
working on it actively any more, so if someone wanted to do this, I'd
be extremely grateful.

- Patrick

On Fri, Dec 5, 2014 at 11:05 AM, Ryan Williams
<ryan.blake.willi...@gmail.com> wrote:
> fwiw I've been using `zinc -scala-home $SCALA_HOME -nailed -start` which:
>
> - starts a nailgun server as well,
> - uses my installed scala 2.{10,11}, as opposed to zinc's default 2.9.2
> <https://github.com/typesafehub/zinc#scala>: "If no options are passed to
> locate a version of Scala then Scala 2.9.2 is used by default (which is
> bundled with zinc)."
>
> The latter seems like it might be especially important.
>
>
> On Thu Dec 04 2014 at 4:25:32 PM Nicholas Chammas <
> nicholas.cham...@gmail.com> wrote:
>
>> Oh, derp. I just assumed from looking at all the options that there was
>> something to it. Thanks Sean.
>>
>> On Thu Dec 04 2014 at 7:47:33 AM Sean Owen <so...@cloudera.com> wrote:
>>
>> > You just run it once with "zinc -start" and leave it running as a
>> > background process on your build machine. You don't have to do
>> > anything for each build.
>> >
>> > On Wed, Dec 3, 2014 at 3:44 PM, Nicholas Chammas
>> > <nicholas.cham...@gmail.com> wrote:
>> > > https://github.com/apache/spark/blob/master/docs/building-spark.md#speeding-up-compilation-with-zinc
>> > >
>> > > Could someone summarize how they invoke zinc as part of a regular
>> > > build-test-etc. cycle?
>> > >
>> > > I'll add it in to the aforelinked page if appropriate.
>> > >
>> > > Nick
>> >
>>
