We thought about this but elected not to, for a couple of reasons.

1. Some people build on machines that, for security reasons, do not
have internet access and instead retrieve dependencies from internal
Nexus repositories. So a build step that relies on downloading from
the internet is not desirable.

2. It's hard to ensure the stability of a particular URL in
perpetuity; this is why Maven Central and other mirror networks exist.
Keep in mind that we can never change the code of a release once it's
out, so if anything changed about that URL it would break the build.

- Patrick

On Sat, Jan 4, 2014 at 9:34 AM, Andrew Ash <and...@andrewash.com> wrote:
> +1 on bundling a script similar to that one
>
>
> On Sat, Jan 4, 2014 at 4:48 AM, Holden Karau <hol...@pigscanfly.ca> wrote:
>
>> Could we ship a shell script that downloads the sbt jar if it isn't
>> already present (for example, https://github.com/holdenk/slashem/blob/master/sbt )?
>>
>>
>> On Sat, Jan 4, 2014 at 12:02 AM, Patrick Wendell <pwend...@gmail.com>
>> wrote:
>>
>> > Hey All,
>> >
>> > Due to an ASF requirement, we recently merged a patch that removes
>> > the sbt jar from the build. This is necessary because we aren't
>> > allowed to distribute binary artifacts with our source packages.
>> >
>> > This means that instead of building Spark with "sbt/sbt XXX", you'll
>> > need to have sbt installed yourself and run "sbt XXX" from within the
>> > Spark directory. This is similar to the Maven build, where we expect
>> > users to already have Maven installed.
>> >
>> > You can download sbt from http://www.scala-sbt.org/. It's fine to
>> > just download the most recent version, since sbt knows how to fetch
>> > other versions of itself and will always use the one we specify in
>> > our build file to compile Spark.
>> >
>> > - Patrick
>> >
>>
>>
>>
>> --
>> Cell : 425-233-8271
>>
