People who do upstream builds of Spark (think Bigtop and the Hadoop distros)
are used to legacy systems like Maven, so Maven is the default build. I
don't think that will change.

Any improvements for the sbt build are of course welcome (it is still used
by many developers), but I would not do anything that increases the burden
of maintaining two build systems.
On Nov 5, 2015 18:38, "Jakob Odersky" <joder...@gmail.com> wrote:

> Hi everyone,
> in the process of learning Spark, I wanted to get an overview of the
> interaction between all of its sub-projects. I therefore decided to have a
> look at the build setup and its dependency management.
> Since I am a lot more comfortable using sbt than Maven, I decided to try to
> port the Maven configuration to sbt (with the help of automated tools).
> This led me to a couple of observations and questions on the build system
> design:
>
> First, there are currently two build systems, Maven and sbt. Is there a
> preferred tool (or a planned direction toward one)?
>
> Second, the sbt build also uses Maven "profiles", requiring specific
> command-line parameters when starting sbt. Furthermore, since it
> relies on the Maven POMs, dependencies on the Scala binary version (_2.xx) are
> hardcoded and require running an external script when switching versions.
> sbt could leverage built-in constructs to support cross-compilation and
> emulate profiles with configurations and new build targets. This would
> remove external state from the build (in that no extra steps need to be
> performed in a particular order to generate artifacts for a new
> configuration) and therefore improve stability and build reproducibility
> (maybe even build performance). I was wondering if implementing such
> functionality for the sbt build would be welcome?
>
> thanks,
> --Jakob
>
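
For reference, the sbt-native cross-building described above might look
roughly like the following build.sbt fragment. This is a minimal sketch for a
single hypothetical module; the module name, versions, and dependency are
illustrative only and not taken from the actual Spark build.

    // Minimal sketch of sbt cross-building (assumed single module, not Spark's real build).
    name := "example-module"

    // Build the module against several Scala versions; prefixing a task with
    // "+" (e.g. "+package") runs it once per version listed here.
    crossScalaVersions := Seq("2.10.5", "2.11.7")
    scalaVersion := "2.11.7"

    // "%%" appends the Scala binary suffix (_2.10 / _2.11) to the artifact name
    // automatically, so no external script is needed to rewrite hardcoded _2.xx ids.
    libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"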
