Hi everyone,
in the process of learning Spark, I wanted to get an overview of the
interaction between all of its sub-projects. I therefore decided to have a
look at the build setup and its dependency management.
Since I am a lot more comfortable using sbt than Maven, I decided to try
porting the Maven configuration to sbt (with the help of automated tools).
This led me to a couple of observations and questions on the build system
design:

First, there are currently two build systems, Maven and sbt. Is there a
preferred tool (or a planned move toward one)?

Second, the sbt build also relies on Maven "profiles", requiring specific
command-line parameters when starting sbt. Furthermore, since it reads the
Maven POMs, dependencies on the Scala binary version (_2.xx) are hardcoded
and require running an external script when switching versions. sbt could
instead leverage its built-in cross-compilation support and emulate
profiles with configurations and additional build targets. This would
remove external state from the build (no extra steps would need to be
performed in a particular order to generate artifacts for a new
configuration) and therefore improve stability and build reproducibility
(and perhaps even build performance). Would implementing such
functionality for the sbt build be welcome?
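
To make the idea concrete, here is a rough sketch of what I mean, as an
sbt build definition. This is only an illustration, not a proposal for the
actual Spark build; the version numbers and the "hadoop.version" property
name are placeholders I made up for the example:

```scala
// build.sbt (sketch)

// Cross-compilation: sbt appends the Scala binary suffix (_2.xx)
// automatically, so nothing is hardcoded and no external script is needed.
crossScalaVersions := Seq("2.10.4", "2.11.7")

// The %% operator resolves the artifact name against the current
// Scala binary version, e.g. scalatest_2.10 or scalatest_2.11.
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"

// A Maven-profile-like switch, expressed as an ordinary setting instead
// of external state; "hadoop.version" is a hypothetical property here.
val hadoopVersion = sys.props.getOrElse("hadoop.version", "2.2.0")
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % hadoopVersion
```

With a setup like this, `sbt +package` would build artifacts for every
listed Scala version in one invocation, with no intermediate
version-switching step.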

thanks,
--Jakob
