Thanks Marcelo, "this is just how Maven works (unfortunately)" answers my
question.

Another related question: I tried to use `mvn scala:cc` and discovered that
it only seems to scan the src/main and src/test directories (according to its
docs <http://scala-tools.org/mvnsites/maven-scala-plugin/usage_cc.html>),
so it can only be run from within submodules, not from the root directory.
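For example (using core as the module just to illustrate; other submodules
should behave the same way):

    cd core
    mvn scala:cc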

I'll add a note about this to building-spark.html unless there is a way to
run it for all modules / from the root directory that I've missed. Let me
know!
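
For posterity, my rough understanding of the incremental workflow Marcelo
describes below, sketched with standard Maven flags (`-N`, `-pl`,
`-DskipTests`); I haven't verified these exact invocations:

    # only when spark-parent, network/common or network/shuffle change
    # in an incompatible way:
    mvn -N -DskipTests install
    mvn -pl network/common,network/shuffle -DskipTests install

    # otherwise, just recompile / retest core against the installed artifacts:
    mvn -pl core test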

On Tue Dec 02 2014 at 5:49:58 PM Marcelo Vanzin <van...@cloudera.com> wrote:

> On Tue, Dec 2, 2014 at 4:40 PM, Ryan Williams
> <ryan.blake.willi...@gmail.com> wrote:
> >> But you only need to compile the others once.
> >
> > once... every time I rebase off master, or am obliged to `mvn clean` by
> some
> > other build-correctness bug, as I said before. In my experience this
> works
> > out to a few times per week.
>
> No, you only need to do it if something upstream from core changed (i.e.,
> spark-parent, network/common or network/shuffle) in an incompatible
> way. Otherwise, you can rebase and just recompile / retest core,
> without having to install everything else. I do this kind of thing all
> the time. If you have to do "mvn clean" often, you're probably doing
> something wrong somewhere else.
>
> I understand where you're coming from, but the way you're thinking is
> just not how Maven works. I too find it annoying that Maven requires lots
> of things to be "installed" before you can use them, when they're all
> part of the same project. But well, that's the way things are.
>
> --
> Marcelo
>