On Tue Dec 02 2014 at 4:46:20 PM Marcelo Vanzin <van...@cloudera.com> wrote:

> On Tue, Dec 2, 2014 at 3:39 PM, Ryan Williams
> <ryan.blake.willi...@gmail.com> wrote:
> > Marcelo: by my count, there are 19 maven modules in the codebase. I am
> > typically only concerned with "core" (and therefore its two dependencies
> > as well, `network/{shuffle,common}`).
>
> But you only need to compile the others once.


Once... meaning every time I rebase off of master, or am obliged to run
`mvn clean` by some other build-correctness bug, as I said before. In my
experience that works out to a few times per week.
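
For concreteness, the cycle I keep repeating is roughly this (a sketch;
`-DskipTests` skips test execution but still compiles and packages
everything):

  git rebase master                # or otherwise pick up upstream changes
  mvn clean install -DskipTests    # re-establish the baseline: all 19 modules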


> Once you've established
> the baseline, you can just compile / test "core" to your heart's
> desire.


I understand that this workflow gets me what I want, but only as a side
effect of doing 3-5x more work (depending on whether you count [number of
modules built] or [lines of Scala/Java compiled]), none of which is useful
to me (more on that below).


> Core tests won't even run until you build the assembly anyway,
> since some of them require the assembly to be present.


The tests you refer to are exactly the ones that I'd like to let Jenkins
run from here on out, per advice I was given elsewhere in this thread and
due to the myriad unpleasantries I've encountered in trying to run them
myself.
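
For the suites that don't need the assembly, running one at a time locally
is still workable with something like the following (a sketch:
`org.apache.spark.rdd.RDDSuite` is just an illustrative choice, and
`-DwildcardSuites`/`-Dtest=none` are the scalatest/surefire properties the
Spark poms appear to expose):

  # run a single scalatest suite in core, skipping the java tests
  mvn -pl core test -DwildcardSuites=org.apache.spark.rdd.RDDSuite -Dtest=none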


>
> Also, even if you work in core - I'd say especially if you work in
> core - you should still, at some point, compile and test everything
> else that depends on it.
>

My last response applies here as well.


>
> So, do this ONCE:
>

Again: s/ONCE/several times a week/, in my experience.


>
>   mvn install -DskipTests
>
> Then do this as many times as you want:
>
>   mvn -pl spark-core_2.10 something
>
> That doesn't seem too bad to me. (Be aware of the "assembly" comment
> above, since testing spark-core means you may have to rebuild the
> assembly from time to time, if your changes affect those tests.)
>
> > re: Marcelo's comment about "missing the 'spark-parent' project", I saw
> > that error message too and tried to ascertain what it could mean. Why would
> > `network/shuffle` need something from the parent project?
>
> The "spark-parent" project is the main pom that defines dependencies
> and their version, along with lots of build plugins and
> configurations. It's needed by all modules to compile correctly.
>

- I understand the parent POM has that information.

- I don't understand why Maven would feel that it is unable to compile the
`network/shuffle` module without having first compiled, packaged, and
installed 17 modules (19 minus `network/shuffle` and its dependency
`network/common`) that are not transitive dependencies of `network/shuffle`.

- I am trying to understand whether my failure to get Maven to compile
`network/shuffle` stems from my not knowing the correct incantation to feed
to Maven, or from Maven having a different (and seemingly worse) model for
how it handles module dependencies than I expected. (One candidate
incantation is sketched below.)
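
The closest thing to a correct incantation I have found is Maven's
`--also-make` flag, which builds only the selected module plus its upstream
reactor dependencies. A sketch, assuming it is run from the repo root:

  # build network/shuffle plus only what it depends on
  # (spark-parent and network/common), skipping the other 17 modules
  mvn -pl network/shuffle -am package -DskipTests

With `-am`, Maven resolves `spark-parent` and `network/common` from the
reactor itself rather than from previously-installed artifacts, which is
the behavior I expected `-pl` alone to have.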



>
> --
> Marcelo
>
