This isn't really answering the question, but for what it's worth: I
manage several different branches of Spark and regularly publish
custom-named versions to an internal repository, and this is *much* easier
with SBT than with Maven.  You can actually link the Spark SBT build into
an external SBT build and write commands that cross-publish as needed.

For your case, something as simple as
build/sbt 'set version in Global := "1.4.1-custom-string"' publish
might do the trick.  (Note the quoting: the Scala string inside the set
command needs double quotes, so the whole command is wrapped in single
quotes for the shell.)

On Tue, Aug 25, 2015 at 10:09 AM, Marcelo Vanzin <van...@cloudera.com>
wrote:

> On Tue, Aug 25, 2015 at 2:17 AM,  <andrew.row...@thomsonreuters.com>
> wrote:
> > Then, if I wanted to do a build against a specific profile, I could also
> > pass in a -Dspark.version=1.4.1-custom-string and have the output
> artifacts
> > correctly named. The default behaviour should be the same. Child pom
> files
> > would need to reference ${spark.version} in their parent section I think.
> >
> > Any objections to this?
>
> Have you tried it? My understanding is that no project does that
> because it doesn't work. To resolve properties you need to read the
> parent pom(s), and if there's a variable reference there, well, you
> can't do it. Chicken & egg.
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
>
>
