https://github.com/apache/spark/pull/3239 addresses this

On Thu, Nov 13, 2014 at 10:05 AM, Marcelo Vanzin <van...@cloudera.com>
wrote:

> Hello there,
>
> So I just took a quick look at the pom and I see two problems with it.
>
> - "activatedByDefault" does not work like you think it does. It only
> "activates by default" if you do not explicitly activate other
> profiles. So if you do "mvn package", scala-2.10 will be activated;
> but if you do "mvn -Pyarn package", it will not.
>
> - you need to duplicate the "activation" block in every pom where the
> profile is declared, not just in the root pom. (I spent quite some
> time yesterday fighting a similar issue...)
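>
> For reference, here is a minimal sketch of the kind of declaration the
> first point is about (illustrative only, not the exact Spark pom):
>
>   <profile>
>     <id>scala-2.10</id>
>     <activation>
>       <activeByDefault>true</activeByDefault>
>     </activation>
>     <!-- scala 2.10 properties, dependencies, etc. -->
>   </profile>
>
> With a declaration like this, "mvn package" activates scala-2.10, but
> "mvn -Pyarn package" does not, because explicitly activating any other
> profile turns off every activeByDefault profile.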
>
> My suggestion here is to change the activation of scala-2.10 to look like
> this:
>
> <activation>
>   <property>
>     <name>!scala-2.11</name>
>   </property>
> </activation>
>
> And change the scala-2.11 profile to do this:
>
> <properties>
>   <scala-2.11>true</scala-2.11>
> </properties>
>
> I haven't tested, but in my experience this will activate the
> scala-2.10 profile by default, unless you explicitly activate the 2.11
> profile, in which case that property will be set and scala-2.10 will
> not activate. If you look at examples/pom.xml, that's the same
> strategy used to choose which hbase profile to activate.
>
> Ah, and just to reinforce, the activation logic needs to be copied to
> other places (e.g. examples/pom.xml, repl/pom.xml, and any other place
> that has scala-2.x profiles).
>
>
>
> On Wed, Nov 12, 2014 at 11:14 PM, Patrick Wendell <pwend...@gmail.com>
> wrote:
> > I actually do agree with this - let's see if we can find a solution
> > that doesn't regress this behavior. Maybe we can simply move the one
> > kafka example into its own project instead of having it in the
> > examples project.
> >
> > On Wed, Nov 12, 2014 at 11:07 PM, Sandy Ryza <sandy.r...@cloudera.com>
> > wrote:
> >> Currently there are no profiles required to build Spark, i.e.
> >> "mvn package" just works. It seems sad that we would need to break
> >> this.
> >>
> >> On Wed, Nov 12, 2014 at 10:59 PM, Patrick Wendell <pwend...@gmail.com>
> >> wrote:
> >>>
> >>> I think printing an error that says "-Pscala-2.10 must be enabled" is
> >>> probably okay. It's a slight regression but it's super obvious to
> >>> users. That could be a more elegant solution than the somewhat
> >>> complicated monstrosity I proposed on the JIRA.
> >>>
> >>> On Wed, Nov 12, 2014 at 10:37 PM, Prashant Sharma <scrapco...@gmail.com>
> >>> wrote:
> >>> > One thing we can do is print a helpful error and break. I am not
> >>> > sure exactly how to do this, but since we can now write Groovy
> >>> > inside the Maven build, we have more control. (Yay!!)
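> >>> >
> >>> > As one possible sketch of such a check (not what the build actually
> >>> > does, and no Groovy needed), the maven-enforcer-plugin's
> >>> > requireActiveProfile rule can fail the build early with a clear
> >>> > message if neither Scala profile is active:
> >>> >
> >>> >   <plugin>
> >>> >     <groupId>org.apache.maven.plugins</groupId>
> >>> >     <artifactId>maven-enforcer-plugin</artifactId>
> >>> >     <executions>
> >>> >       <execution>
> >>> >         <id>enforce-scala-profile</id>
> >>> >         <goals>
> >>> >           <goal>enforce</goal>
> >>> >         </goals>
> >>> >         <configuration>
> >>> >           <rules>
> >>> >             <requireActiveProfile>
> >>> >               <!-- at least one of these must be active -->
> >>> >               <profiles>scala-2.10,scala-2.11</profiles>
> >>> >               <all>false</all>
> >>> >               <message>Please build with -Pscala-2.10 or -Pscala-2.11</message>
> >>> >             </requireActiveProfile>
> >>> >           </rules>
> >>> >         </configuration>
> >>> >       </execution>
> >>> >     </executions>
> >>> >   </plugin>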
> >>> >
> >>> > Prashant Sharma
> >>> >
> >>> >
> >>> >
> >>> > On Thu, Nov 13, 2014 at 12:05 PM, Patrick Wendell <pwend...@gmail.com>
> >>> > wrote:
> >>> >>
> >>> >> Yeah, Sandy and I were chatting about this today and didn't realize
> >>> >> -Pscala-2.10 was mandatory. This is a fairly invasive change, so I
> >>> >> was thinking maybe we could try to remove that requirement. Also, if
> >>> >> someone doesn't give -Pscala-2.10, the build fails in a way that is
> >>> >> initially silent, which is bad because most people won't know to do
> >>> >> this.
> >>> >>
> >>> >> https://issues.apache.org/jira/browse/SPARK-4375
> >>> >>
> >>> >> On Wed, Nov 12, 2014 at 10:29 PM, Prashant Sharma
> >>> >> <scrapco...@gmail.com>
> >>> >> wrote:
> >>> >> > Thanks Patrick. I have one suggestion: we should make passing
> >>> >> > -Pscala-2.10 mandatory for maven users. I am sorry for not
> >>> >> > mentioning this before. There is no way around passing that option
> >>> >> > for maven users (only). However, this is unnecessary for sbt users
> >>> >> > because it is added automatically if -Pscala-2.11 is absent.
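> >>> >> >
> >>> >> > For illustration (indicative maven invocations only, using the
> >>> >> > profile names above):
> >>> >> >
> >>> >> >   mvn -Pscala-2.10 package    # build against Scala 2.10
> >>> >> >   mvn -Pscala-2.11 package    # build against Scala 2.11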
> >>> >> >
> >>> >> >
> >>> >> > Prashant Sharma
> >>> >> >
> >>> >> >
> >>> >> >
> >>> >> > On Wed, Nov 12, 2014 at 3:53 PM, Sean Owen <so...@cloudera.com>
> >>> >> > wrote:
> >>> >> >
> >>> >> >> - Tip: when you rebase, IntelliJ will temporarily think things like
> >>> >> >> the Kafka module are being removed. Say 'no' when it asks if you
> >>> >> >> want to remove them.
> >>> >> >> - Can we go straight to Scala 2.11.4?
> >>> >> >>
> >>> >> >> On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell
> >>> >> >> <pwend...@gmail.com>
> >>> >> >> wrote:
> >>> >> >>
> >>> >> >> > Hey All,
> >>> >> >> >
> >>> >> >> > I've just merged a patch that adds support for Scala 2.11, which
> >>> >> >> > will have some minor implications for the build. These are due to
> >>> >> >> > the complexities of supporting two versions of Scala in a single
> >>> >> >> > project.
> >>> >> >> >
> >>> >> >> > 1. The JDBC server will now require an additional flag to build,
> >>> >> >> > -Phive-thriftserver, on top of the existing flag -Phive. This is
> >>> >> >> > because some build permutations (only in Scala 2.11) won't support
> >>> >> >> > the JDBC server yet, due to transitive dependency conflicts.
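> >>> >> >> >
> >>> >> >> > For example, an illustrative invocation with the JDBC server
> >>> >> >> > enabled (combine with whatever other profiles you normally use):
> >>> >> >> >
> >>> >> >> >   mvn -Phive -Phive-thriftserver package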
> >>> >> >> >
> >>> >> >> > 2. The build now uses non-standard source layouts in a few
> >>> >> >> > additional places (we already did this for the Hive project) - the
> >>> >> >> > repl and the examples modules. This is just fine for maven/sbt, but
> >>> >> >> > it may affect users who import the build into IDEs that use these
> >>> >> >> > projects and want to build Spark from the IDE. I'm going to update
> >>> >> >> > our wiki to include full instructions for making this work well in
> >>> >> >> > IntelliJ.
> >>> >> >> >
> >>> >> >> > If there are any other build-related issues, please respond to
> >>> >> >> > this thread and we'll make sure they get sorted out. Thanks to
> >>> >> >> > Prashant Sharma, who is the author of this feature!
> >>> >> >> >
> >>> >> >> > - Patrick
> >>> >> >> >
> >>> >> >> >
> >>> >> >> >
> >>> >> >> >
> >>> >> >> >
> >>> >> >>
> >>> >
> >>> >
> >>>
> >>>
> >>
> >
> >
>
>
>
> --
> Marcelo
>
