Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-13 Thread Marcelo Vanzin
On Thu, Nov 13, 2014 at 10:58 AM, Patrick Wendell  wrote:
>> That's true, but note the code I posted activates a profile based on
>> the lack of a property being set, which is why it works. Granted, I
>> did not test that if you activate the other profile, the one with the
>> property check will be disabled.
>
> Ah yeah, good call - so then we'd trigger 2.11-vs-not based on the
> presence of -Dscala-2.11.
>
> Would that fix this issue then? It might be a simpler fix to merge
> into the 1.2 branch than Sandy's patch since we're pretty late in the
> game (though that patch does other things separately that I'd like to
> see end up in Spark soon).

Yeah, that's the idea. As for simplicity, I think Sandy's patch would
be just as simple if it avoided all the changes to isolate the
examples / external stuff into different profiles.

-- 
Marcelo

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-13 Thread Patrick Wendell
> That's true, but note the code I posted activates a profile based on
> the lack of a property being set, which is why it works. Granted, I
> did not test that if you activate the other profile, the one with the
> property check will be disabled.

Ah yeah, good call - so then we'd trigger 2.11-vs-not based on the
presence of -Dscala-2.11.

Would that fix this issue then? It might be a simpler fix to merge
into the 1.2 branch than Sandy's patch since we're pretty late in the
game (though that patch does other things separately that I'd like to
see end up in Spark soon).




Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-13 Thread Marcelo Vanzin
Hey Patrick,

On Thu, Nov 13, 2014 at 10:49 AM, Patrick Wendell  wrote:
> I'm not sure chaining activation works like that. At least in my
> experience activation based on properties only works for properties
> explicitly specified at the command line rather than declared
> elsewhere in the pom.

That's true, but note the code I posted activates a profile based on
the lack of a property being set, which is why it works. Granted, I
did not test that if you activate the other profile, the one with the
property check will be disabled.

> In any case, I think Prashant just didn't document that his patch
> required -Pscala-2.10 explicitly, which is what he said further up in
> the thread. And Sandy has a solution that has better behavior than
> that, which is nice.

Yeah, I saw Sandy's patch now and it's probably a better solution
(since it doesn't abuse the sort of tricky maven profile stuff as
much).

-- 
Marcelo




Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-13 Thread Patrick Wendell
Hey Marcelo,

I'm not sure chaining activation works like that. At least in my
experience activation based on properties only works for properties
explicitly specified at the command line rather than declared
elsewhere in the pom.

https://gist.github.com/pwendell/6834223e68f254e6945e
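To illustrate the point with a hypothetical sketch (not necessarily what the gist shows): a profile activated like the one below reacts only to a property passed on the command line, not to one declared in a pom's <properties> section.

<profile>
  <id>needs-flag</id>
  <activation>
    <property><name>my.flag</name></property>
  </activation>
  <!-- Activated by: mvn -Dmy.flag=true package
       NOT activated by a <my.flag> entry declared in <properties>. -->
</profile>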

In any case, I think Prashant just didn't document that his patch
required -Pscala-2.10 explicitly, which is what he said further up in
the thread. And Sandy has a solution that has better behavior than
that, which is nice.

- Patrick

On Thu, Nov 13, 2014 at 10:15 AM, Sandy Ryza  wrote:
> https://github.com/apache/spark/pull/3239 addresses this
>
> On Thu, Nov 13, 2014 at 10:05 AM, Marcelo Vanzin 
> wrote:
>>
>> Hello there,
>>
>> So I just took a quick look at the pom and I see two problems with it.
>>
>> - "activeByDefault" does not work like you think it does. It only
>> "activates by default" if you do not explicitly activate other
>> profiles. So if you do "mvn package", scala-2.10 will be activated;
>> but if you do "mvn -Pyarn package", it will not.
>>
>> - you need to duplicate the "activation" stuff everywhere where the
>> profile is declared, not just in the root pom. (I spent quite some
>> time yesterday fighting a similar issue...)
>>
>> My suggestion here is to change the activation of scala-2.10 to look like
>> this:
>>
>> 
>>   
>> !scala-2.11
>>   
>> 
>>
>> And change the scala-2.11 profile to do this:
>>
>> 
>>   true
>> 
>>
>> I haven't tested, but in my experience this will activate the
>> scala-2.10 profile by default, unless you explicitly activate the 2.11
>> profile, in which case that property will be set and scala-2.10 will
>> not activate. If you look at examples/pom.xml, that's the same
>> strategy used to choose which hbase profile to activate.
>>
>> Ah, and just to reinforce, the activation logic needs to be copied to
>> other places (e.g. examples/pom.xml, repl/pom.xml, and any other place
>> that has scala-2.x profiles).
>>
>>
>>
>> On Wed, Nov 12, 2014 at 11:14 PM, Patrick Wendell 
>> wrote:
>> > I actually do agree with this - let's see if we can find a solution
>> > that doesn't regress this behavior. Maybe we can simply move the one
>> > kafka example into its own project instead of having it in the
>> > examples project.
>> >
>> > On Wed, Nov 12, 2014 at 11:07 PM, Sandy Ryza 
>> > wrote:
>> >> Currently there are no mandatory profiles required to build Spark.
>> >> I.e.
>> >> "mvn package" just works.  It seems sad that we would need to break
>> >> this.
>> >>
>> >> On Wed, Nov 12, 2014 at 10:59 PM, Patrick Wendell 
>> >> wrote:
>> >>>
>> >>> I think printing an error that says "-Pscala-2.10 must be enabled" is
>> >>> probably okay. It's a slight regression but it's super obvious to
>> >>> users. That could be a more elegant solution than the somewhat
>> >>> complicated monstrosity I proposed on the JIRA.
>> >>>
>> >>> On Wed, Nov 12, 2014 at 10:37 PM, Prashant Sharma
>> >>> 
>> >>> wrote:
>> >>> > One thing we can do is print a helpful error and break. I don't
>> >>> > know how this can be done, but since we can now write Groovy inside
>> >>> > the Maven build, we have more control. (Yay!!)
>> >>> >
>> >>> > Prashant Sharma
>> >>> >
>> >>> >
>> >>> >
>> >>> > On Thu, Nov 13, 2014 at 12:05 PM, Patrick Wendell
>> >>> > 
>> >>> > wrote:
>> >>> >>
>> >>> >> Yeah Sandy and I were chatting about this today and didn't realize
>> >>> >> -Pscala-2.10 was mandatory. This is a fairly invasive change, so I
>> >>> >> was
>> >>> >> thinking maybe we could try to remove that. Also if someone doesn't
>> >>> >> give -Pscala-2.10 it fails in a way that is initially silent, which
>> >>> >> is
>> >>> >> bad because most people won't know to do this.
>> >>> >>
>> >>> >> https://issues.apache.org/jira/browse/SPARK-4375
>> >>> >>
>> >>> >> On Wed, Nov 12, 2014 at 10:29 PM, Prashant Sharma
>> >>> >> 
>> >>> >> wrote:
>> >>> >> > Thanks Patrick, I have one suggestion that we should make passing
>> >>> >> > -Pscala-2.10 mandatory for maven users. I am sorry for not
>> >>> >> > mentioning
>> >>> >> > this
>> >>> >> > before. There is no way around passing that option for Maven
>> >>> >> > users (only). However, this is unnecessary for sbt users because
>> >>> >> > it is
>> >>> >> > added
>> >>> >> > automatically if -Pscala-2.11 is absent.
>> >>> >> >
>> >>> >> >
>> >>> >> > Prashant Sharma
>> >>> >> >
>> >>> >> >
>> >>> >> >
>> >>> >> > On Wed, Nov 12, 2014 at 3:53 PM, Sean Owen 
>> >>> >> > wrote:
>> >>> >> >
>> >>> >> >> - Tip: when you rebase, IntelliJ will temporarily think things
>> >>> >> >> like
>> >>> >> >> the
>> >>> >> >> Kafka module are being removed. Say 'no' when it asks if you
>> >>> >> >> want to
>> >>> >> >> remove
>> >>> >> >> them.
>> >>> >> >> - Can we go straight to Scala 2.11.4?
>> >>> >> >>
>> >>> >> >> On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell
>> >>> >> >> 
>> >>> >> >> wrote:
>> >>> >> >>
>> >>> >> >> > Hey All,
>> >>> >> >> >
>> >>> >> >> > I've just merged a patch that adds support for Scala 2.11 which
>> >>> >> >> > will have some minor implications for the build.

Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-13 Thread Sandy Ryza
https://github.com/apache/spark/pull/3239 addresses this

On Thu, Nov 13, 2014 at 10:05 AM, Marcelo Vanzin 
wrote:

> Hello there,
>
> So I just took a quick look at the pom and I see two problems with it.
>
> - "activeByDefault" does not work like you think it does. It only
> "activates by default" if you do not explicitly activate other
> profiles. So if you do "mvn package", scala-2.10 will be activated;
> but if you do "mvn -Pyarn package", it will not.
>
> - you need to duplicate the "activation" stuff everywhere where the
> profile is declared, not just in the root pom. (I spent quite some
> time yesterday fighting a similar issue...)
>
> My suggestion here is to change the activation of scala-2.10 to look like
> this:
>
> 
>   
> !scala-2.11
>   
> 
>
> And change the scala-2.11 profile to do this:
>
> 
>   true
> 
>
> I haven't tested, but in my experience this will activate the
> scala-2.10 profile by default, unless you explicitly activate the 2.11
> profile, in which case that property will be set and scala-2.10 will
> not activate. If you look at examples/pom.xml, that's the same
> strategy used to choose which hbase profile to activate.
>
> Ah, and just to reinforce, the activation logic needs to be copied to
> other places (e.g. examples/pom.xml, repl/pom.xml, and any other place
> that has scala-2.x profiles).
>
>
>
> On Wed, Nov 12, 2014 at 11:14 PM, Patrick Wendell 
> wrote:
> > I actually do agree with this - let's see if we can find a solution
> > that doesn't regress this behavior. Maybe we can simply move the one
> > kafka example into its own project instead of having it in the
> > examples project.
> >
> > On Wed, Nov 12, 2014 at 11:07 PM, Sandy Ryza 
> wrote:
> >> Currently there are no mandatory profiles required to build Spark.  I.e.
> >> "mvn package" just works.  It seems sad that we would need to break
> this.
> >>
> >> On Wed, Nov 12, 2014 at 10:59 PM, Patrick Wendell 
> >> wrote:
> >>>
> >>> I think printing an error that says "-Pscala-2.10 must be enabled" is
> >>> probably okay. It's a slight regression but it's super obvious to
> >>> users. That could be a more elegant solution than the somewhat
> >>> complicated monstrosity I proposed on the JIRA.
> >>>
> >>> On Wed, Nov 12, 2014 at 10:37 PM, Prashant Sharma <
> scrapco...@gmail.com>
> >>> wrote:
> >>> > One thing we can do is print a helpful error and break. I don't
> >>> > know how this can be done, but since we can now write Groovy inside
> >>> > the Maven build, we have more control. (Yay!!)
> >>> >
> >>> > Prashant Sharma
> >>> >
> >>> >
> >>> >
> >>> > On Thu, Nov 13, 2014 at 12:05 PM, Patrick Wendell <
> pwend...@gmail.com>
> >>> > wrote:
> >>> >>
> >>> >> Yeah Sandy and I were chatting about this today and didn't realize
> >>> >> -Pscala-2.10 was mandatory. This is a fairly invasive change, so I
> was
> >>> >> thinking maybe we could try to remove that. Also if someone doesn't
> >>> >> give -Pscala-2.10 it fails in a way that is initially silent, which
> is
> >>> >> bad because most people won't know to do this.
> >>> >>
> >>> >> https://issues.apache.org/jira/browse/SPARK-4375
> >>> >>
> >>> >> On Wed, Nov 12, 2014 at 10:29 PM, Prashant Sharma
> >>> >> 
> >>> >> wrote:
> >>> >> > Thanks Patrick, I have one suggestion that we should make passing
> >>> >> > -Pscala-2.10 mandatory for maven users. I am sorry for not
> mentioning
> >>> >> > this
> >>> >> > before. There is no way around passing that option for Maven
> >>> >> > users (only). However, this is unnecessary for sbt users because
> it is
> >>> >> > added
> >>> >> > automatically if -Pscala-2.11 is absent.
> >>> >> >
> >>> >> >
> >>> >> > Prashant Sharma
> >>> >> >
> >>> >> >
> >>> >> >
> >>> >> > On Wed, Nov 12, 2014 at 3:53 PM, Sean Owen 
> >>> >> > wrote:
> >>> >> >
> >>> >> >> - Tip: when you rebase, IntelliJ will temporarily think things
> like
> >>> >> >> the
> >>> >> >> Kafka module are being removed. Say 'no' when it asks if you
> want to
> >>> >> >> remove
> >>> >> >> them.
> >>> >> >> - Can we go straight to Scala 2.11.4?
> >>> >> >>
> >>> >> >> On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell
> >>> >> >> 
> >>> >> >> wrote:
> >>> >> >>
> >>> >> >> > Hey All,
> >>> >> >> >
> >>> >> >> > I've just merged a patch that adds support for Scala 2.11 which
> >>> >> >> > will
> >>> >> >> > have some minor implications for the build. These are due to
> the
> >>> >> >> > complexities of supporting two versions of Scala in a single
> >>> >> >> > project.
> >>> >> >> >
> >>> >> >> > 1. The JDBC server will now require a special flag to build
> >>> >> >> > -Phive-thriftserver on top of the existing flag -Phive. This is
> >>> >> >> > because some build permutations (only in Scala 2.11) won't
> support
> >>> >> >> > the
> >>> >> >> > JDBC server yet due to transitive dependency conflicts.
> >>> >> >> >
> >>> >> >> > 2. The build now uses non-standard source layouts in a few
> >>> >> >> > additional
> >>> >> >> > places (we already did this for the Hive project) - the repl and
> >>> >> >> > the examples modules.

Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-13 Thread Marcelo Vanzin
Hello there,

So I just took a quick look at the pom and I see two problems with it.

- "activeByDefault" does not work like you think it does. It only
"activates by default" if you do not explicitly activate other
profiles. So if you do "mvn package", scala-2.10 will be activated;
but if you do "mvn -Pyarn package", it will not.

- you need to duplicate the "activation" stuff everywhere where the
profile is declared, not just in the root pom. (I spent quite some
time yesterday fighting a similar issue...)
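The problematic pattern, for reference (a sketch of the activation in question):

<profile>
  <id>scala-2.10</id>
  <activation>
    <!-- Only applies when NO profile is explicitly activated, so
         "mvn -Pyarn package" silently turns this profile off. -->
    <activeByDefault>true</activeByDefault>
  </activation>
</profile>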

My suggestion here is to change the activation of scala-2.10 to look like this:


  
!scala-2.11
  


And change the scala-2.11 profile to do this:


  true


I haven't tested, but in my experience this will activate the
scala-2.10 profile by default, unless you explicitly activate the 2.11
profile, in which case that property will be set and scala-2.10 will
not activate. If you look at examples/pom.xml, that's the same
strategy used to choose which hbase profile to activate.
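Putting the two fragments above together, the relevant profiles in the root pom would look roughly like this (an untested sketch; the scala-2.11 property name just follows the convention used above, and the placeholder comments stand in for the real profile contents):

<profile>
  <id>scala-2.10</id>
  <activation>
    <!-- Active unless something sets the scala-2.11 property. -->
    <property><name>!scala-2.11</name></property>
  </activation>
  <!-- 2.10-specific dependencies/properties go here -->
</profile>
<profile>
  <id>scala-2.11</id>
  <properties>
    <!-- Setting this property defeats scala-2.10's "!scala-2.11" check. -->
    <scala-2.11>true</scala-2.11>
  </properties>
  <!-- 2.11-specific dependencies/properties go here -->
</profile>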

Ah, and just to reinforce, the activation logic needs to be copied to
other places (e.g. examples/pom.xml, repl/pom.xml, and any other place
that has scala-2.x profiles).



On Wed, Nov 12, 2014 at 11:14 PM, Patrick Wendell  wrote:
> I actually do agree with this - let's see if we can find a solution
> that doesn't regress this behavior. Maybe we can simply move the one
> kafka example into its own project instead of having it in the
> examples project.
>
> On Wed, Nov 12, 2014 at 11:07 PM, Sandy Ryza  wrote:
>> Currently there are no mandatory profiles required to build Spark.  I.e.
>> "mvn package" just works.  It seems sad that we would need to break this.
>>
>> On Wed, Nov 12, 2014 at 10:59 PM, Patrick Wendell 
>> wrote:
>>>
>>> I think printing an error that says "-Pscala-2.10 must be enabled" is
>>> probably okay. It's a slight regression but it's super obvious to
>>> users. That could be a more elegant solution than the somewhat
>>> complicated monstrosity I proposed on the JIRA.
>>>
>>> On Wed, Nov 12, 2014 at 10:37 PM, Prashant Sharma 
>>> wrote:
>>> > One thing we can do is print a helpful error and break. I don't know
>>> > how this can be done, but since we can now write Groovy inside the
>>> > Maven build, we have more control. (Yay!!)
>>> >
>>> > Prashant Sharma
>>> >
>>> >
>>> >
>>> > On Thu, Nov 13, 2014 at 12:05 PM, Patrick Wendell 
>>> > wrote:
>>> >>
>>> >> Yeah Sandy and I were chatting about this today and didn't realize
>>> >> -Pscala-2.10 was mandatory. This is a fairly invasive change, so I was
>>> >> thinking maybe we could try to remove that. Also if someone doesn't
>>> >> give -Pscala-2.10 it fails in a way that is initially silent, which is
>>> >> bad because most people won't know to do this.
>>> >>
>>> >> https://issues.apache.org/jira/browse/SPARK-4375
>>> >>
>>> >> On Wed, Nov 12, 2014 at 10:29 PM, Prashant Sharma
>>> >> 
>>> >> wrote:
>>> >> > Thanks Patrick, I have one suggestion that we should make passing
>>> >> > -Pscala-2.10 mandatory for maven users. I am sorry for not mentioning
>>> >> > this
>>> >> > before. There is no way around passing that option for Maven
>>> >> > users (only). However, this is unnecessary for sbt users because it is
>>> >> > added
>>> >> > automatically if -Pscala-2.11 is absent.
>>> >> >
>>> >> >
>>> >> > Prashant Sharma
>>> >> >
>>> >> >
>>> >> >
>>> >> > On Wed, Nov 12, 2014 at 3:53 PM, Sean Owen 
>>> >> > wrote:
>>> >> >
>>> >> >> - Tip: when you rebase, IntelliJ will temporarily think things like
>>> >> >> the
>>> >> >> Kafka module are being removed. Say 'no' when it asks if you want to
>>> >> >> remove
>>> >> >> them.
>>> >> >> - Can we go straight to Scala 2.11.4?
>>> >> >>
>>> >> >> On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell
>>> >> >> 
>>> >> >> wrote:
>>> >> >>
>>> >> >> > Hey All,
>>> >> >> >
>>> >> >> > I've just merged a patch that adds support for Scala 2.11 which
>>> >> >> > will
>>> >> >> > have some minor implications for the build. These are due to the
>>> >> >> > complexities of supporting two versions of Scala in a single
>>> >> >> > project.
>>> >> >> >
>>> >> >> > 1. The JDBC server will now require a special flag to build
>>> >> >> > -Phive-thriftserver on top of the existing flag -Phive. This is
>>> >> >> > because some build permutations (only in Scala 2.11) won't support
>>> >> >> > the
>>> >> >> > JDBC server yet due to transitive dependency conflicts.
>>> >> >> >
>>> >> >> > 2. The build now uses non-standard source layouts in a few
>>> >> >> > additional
>>> >> >> > places (we already did this for the Hive project) - the repl and
>>> >> >> > the
>>> >> >> > examples modules. This is just fine for maven/sbt, but it may
>>> >> >> > affect
>>> >> >> > users who import the build in IDE's that are using these projects
>>> >> >> > and
>>> >> >> > want to build Spark from the IDE. I'm going to update our wiki to
>>> >> >> > include full instructions for making this work well in IntelliJ.
>>> >> >> >
>>> >> >> > If there are any other build related issues please respond to this
>>> >> >> > thread and we'll make sure they get sorted out.

Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-12 Thread Patrick Wendell
I actually do agree with this - let's see if we can find a solution
that doesn't regress this behavior. Maybe we can simply move the one
kafka example into its own project instead of having it in the
examples project.

On Wed, Nov 12, 2014 at 11:07 PM, Sandy Ryza  wrote:
> Currently there are no mandatory profiles required to build Spark.  I.e.
> "mvn package" just works.  It seems sad that we would need to break this.
>
> On Wed, Nov 12, 2014 at 10:59 PM, Patrick Wendell 
> wrote:
>>
>> I think printing an error that says "-Pscala-2.10 must be enabled" is
>> probably okay. It's a slight regression but it's super obvious to
>> users. That could be a more elegant solution than the somewhat
>> complicated monstrosity I proposed on the JIRA.
>>
>> On Wed, Nov 12, 2014 at 10:37 PM, Prashant Sharma 
>> wrote:
>> > One thing we can do is print a helpful error and break. I don't know
>> > how this can be done, but since we can now write Groovy inside the
>> > Maven build, we have more control. (Yay!!)
>> >
>> > Prashant Sharma
>> >
>> >
>> >
>> > On Thu, Nov 13, 2014 at 12:05 PM, Patrick Wendell 
>> > wrote:
>> >>
>> >> Yeah Sandy and I were chatting about this today and didn't realize
>> >> -Pscala-2.10 was mandatory. This is a fairly invasive change, so I was
>> >> thinking maybe we could try to remove that. Also if someone doesn't
>> >> give -Pscala-2.10 it fails in a way that is initially silent, which is
>> >> bad because most people won't know to do this.
>> >>
>> >> https://issues.apache.org/jira/browse/SPARK-4375
>> >>
>> >> On Wed, Nov 12, 2014 at 10:29 PM, Prashant Sharma
>> >> 
>> >> wrote:
>> >> > Thanks Patrick, I have one suggestion that we should make passing
>> >> > -Pscala-2.10 mandatory for maven users. I am sorry for not mentioning
>> >> > this
>> >> > before. There is no way around passing that option for Maven
>> >> > users (only). However, this is unnecessary for sbt users because it is
>> >> > added
>> >> > automatically if -Pscala-2.11 is absent.
>> >> >
>> >> >
>> >> > Prashant Sharma
>> >> >
>> >> >
>> >> >
>> >> > On Wed, Nov 12, 2014 at 3:53 PM, Sean Owen 
>> >> > wrote:
>> >> >
>> >> >> - Tip: when you rebase, IntelliJ will temporarily think things like
>> >> >> the
>> >> >> Kafka module are being removed. Say 'no' when it asks if you want to
>> >> >> remove
>> >> >> them.
>> >> >> - Can we go straight to Scala 2.11.4?
>> >> >>
>> >> >> On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell
>> >> >> 
>> >> >> wrote:
>> >> >>
>> >> >> > Hey All,
>> >> >> >
>> >> >> > I've just merged a patch that adds support for Scala 2.11 which
>> >> >> > will
>> >> >> > have some minor implications for the build. These are due to the
>> >> >> > complexities of supporting two versions of Scala in a single
>> >> >> > project.
>> >> >> >
>> >> >> > 1. The JDBC server will now require a special flag to build
>> >> >> > -Phive-thriftserver on top of the existing flag -Phive. This is
>> >> >> > because some build permutations (only in Scala 2.11) won't support
>> >> >> > the
>> >> >> > JDBC server yet due to transitive dependency conflicts.
>> >> >> >
>> >> >> > 2. The build now uses non-standard source layouts in a few
>> >> >> > additional
>> >> >> > places (we already did this for the Hive project) - the repl and
>> >> >> > the
>> >> >> > examples modules. This is just fine for maven/sbt, but it may
>> >> >> > affect
>> >> >> > users who import the build in IDE's that are using these projects
>> >> >> > and
>> >> >> > want to build Spark from the IDE. I'm going to update our wiki to
>> >> >> > include full instructions for making this work well in IntelliJ.
>> >> >> >
>> >> >> > If there are any other build related issues please respond to this
>> >> >> > thread and we'll make sure they get sorted out. Thanks to Prashant
>> >> >> > Sharma who is the author of this feature!
>> >> >> >
>> >> >> > - Patrick
>> >> >> >
>> >> >> >
>> >> >> >
>> >> >> >
>> >> >>
>> >
>> >
>>
>>
>




Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-12 Thread Sandy Ryza
Currently there are no mandatory profiles required to build Spark.  I.e.
"mvn package" just works.  It seems sad that we would need to break this.

On Wed, Nov 12, 2014 at 10:59 PM, Patrick Wendell 
wrote:

> I think printing an error that says "-Pscala-2.10 must be enabled" is
> probably okay. It's a slight regression but it's super obvious to
> users. That could be a more elegant solution than the somewhat
> complicated monstrosity I proposed on the JIRA.
>
> On Wed, Nov 12, 2014 at 10:37 PM, Prashant Sharma 
> wrote:
> > One thing we can do is print a helpful error and break. I don't know
> > how this can be done, but since we can now write Groovy inside the Maven
> > build, we have more control. (Yay!!)
> >
> > Prashant Sharma
> >
> >
> >
> > On Thu, Nov 13, 2014 at 12:05 PM, Patrick Wendell 
> > wrote:
> >>
> >> Yeah Sandy and I were chatting about this today and didn't realize
> >> -Pscala-2.10 was mandatory. This is a fairly invasive change, so I was
> >> thinking maybe we could try to remove that. Also if someone doesn't
> >> give -Pscala-2.10 it fails in a way that is initially silent, which is
> >> bad because most people won't know to do this.
> >>
> >> https://issues.apache.org/jira/browse/SPARK-4375
> >>
> >> On Wed, Nov 12, 2014 at 10:29 PM, Prashant Sharma  >
> >> wrote:
> >> > Thanks Patrick, I have one suggestion that we should make passing
> >> > -Pscala-2.10 mandatory for maven users. I am sorry for not mentioning
> >> > this
> >> > before. There is no way around passing that option for Maven
> >> > users (only). However, this is unnecessary for sbt users because it is
> >> > added
> >> > automatically if -Pscala-2.11 is absent.
> >> >
> >> >
> >> > Prashant Sharma
> >> >
> >> >
> >> >
> >> > On Wed, Nov 12, 2014 at 3:53 PM, Sean Owen 
> wrote:
> >> >
> >> >> - Tip: when you rebase, IntelliJ will temporarily think things like
> the
> >> >> Kafka module are being removed. Say 'no' when it asks if you want to
> >> >> remove
> >> >> them.
> >> >> - Can we go straight to Scala 2.11.4?
> >> >>
> >> >> On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell  >
> >> >> wrote:
> >> >>
> >> >> > Hey All,
> >> >> >
> >> >> > I've just merged a patch that adds support for Scala 2.11 which
> will
> >> >> > have some minor implications for the build. These are due to the
> >> >> > complexities of supporting two versions of Scala in a single
> project.
> >> >> >
> >> >> > 1. The JDBC server will now require a special flag to build
> >> >> > -Phive-thriftserver on top of the existing flag -Phive. This is
> >> >> > because some build permutations (only in Scala 2.11) won't support
> >> >> > the
> >> >> > JDBC server yet due to transitive dependency conflicts.
> >> >> >
> >> >> > 2. The build now uses non-standard source layouts in a few
> additional
> >> >> > places (we already did this for the Hive project) - the repl and
> the
> >> >> > examples modules. This is just fine for maven/sbt, but it may
> affect
> >> >> > users who import the build in IDE's that are using these projects
> and
> >> >> > want to build Spark from the IDE. I'm going to update our wiki to
> >> >> > include full instructions for making this work well in IntelliJ.
> >> >> >
> >> >> > If there are any other build related issues please respond to this
> >> >> > thread and we'll make sure they get sorted out. Thanks to Prashant
> >> >> > Sharma who is the author of this feature!
> >> >> >
> >> >> > - Patrick
> >> >> >
> >> >> >
> >> >> >
> >> >> >
> >> >>
> >
> >
>
>
>


Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-12 Thread Patrick Wendell
I think printing an error that says "-Pscala-2.10 must be enabled" is
probably okay. It's a slight regression but it's super obvious to
users. That could be a more elegant solution than the somewhat
complicated monstrosity I proposed on the JIRA.

On Wed, Nov 12, 2014 at 10:37 PM, Prashant Sharma  wrote:
> One thing we can do is print a helpful error and break. I don't know
> how this can be done, but since we can now write Groovy inside the Maven
> build, we have more control. (Yay!!)
>
> Prashant Sharma
>
>
>
> On Thu, Nov 13, 2014 at 12:05 PM, Patrick Wendell 
> wrote:
>>
>> Yeah Sandy and I were chatting about this today and didn't realize
>> -Pscala-2.10 was mandatory. This is a fairly invasive change, so I was
>> thinking maybe we could try to remove that. Also if someone doesn't
>> give -Pscala-2.10 it fails in a way that is initially silent, which is
>> bad because most people won't know to do this.
>>
>> https://issues.apache.org/jira/browse/SPARK-4375
>>
>> On Wed, Nov 12, 2014 at 10:29 PM, Prashant Sharma 
>> wrote:
>> > Thanks Patrick, I have one suggestion that we should make passing
>> > -Pscala-2.10 mandatory for maven users. I am sorry for not mentioning
>> > this
>> > before. There is no way around passing that option for Maven
>> > users (only). However, this is unnecessary for sbt users because it is
>> > added
>> > automatically if -Pscala-2.11 is absent.
>> >
>> >
>> > Prashant Sharma
>> >
>> >
>> >
>> > On Wed, Nov 12, 2014 at 3:53 PM, Sean Owen  wrote:
>> >
>> >> - Tip: when you rebase, IntelliJ will temporarily think things like the
>> >> Kafka module are being removed. Say 'no' when it asks if you want to
>> >> remove
>> >> them.
>> >> - Can we go straight to Scala 2.11.4?
>> >>
>> >> On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell 
>> >> wrote:
>> >>
>> >> > Hey All,
>> >> >
>> >> > I've just merged a patch that adds support for Scala 2.11 which will
>> >> > have some minor implications for the build. These are due to the
>> >> > complexities of supporting two versions of Scala in a single project.
>> >> >
>> >> > 1. The JDBC server will now require a special flag to build
>> >> > -Phive-thriftserver on top of the existing flag -Phive. This is
>> >> > because some build permutations (only in Scala 2.11) won't support
>> >> > the
>> >> > JDBC server yet due to transitive dependency conflicts.
>> >> >
>> >> > 2. The build now uses non-standard source layouts in a few additional
>> >> > places (we already did this for the Hive project) - the repl and the
>> >> > examples modules. This is just fine for maven/sbt, but it may affect
>> >> > users who import the build in IDE's that are using these projects and
>> >> > want to build Spark from the IDE. I'm going to update our wiki to
>> >> > include full instructions for making this work well in IntelliJ.
>> >> >
>> >> > If there are any other build related issues please respond to this
>> >> > thread and we'll make sure they get sorted out. Thanks to Prashant
>> >> > Sharma who is the author of this feature!
>> >> >
>> >> > - Patrick
>> >> >
>> >> >
>> >> >
>> >>
>
>




Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-12 Thread Prashant Sharma
One thing we can do is print a helpful error and break. I don't know
how this can be done, but since we can now write Groovy inside the Maven
build, we have more control. (Yay!!)
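For example, the stock maven-enforcer-plugin could fail the build with a clear message, with no custom Groovy at all (an untested sketch; the checked property name and the message text are placeholders, assuming each scala-2.x profile sets some marker property):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>require-scala-profile</id>
      <goals><goal>enforce</goal></goals>
      <configuration>
        <rules>
          <requireProperty>
            <!-- Assumed to be set by the scala-2.10 / scala-2.11 profiles. -->
            <property>scala.binary.version</property>
            <message>You must build with either -Pscala-2.10 or -Pscala-2.11.</message>
          </requireProperty>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>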

Prashant Sharma



On Thu, Nov 13, 2014 at 12:05 PM, Patrick Wendell 
wrote:

> Yeah Sandy and I were chatting about this today and didn't realize
> -Pscala-2.10 was mandatory. This is a fairly invasive change, so I was
> thinking maybe we could try to remove that. Also if someone doesn't
> give -Pscala-2.10 it fails in a way that is initially silent, which is
> bad because most people won't know to do this.
>
> https://issues.apache.org/jira/browse/SPARK-4375
>
> On Wed, Nov 12, 2014 at 10:29 PM, Prashant Sharma 
> wrote:
> > Thanks Patrick, I have one suggestion that we should make passing
> > -Pscala-2.10 mandatory for maven users. I am sorry for not mentioning
> this
> > before. There is no way around passing that option for Maven
> > users (only). However, this is unnecessary for sbt users because it is
> added
> > automatically if -Pscala-2.11 is absent.
> >
> >
> > Prashant Sharma
> >
> >
> >
> > On Wed, Nov 12, 2014 at 3:53 PM, Sean Owen  wrote:
> >
> >> - Tip: when you rebase, IntelliJ will temporarily think things like the
> >> Kafka module are being removed. Say 'no' when it asks if you want to
> remove
> >> them.
> >> - Can we go straight to Scala 2.11.4?
> >>
> >> On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell 
> >> wrote:
> >>
> >> > Hey All,
> >> >
> >> > I've just merged a patch that adds support for Scala 2.11 which will
> >> > have some minor implications for the build. These are due to the
> >> > complexities of supporting two versions of Scala in a single project.
> >> >
> >> > 1. The JDBC server will now require a special flag to build
> >> > -Phive-thriftserver on top of the existing flag -Phive. This is
> >> > because some build permutations (only in Scala 2.11) won't support the
> >> > JDBC server yet due to transitive dependency conflicts.
> >> >
> >> > 2. The build now uses non-standard source layouts in a few additional
> >> > places (we already did this for the Hive project) - the repl and the
> >> > examples modules. This is just fine for maven/sbt, but it may affect
> >> > users who import the build in IDEs that are using these projects and
> >> > want to build Spark from the IDE. I'm going to update our wiki to
> >> > include full instructions for making this work well in IntelliJ.
> >> >
> >> > If there are any other build related issues please respond to this
> >> > thread and we'll make sure they get sorted out. Thanks to Prashant
> >> > Sharma who is the author of this feature!
> >> >
> >> > - Patrick
> >> >
> >> >
> >> >
> >>
>


Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-12 Thread Patrick Wendell
Yeah Sandy and I were chatting about this today and didn't realize
-Pscala-2.10 was mandatory. This is a fairly invasive change, so I was
thinking maybe we could try to remove that. Also if someone doesn't
give -Pscala-2.10 it fails in a way that is initially silent, which is
bad because most people won't know to do this.

https://issues.apache.org/jira/browse/SPARK-4375
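
A hedged sketch of the direction SPARK-4375 points at: activate the 2.10
profile whenever the scala-2.11 property is NOT set, so Maven users get a
working default without passing -Pscala-2.10. Profile ids, property names,
and versions below are illustrative assumptions, not Spark's actual poms:

```xml
<profiles>
  <profile>
    <id>scala-2.10</id>
    <activation>
      <property>
        <!-- "!" means: active when this property is absent -->
        <name>!scala-2.11</name>
      </property>
    </activation>
    <properties>
      <scala.version>2.10.4</scala.version>
      <scala.binary.version>2.10</scala.binary.version>
    </properties>
  </profile>
  <profile>
    <id>scala-2.11</id>
    <activation>
      <property>
        <name>scala-2.11</name>
      </property>
    </activation>
    <properties>
      <scala.version>2.11.2</scala.version>
      <scala.binary.version>2.11</scala.binary.version>
    </properties>
  </profile>
</profiles>
```

Under this shape the switch is `-Dscala-2.11` rather than `-Pscala-2.11`:
plain `mvn package` picks 2.10, and setting the property flips both profiles
at once. Note the caveat mentioned upthread: activating the other profile
with `-P` alone would not disable the `!property` activation.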

On Wed, Nov 12, 2014 at 10:29 PM, Prashant Sharma  wrote:
> Thanks Patrick, I have one suggestion that we should make passing
> -Pscala-2.10 mandatory for maven users. I am sorry for not mentioning this
> before. There is no way around passing that option for Maven
> users (only). However, this is unnecessary for sbt users because it is added
> automatically if -Pscala-2.11 is absent.
>
>
> Prashant Sharma
>
>
>
> On Wed, Nov 12, 2014 at 3:53 PM, Sean Owen  wrote:
>
>> - Tip: when you rebase, IntelliJ will temporarily think things like the
>> Kafka module are being removed. Say 'no' when it asks if you want to remove
>> them.
>> - Can we go straight to Scala 2.11.4?
>>
>> On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell 
>> wrote:
>>
>> > Hey All,
>> >
>> > I've just merged a patch that adds support for Scala 2.11 which will
>> > have some minor implications for the build. These are due to the
>> > complexities of supporting two versions of Scala in a single project.
>> >
>> > 1. The JDBC server will now require a special flag to build
>> > -Phive-thriftserver on top of the existing flag -Phive. This is
>> > because some build permutations (only in Scala 2.11) won't support the
>> > JDBC server yet due to transitive dependency conflicts.
>> >
>> > 2. The build now uses non-standard source layouts in a few additional
>> > places (we already did this for the Hive project) - the repl and the
>> > examples modules. This is just fine for maven/sbt, but it may affect
>> > users who import the build in IDEs that are using these projects and
>> > want to build Spark from the IDE. I'm going to update our wiki to
>> > include full instructions for making this work well in IntelliJ.
>> >
>> > If there are any other build related issues please respond to this
>> > thread and we'll make sure they get sorted out. Thanks to Prashant
>> > Sharma who is the author of this feature!
>> >
>> > - Patrick
>> >
>> > -
>> > To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>> > For additional commands, e-mail: dev-h...@spark.apache.org
>> >
>> >
>>




Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-12 Thread Prashant Sharma
For Scala 2.11.4, there are minor changes needed in the repl code. I can do
that if it is a high priority.

Prashant Sharma



On Thu, Nov 13, 2014 at 11:59 AM, Prashant Sharma 
wrote:

> Thanks Patrick, I have one suggestion that we should make passing
> -Pscala-2.10 mandatory for maven users. I am sorry for not mentioning this
> before. There is no way around passing that option for Maven
> users (only). However, this is unnecessary for sbt users because it is added
> automatically if -Pscala-2.11 is absent.
>
>
> Prashant Sharma
>
>
>
> On Wed, Nov 12, 2014 at 3:53 PM, Sean Owen  wrote:
>
>> - Tip: when you rebase, IntelliJ will temporarily think things like the
>> Kafka module are being removed. Say 'no' when it asks if you want to
>> remove
>> them.
>> - Can we go straight to Scala 2.11.4?
>>
>> On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell 
>> wrote:
>>
>> > Hey All,
>> >
>> > I've just merged a patch that adds support for Scala 2.11 which will
>> > have some minor implications for the build. These are due to the
>> > complexities of supporting two versions of Scala in a single project.
>> >
>> > 1. The JDBC server will now require a special flag to build
>> > -Phive-thriftserver on top of the existing flag -Phive. This is
>> > because some build permutations (only in Scala 2.11) won't support the
>> > JDBC server yet due to transitive dependency conflicts.
>> >
>> > 2. The build now uses non-standard source layouts in a few additional
>> > places (we already did this for the Hive project) - the repl and the
>> > examples modules. This is just fine for maven/sbt, but it may affect
>> > users who import the build in IDEs that are using these projects and
>> > want to build Spark from the IDE. I'm going to update our wiki to
>> > include full instructions for making this work well in IntelliJ.
>> >
>> > If there are any other build related issues please respond to this
>> > thread and we'll make sure they get sorted out. Thanks to Prashant
>> > Sharma who is the author of this feature!
>> >
>> > - Patrick
>> >
>> > -
>> > To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>> > For additional commands, e-mail: dev-h...@spark.apache.org
>> >
>> >
>>
>
>


Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-12 Thread Prashant Sharma
Thanks Patrick, I have one suggestion that we should make passing
-Pscala-2.10 mandatory for maven users. I am sorry for not mentioning this
before. There is no way around passing that option for Maven
users (only). However, this is unnecessary for sbt users because it is added
automatically if -Pscala-2.11 is absent.
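
On the sbt side, the "added automatically" behaviour can be sketched roughly
as below; this is an illustrative assumption, not the actual code in Spark's
SparkBuild.scala:

```scala
// Illustrative sketch only (not Spark's real build code): default to the
// scala-2.10 profile unless the build was started with -Dscala-2.11.
object ProfileDefaults {
  // Profiles the sbt build would forward to its Maven-style pom reader.
  def mavenProfiles: Seq[String] =
    if (sys.props.contains("scala-2.11")) Seq("scala-2.11")
    else Seq("scala-2.10") // added automatically when -Dscala-2.11 is absent
}
```

With this shape, sbt users never have to name a Scala profile explicitly.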


Prashant Sharma



On Wed, Nov 12, 2014 at 3:53 PM, Sean Owen  wrote:

> - Tip: when you rebase, IntelliJ will temporarily think things like the
> Kafka module are being removed. Say 'no' when it asks if you want to remove
> them.
> - Can we go straight to Scala 2.11.4?
>
> On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell 
> wrote:
>
> > Hey All,
> >
> > I've just merged a patch that adds support for Scala 2.11 which will
> > have some minor implications for the build. These are due to the
> > complexities of supporting two versions of Scala in a single project.
> >
> > 1. The JDBC server will now require a special flag to build
> > -Phive-thriftserver on top of the existing flag -Phive. This is
> > because some build permutations (only in Scala 2.11) won't support the
> > JDBC server yet due to transitive dependency conflicts.
> >
> > 2. The build now uses non-standard source layouts in a few additional
> > places (we already did this for the Hive project) - the repl and the
> > examples modules. This is just fine for maven/sbt, but it may affect
> > users who import the build in IDEs that are using these projects and
> > want to build Spark from the IDE. I'm going to update our wiki to
> > include full instructions for making this work well in IntelliJ.
> >
> > If there are any other build related issues please respond to this
> > thread and we'll make sure they get sorted out. Thanks to Prashant
> > Sharma who is the author of this feature!
> >
> > - Patrick
> >
> >
> >
>


Re: [NOTICE] [BUILD] Minor changes to Spark's build

2014-11-12 Thread Sean Owen
- Tip: when you rebase, IntelliJ will temporarily think things like the
Kafka module are being removed. Say 'no' when it asks if you want to remove
them.
- Can we go straight to Scala 2.11.4?

On Wed, Nov 12, 2014 at 5:47 AM, Patrick Wendell  wrote:

> Hey All,
>
> I've just merged a patch that adds support for Scala 2.11 which will
> have some minor implications for the build. These are due to the
> complexities of supporting two versions of Scala in a single project.
>
> 1. The JDBC server will now require a special flag to build
> -Phive-thriftserver on top of the existing flag -Phive. This is
> because some build permutations (only in Scala 2.11) won't support the
> JDBC server yet due to transitive dependency conflicts.
>
> 2. The build now uses non-standard source layouts in a few additional
> places (we already did this for the Hive project) - the repl and the
> examples modules. This is just fine for maven/sbt, but it may affect
> users who import the build in IDEs that are using these projects and
> want to build Spark from the IDE. I'm going to update our wiki to
> include full instructions for making this work well in IntelliJ.
>
> If there are any other build related issues please respond to this
> thread and we'll make sure they get sorted out. Thanks to Prashant
> Sharma who is the author of this feature!
>
> - Patrick
>
>
>


[NOTICE] [BUILD] Minor changes to Spark's build

2014-11-11 Thread Patrick Wendell
Hey All,

I've just merged a patch that adds support for Scala 2.11 which will
have some minor implications for the build. These are due to the
complexities of supporting two versions of Scala in a single project.

1. The JDBC server will now require a special build flag,
-Phive-thriftserver, on top of the existing -Phive flag. This is
because some build permutations (only in Scala 2.11) won't support the
JDBC server yet due to transitive dependency conflicts.

2. The build now uses non-standard source layouts in a few additional
places (we already did this for the Hive project) - the repl and the
examples modules. This is just fine for maven/sbt, but it may affect
users who import the build in IDEs that are using these projects and
want to build Spark from the IDE. I'm going to update our wiki to
include full instructions for making this work well in IntelliJ.
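
For reference, a non-standard layout is usually wired into Maven via
build-helper-maven-plugin's add-source goal, roughly as in the hedged sketch
below; the directory names are illustrative assumptions, not the exact ones
in Spark's poms:

```xml
<!-- Sketch only: register an extra source root alongside the standard one.
     Directory names are assumptions for illustration. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>add-extra-sources</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <source>src/main/scala</source>
          <source>scala-${scala.binary.version}/src/main/scala</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>
```

IDEs that import the Maven model (IntelliJ, Eclipse/m2e) generally pick up
these extra roots automatically, which is why importing the poms is the
recommended path.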

If there are any other build-related issues, please respond to this
thread and we'll make sure they get sorted out. Thanks to Prashant
Sharma who is the author of this feature!

- Patrick
