Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-12 Thread Sean Owen
Yeah, was the issue that it had to be built with Maven to show the error
while this uses SBT -- or vice versa? That's why the existing test
didn't detect it. I was just thinking of adding one more of these non-PR
builds, but I forget if there was a reason this is hard. Certainly not
worth building for each PR.

On Mon, Oct 12, 2015 at 5:16 PM, Patrick Wendell  wrote:
> We already do automated compile testing for Scala 2.11 similar to Hadoop
> versions:
>
> https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/
> https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/job/Spark-master-Scala211-Compile/buildTimeTrend
>
>
> If you look, this build takes 7-10 minutes, so it's a nontrivial increase to
> add it to all new PR's. Also, it's only broken once in the last few months
> (despite many patches going in) - a pretty low failure rate. For scenarios
> like this it's better to test it asynchronously. We can even just revert a
> patch immediately if it's found to break 2.11.
>
> Put another way - we typically have 1000 patches or more per release. Even
> at one jenkins run per patch: 7 minutes * 1000 = 7 days of developer
> productivity loss. Compare that to having a few times where we have to
> revert a patch and ask someone to resubmit (which maybe takes at most one
> hour)... it's not worth it.
>
> - Patrick
>
> On Mon, Oct 12, 2015 at 8:24 AM, Sean Owen  wrote:
>>
>> There are many Jenkins jobs besides the pull request builder that
>> build against various Hadoop combinations, for example, in the
>> background. Is there an obstacle to building vs 2.11 on both Maven and
>> SBT this way?
>>
>> On Mon, Oct 12, 2015 at 2:55 PM, Iulian Dragoș
>>  wrote:
>> > Anything that can be done by a machine should be done by a machine. I am
>> > not sure we have enough data to say it's only once or twice per release,
>> > and even if we were to issue a PR for each breakage, it's additional load
>> > on committers and reviewers, not to mention our own work. I personally
>> > don't see how 2-3 minutes of compute time per PR can justify hours of
>> > work plus reviews.
>
>




Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-12 Thread Sean Owen
There are many Jenkins jobs besides the pull request builder that
build against various Hadoop combinations, for example, in the
background. Is there an obstacle to building vs 2.11 on both Maven and
SBT this way?

On Mon, Oct 12, 2015 at 2:55 PM, Iulian Dragoș
 wrote:
> Anything that can be done by a machine should be done by a machine. I am not
> sure we have enough data to say it's only once or twice per release, and
> even if we were to issue a PR for each breakage, it's additional load on
> committers and reviewers, not to mention our own work. I personally don't
> see how 2-3 minutes of compute time per PR can justify hours of work plus
> reviews.




Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-12 Thread Patrick Wendell
It's really easy to create and modify those builds. If the issue is that we
need to add SBT or Maven to the existing one, it's a short change. We can
just have it build both of them. I wasn't aware of things breaking before
in one build but not another.

- Patrick

On Mon, Oct 12, 2015 at 9:21 AM, Sean Owen  wrote:

> Yeah, was the issue that it had to be built vs Maven to show the error
> and this uses SBT -- or vice versa? that's why the existing test
> didn't detect it. Was just thinking of adding one more of these non-PR
> builds, but I forget if there was a reason this is hard. Certainly not
> worth building for each PR.
>
> On Mon, Oct 12, 2015 at 5:16 PM, Patrick Wendell 
> wrote:
> > We already do automated compile testing for Scala 2.11 similar to Hadoop
> > versions:
> >
> > https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/
> >
> > https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/job/Spark-master-Scala211-Compile/buildTimeTrend
> >
> >
> > If you look, this build takes 7-10 minutes, so it's a nontrivial
> > increase to add it to all new PR's. Also, it's only broken once in the
> > last few months (despite many patches going in) - a pretty low failure
> > rate. For scenarios like this it's better to test it asynchronously. We
> > can even just revert a patch immediately if it's found to break 2.11.
> >
> > Put another way - we typically have 1000 patches or more per release.
> > Even at one jenkins run per patch: 7 minutes * 1000 = 7 days of developer
> > productivity loss. Compare that to having a few times where we have to
> > revert a patch and ask someone to resubmit (which maybe takes at most one
> > hour)... it's not worth it.
> >
> > - Patrick
> >
> > On Mon, Oct 12, 2015 at 8:24 AM, Sean Owen  wrote:
> >>
> >> There are many Jenkins jobs besides the pull request builder that
> >> build against various Hadoop combinations, for example, in the
> >> background. Is there an obstacle to building vs 2.11 on both Maven and
> >> SBT this way?
> >>
> >> On Mon, Oct 12, 2015 at 2:55 PM, Iulian Dragoș
> >>  wrote:
> >> > Anything that can be done by a machine should be done by a machine. I
> >> > am not sure we have enough data to say it's only once or twice per
> >> > release, and even if we were to issue a PR for each breakage, it's
> >> > additional load on committers and reviewers, not to mention our own
> >> > work. I personally don't see how 2-3 minutes of compute time per PR
> >> > can justify hours of work plus reviews.
> >
> >
>


Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-12 Thread Iulian Dragoș
On Fri, Oct 9, 2015 at 10:34 PM, Patrick Wendell  wrote:

> I would push back slightly. The reason we have the PR builds taking so
> long is death by a million small things that we add. Doing a full 2.11
> compile is order minutes... it's a nontrivial increase to the build times.
>

We can host the build if there's a way to post back a comment when the
build is broken.


>
> It doesn't seem that bad to me to go back post-hoc once in a while and fix
> 2.11 bugs when they come up. It's on the order of once or twice per release
> and the typesafe guys keep a close eye on it (thanks!). Compare that to
> literally thousands of PR runs and a few minutes every time, IMO it's not
> worth it.
>

Anything that can be done by a machine should be done by a machine. I am
not sure we have enough data to say it's only once or twice per release,
and even if we were to issue a PR for each breakage, it's additional load
on committers and reviewers, not to mention our own work. I personally
don't see how 2-3 minutes of compute time per PR can justify hours of work
plus reviews.

iulian


>
> On Fri, Oct 9, 2015 at 3:31 PM, Hari Shreedharan <
> hshreedha...@cloudera.com> wrote:
>
>> +1, much better than having a new PR each time to fix something for
>> scala-2.11 every time a patch breaks it.
>>
>> Thanks,
>> Hari Shreedharan
>>
>>
>>
>>
>> On Oct 9, 2015, at 11:47 AM, Michael Armbrust 
>> wrote:
>>
>> How about just fixing the warning? I get it; it doesn't stop this from
>>> happening again, but still seems less drastic than tossing out the
>>> whole mechanism.
>>>
>>
>> +1
>>
>> It also does not seem that expensive to test only compilation for Scala
>> 2.11 on PR builds.
>>
>>
>>
>


-- 

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com


Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-09 Thread Patrick Wendell
I would push back slightly. The reason the PR builds take so long is death
by a million small things that we add. Doing a full 2.11 compile is on the
order of minutes... it's a nontrivial increase to the build times.

It doesn't seem that bad to me to go back post-hoc once in a while and fix
2.11 bugs when they come up. It's on the order of once or twice per release,
and the Typesafe guys keep a close eye on it (thanks!). Compare that to
literally thousands of PR runs at a few minutes each; IMO it's not worth it.

On Fri, Oct 9, 2015 at 3:31 PM, Hari Shreedharan 
wrote:

> +1, much better than having a new PR each time to fix something for
> scala-2.11 every time a patch breaks it.
>
> Thanks,
> Hari Shreedharan
>
>
>
>
> On Oct 9, 2015, at 11:47 AM, Michael Armbrust 
> wrote:
>
> How about just fixing the warning? I get it; it doesn't stop this from
>> happening again, but still seems less drastic than tossing out the
>> whole mechanism.
>>
>
> +1
>
> It also does not seem that expensive to test only compilation for Scala
> 2.11 on PR builds.
>
>
>


Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-09 Thread Prashant Sharma
That is correct! I have thought about this a lot of times. The only
solution is to implement a "real" cross-build for both versions. I am going
to think more on this. :)
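
For illustration, a bare-bones sketch of what a cross-build looks like in
sbt (hypothetical settings, not Spark's actual build, which is more
involved):

  // Hypothetical sbt settings: declare both Scala versions so that
  // "+compile", "+test", etc. run the given task once per version.
  crossScalaVersions := Seq("2.10.5", "2.11.7")
  scalaVersion := crossScalaVersions.value.head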

Prashant Sharma



On Sat, Oct 10, 2015 at 2:04 AM, Patrick Wendell  wrote:

> I would push back slightly. The reason we have the PR builds taking so
> long is death by a million small things that we add. Doing a full 2.11
> compile is order minutes... it's a nontrivial increase to the build times.
>
> It doesn't seem that bad to me to go back post-hoc once in a while and fix
> 2.11 bugs when they come up. It's on the order of once or twice per release
> and the typesafe guys keep a close eye on it (thanks!). Compare that to
> literally thousands of PR runs and a few minutes every time, IMO it's not
> worth it.
>
> On Fri, Oct 9, 2015 at 3:31 PM, Hari Shreedharan <
> hshreedha...@cloudera.com> wrote:
>
>> +1, much better than having a new PR each time to fix something for
>> scala-2.11 every time a patch breaks it.
>>
>> Thanks,
>> Hari Shreedharan
>>
>>
>>
>>
>> On Oct 9, 2015, at 11:47 AM, Michael Armbrust 
>> wrote:
>>
>> How about just fixing the warning? I get it; it doesn't stop this from
>>> happening again, but still seems less drastic than tossing out the
>>> whole mechanism.
>>>
>>
>> +1
>>
>> It also does not seem that expensive to test only compilation for Scala
>> 2.11 on PR builds.
>>
>>
>>
>


Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-09 Thread Iulian Dragoș
Sorry for not being clear, yes, that's about the Sbt build and treating
warnings as errors.

Warnings in 2.11 are useful, though; it'd be a pity to keep introducing
potential issues. As a stop-gap measure I can disable them in the Sbt
build. Is it hard to run the CI test with 2.11/sbt?

iulian


On Thu, Oct 8, 2015 at 7:24 PM, Reynold Xin  wrote:

> The problem only applies to the sbt build because it treats warnings as
> errors.
>
> @Iulian - how about we disable warnings -> errors for 2.11? That would
> seem better until we switch 2.11 to be the default build.
>
>
> On Thu, Oct 8, 2015 at 7:55 AM, Ted Yu  wrote:
>
>> I tried building with Scala 2.11 on Linux with latest master branch :
>>
>> [INFO] Spark Project External MQTT  SUCCESS [ 19.188 s]
>> [INFO] Spark Project External MQTT Assembly ... SUCCESS [  7.081 s]
>> [INFO] Spark Project External ZeroMQ .. SUCCESS [  8.790 s]
>> [INFO] Spark Project External Kafka ... SUCCESS [ 14.764 s]
>> [INFO] Spark Project Examples . SUCCESS [02:22 min]
>> [INFO] Spark Project External Kafka Assembly .. SUCCESS [ 10.286 s]
>> [INFO] ------------------------------------------------------------------------
>> [INFO] BUILD SUCCESS
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Total time: 17:49 min
>>
>> FYI
>>
>> On Thu, Oct 8, 2015 at 6:50 AM, Ted Yu  wrote:
>>
>>> Interesting
>>>
>>>
>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/job/Spark-Master-Scala211-Compile/
>>> shows green builds.
>>>
>>>
>>> On Thu, Oct 8, 2015 at 6:40 AM, Iulian Dragoș <
>>> iulian.dra...@typesafe.com> wrote:
>>>
 Since Oct. 4 the build fails on 2.11 with the dreaded

 [error] /home/ubuntu/workspace/Apache Spark (master) on 
 2.11/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:310: 
 no valid targets for annotation on value conf - it is discarded unused. 
 You may specify targets with meta-annotations, e.g. @(transient @param)
 [error] private[netty] class NettyRpcEndpointRef(@transient conf: 
 SparkConf)

 Can we have the pull request builder at least build with 2.11? This
 makes #8433  pretty much
 useless, since people will continue to add useless @transient annotations.
 ​
 --

 --
 Iulian Dragos

 --
 Reactive Apps on the JVM
 www.typesafe.com


>>>
>>
>


-- 

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com


Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-09 Thread Sean Owen
How about just fixing the warning? I get it; it doesn't stop this from
happening again, but still seems less drastic than tossing out the
whole mechanism.
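
For context, the warning in question can be silenced the way the compiler
message suggests, with a meta-annotation that names the target explicitly.
A hypothetical minimal sketch (not the actual NettyRpcEnv code):

  import scala.annotation.meta.param

  // Warns in 2.11 (an error under -Xfatal-warnings): @transient on a plain
  // constructor parameter has no valid target, so scalac discards it.
  class Warns(@transient conf: String) extends Serializable {
    override def toString: String = s"conf=$conf"
  }

  // Meta-annotated form suggested by the compiler: the annotation is given
  // an explicit target and the warning goes away.
  class Clean(@(transient @param) conf: String) extends Serializable {
    override def toString: String = s"conf=$conf"
  }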

On Fri, Oct 9, 2015 at 3:18 PM, Iulian Dragoș
 wrote:
> Sorry for not being clear, yes, that's about the Sbt build and treating
> warnings as errors.
>
> Warnings in 2.11 are useful, though, it'd be a pity to keep introducing
> potential issues. As a stop-gap measure I can disable them in the Sbt build,
> is it hard to run the CI test with 2.11/sbt?
>
> iulian
>




Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-09 Thread Michael Armbrust
>
> How about just fixing the warning? I get it; it doesn't stop this from
> happening again, but still seems less drastic than tossing out the
> whole mechanism.
>

+1

It also does not seem that expensive to test only compilation for Scala
2.11 on PR builds.


Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-09 Thread Hari Shreedharan
+1, much better than having a new PR to fix something for scala-2.11 every
time a patch breaks it.

Thanks,
Hari Shreedharan




> On Oct 9, 2015, at 11:47 AM, Michael Armbrust  wrote:
> 
> How about just fixing the warning? I get it; it doesn't stop this from
> happening again, but still seems less drastic than tossing out the
> whole mechanism.
> 
> +1
> 
> It also does not seem that expensive to test only compilation for Scala 2.11 
> on PR builds. 



Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-08 Thread Ted Yu
Interesting

https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/job/Spark-Master-Scala211-Compile/
shows green builds.


On Thu, Oct 8, 2015 at 6:40 AM, Iulian Dragoș 
wrote:

> Since Oct. 4 the build fails on 2.11 with the dreaded
>
> [error] /home/ubuntu/workspace/Apache Spark (master) on 
> 2.11/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:310: no 
> valid targets for annotation on value conf - it is discarded unused. You may 
> specify targets with meta-annotations, e.g. @(transient @param)
> [error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
>
> Can we have the pull request builder at least build with 2.11? This makes
> #8433  pretty much useless,
> since people will continue to add useless @transient annotations.
> ​
> --
>
> --
> Iulian Dragos
>
> --
> Reactive Apps on the JVM
> www.typesafe.com
>
>


Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-08 Thread Ted Yu
I tried building with Scala 2.11 on Linux with latest master branch :

[INFO] Spark Project External MQTT  SUCCESS [ 19.188 s]
[INFO] Spark Project External MQTT Assembly ... SUCCESS [  7.081 s]
[INFO] Spark Project External ZeroMQ .. SUCCESS [  8.790 s]
[INFO] Spark Project External Kafka ... SUCCESS [ 14.764 s]
[INFO] Spark Project Examples . SUCCESS [02:22 min]
[INFO] Spark Project External Kafka Assembly .. SUCCESS [ 10.286 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:49 min

FYI

On Thu, Oct 8, 2015 at 6:50 AM, Ted Yu  wrote:

> Interesting
>
>
> https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/job/Spark-Master-Scala211-Compile/
> shows green builds.
>
>
> On Thu, Oct 8, 2015 at 6:40 AM, Iulian Dragoș 
> wrote:
>
>> Since Oct. 4 the build fails on 2.11 with the dreaded
>>
>> [error] /home/ubuntu/workspace/Apache Spark (master) on 
>> 2.11/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:310: 
>> no valid targets for annotation on value conf - it is discarded unused. You 
>> may specify targets with meta-annotations, e.g. @(transient @param)
>> [error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
>>
>> Can we have the pull request builder at least build with 2.11? This makes
>> #8433  pretty much useless,
>> since people will continue to add useless @transient annotations.
>> ​
>> --
>>
>> --
>> Iulian Dragos
>>
>> --
>> Reactive Apps on the JVM
>> www.typesafe.com
>>
>>
>


Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-08 Thread Reynold Xin
The problem only applies to the sbt build because it treats warnings as
errors.

@Iulian - how about we disable warnings -> errors for 2.11? That would seem
better until we switch 2.11 to be the default build.
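
A rough sketch of what that could look like in the sbt build, assuming
-Xfatal-warnings is the flag being scoped (illustrative only, not Spark's
actual SparkBuild.scala):

  // Hypothetical setting: keep warnings-as-errors everywhere except 2.11.
  scalacOptions := {
    val base = Seq("-deprecation", "-feature", "-Xfatal-warnings")
    CrossVersion.partialVersion(scalaVersion.value) match {
      case Some((2, 11)) => base.filterNot(_ == "-Xfatal-warnings")
      case _             => base
    }
  }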


On Thu, Oct 8, 2015 at 7:55 AM, Ted Yu  wrote:

> I tried building with Scala 2.11 on Linux with latest master branch :
>
> [INFO] Spark Project External MQTT  SUCCESS [ 19.188 s]
> [INFO] Spark Project External MQTT Assembly ... SUCCESS [  7.081 s]
> [INFO] Spark Project External ZeroMQ .. SUCCESS [  8.790 s]
> [INFO] Spark Project External Kafka ... SUCCESS [ 14.764 s]
> [INFO] Spark Project Examples . SUCCESS [02:22 min]
> [INFO] Spark Project External Kafka Assembly .. SUCCESS [ 10.286 s]
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 17:49 min
>
> FYI
>
> On Thu, Oct 8, 2015 at 6:50 AM, Ted Yu  wrote:
>
>> Interesting
>>
>>
>> https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/job/Spark-Master-Scala211-Compile/
>> shows green builds.
>>
>>
>> On Thu, Oct 8, 2015 at 6:40 AM, Iulian Dragoș  wrote:
>>
>>> Since Oct. 4 the build fails on 2.11 with the dreaded
>>>
>>> [error] /home/ubuntu/workspace/Apache Spark (master) on 
>>> 2.11/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:310: 
>>> no valid targets for annotation on value conf - it is discarded unused. You 
>>> may specify targets with meta-annotations, e.g. @(transient @param)
>>> [error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
>>>
>>> Can we have the pull request builder at least build with 2.11? This
>>> makes #8433  pretty much
>>> useless, since people will continue to add useless @transient annotations.
>>> ​
>>> --
>>>
>>> --
>>> Iulian Dragos
>>>
>>> --
>>> Reactive Apps on the JVM
>>> www.typesafe.com
>>>
>>>
>>
>