Sorry for not being clear: yes, this is about the sbt build and treating warnings as errors.
Warnings in 2.11 are useful, though; it'd be a pity to keep introducing potential issues. As a stop-gap measure I can disable them in the sbt build. Is it hard to run the CI test with 2.11/sbt?

iulian

On Thu, Oct 8, 2015 at 7:24 PM, Reynold Xin <r...@databricks.com> wrote:

> The problem only applies to the sbt build because it treats warnings as
> errors.
>
> @Iulian - how about we disable warnings -> errors for 2.11? That would
> seem better until we switch 2.11 to be the default build.
>
> On Thu, Oct 8, 2015 at 7:55 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> I tried building with Scala 2.11 on Linux with the latest master branch:
>>
>> [INFO] Spark Project External MQTT ........................ SUCCESS [ 19.188 s]
>> [INFO] Spark Project External MQTT Assembly ............... SUCCESS [  7.081 s]
>> [INFO] Spark Project External ZeroMQ ...................... SUCCESS [  8.790 s]
>> [INFO] Spark Project External Kafka ....................... SUCCESS [ 14.764 s]
>> [INFO] Spark Project Examples ............................. SUCCESS [02:22 min]
>> [INFO] Spark Project External Kafka Assembly .............. SUCCESS [ 10.286 s]
>> [INFO] ------------------------------------------------------------------------
>> [INFO] BUILD SUCCESS
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Total time: 17:49 min
>>
>> FYI
>>
>> On Thu, Oct 8, 2015 at 6:50 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> Interesting
>>>
>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/job/Spark-Master-Scala211-Compile/
>>> shows green builds.
>>>
>>> On Thu, Oct 8, 2015 at 6:40 AM, Iulian Dragoș <iulian.dra...@typesafe.com> wrote:
>>>
>>>> Since Oct. 4 the build fails on 2.11 with the dreaded
>>>>
>>>> [error] /home/ubuntu/workspace/Apache Spark (master) on 2.11/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:310: no valid targets for annotation on value conf - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
>>>> [error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
>>>>
>>>> Can we have the pull request builder at least build with 2.11? This
>>>> makes #8433 <https://github.com/apache/spark/pull/8433> pretty much
>>>> useless, since people will continue to add useless @transient annotations.
>>>>
>>>> --
>>>> Iulian Dragos
>>>>
>>>> ------
>>>> Reactive Apps on the JVM
>>>> www.typesafe.com

--
Iulian Dragos

------
Reactive Apps on the JVM
www.typesafe.com
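[Editor's note] The compiler error quoted in the thread can be sketched in isolation as below; `Conf` and `EndpointRef` are hypothetical stand-ins for Spark's `SparkConf` and `NettyRpcEndpointRef`, used only to illustrate the annotation-target issue and the meta-annotation form the compiler itself suggests:

```scala
import scala.annotation.meta.param

// Hypothetical stand-in for SparkConf.
class Conf(val appName: String) extends Serializable

// Before (warns under 2.11, fatal when warnings are treated as errors):
//   class EndpointRef(@transient conf: Conf)
// Here `conf` is a plain constructor parameter that never becomes a field,
// so @transient has no valid target and would be silently discarded.

// After: the meta-annotation explicitly targets the parameter. (Simply
// dropping @transient also works, since a parameter that never becomes a
// field is not serialized anyway.)
class EndpointRef(@(transient @param) conf: Conf) extends Serializable {
  val label: String = conf.appName // conf is used at construction time only
}
```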
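[Editor's note] Disabling warnings-as-errors only for 2.11, as Reynold suggests above, could look roughly like the following in an sbt build definition. This is a sketch, not a quote from Spark's build: the exact option list there is an assumption, though `-Xfatal-warnings` is the standard scalac flag for promoting warnings to errors.

```scala
// Hypothetical build.sbt fragment: keep -Xfatal-warnings on the default
// build, but drop it when compiling against Scala 2.11.
scalacOptions ++= {
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 11)) => Seq("-deprecation", "-unchecked")
    case _             => Seq("-deprecation", "-unchecked", "-Xfatal-warnings")
  }
}
```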