[ https://issues.apache.org/jira/browse/SPARK-11110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14957984#comment-14957984 ]
Apache Spark commented on SPARK-11110:
--------------------------------------

User 'jodersky' has created a pull request for this issue:
https://github.com/apache/spark/pull/9126

> Scala 2.11 build fails due to compiler errors
> ---------------------------------------------
>
>            Key: SPARK-11110
>            URL: https://issues.apache.org/jira/browse/SPARK-11110
>        Project: Spark
>     Issue Type: Bug
>     Components: Build
>       Reporter: Patrick Wendell
>       Assignee: Jakob Odersky
>       Priority: Critical
>
> Right now the 2.11 build is failing due to compiler errors in SBT (though not in Maven). I have updated our 2.11 compile test harness to catch this.
> https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/job/Spark-Master-Scala211-Compile/1667/consoleFull
> {code}
> [error] /home/jenkins/workspace/Spark-Master-Scala211-Compile/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:308: no valid targets for annotation on value conf - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
> [error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
> [error]
> {code}
> This is one error, but there may be others past this point (the compile fails fast).

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
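For reference, the meta-annotation form the compiler message suggests can be sketched in isolation. This is a minimal illustration, not Spark's actual fix: the class and field names below are hypothetical stand-ins for `NettyRpcEndpointRef` and its `conf` parameter.

```scala
import scala.annotation.meta.param

// Illustrative stand-in for the failing pattern (names are hypothetical).
// Bare @transient on a plain constructor parameter has no field to attach
// to, which Scala 2.11 reports as "no valid targets for annotation on
// value conf". The meta-annotated form @(transient @param) pins the
// annotation to the parameter itself, so the code compiles cleanly even
// with warnings promoted to errors.
class EndpointRef(@(transient @param) conf: String) extends Serializable {
  // Value derived from the parameter at construction time.
  val name: String = conf.toUpperCase
}

object Demo extends App {
  println(new EndpointRef("spark").name) // prints "SPARK"
}
```

The same pattern generalizes via `scala.annotation.meta.field`, `getter`, and `setter` when the annotation should target a generated field or accessor instead of the constructor parameter.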