Pardon. During an earlier test run, I got:

StreamingContextSuite:
- from no conf constructor
- from no conf + spark home
- from no conf + spark home + env
- from conf with settings
- from existing SparkContext
- from existing SparkContext with settings
*** RUN ABORTED ***
  java.lang.NoSuchMethodError: org.apache.spark.ui.JettyUtils$.createStaticHandler(Ljava/lang/String;Ljava/lang/String;)Lorg/eclipse/jetty/servlet/ServletContextHandler;
  at org.apache.spark.streaming.ui.StreamingTab.attach(StreamingTab.scala:49)
  at org.apache.spark.streaming.StreamingContext$$anonfun$start$2.apply(StreamingContext.scala:601)
  at org.apache.spark.streaming.StreamingContext$$anonfun$start$2.apply(StreamingContext.scala:601)
  at scala.Option.foreach(Option.scala:236)
  at org.apache.spark.streaming.StreamingContext.start(StreamingContext.scala:601)
  at org.apache.spark.streaming.StreamingContextSuite$$anonfun$8.apply$mcV$sp(StreamingContextSuite.scala:101)
  at org.apache.spark.streaming.StreamingContextSuite$$anonfun$8.apply(StreamingContextSuite.scala:96)
  at org.apache.spark.streaming.StreamingContextSuite$$anonfun$8.apply(StreamingContextSuite.scala:96)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
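A NoSuchMethodError like this usually means the streaming test classes were compiled against a different spark-core build than the one on the runtime classpath, i.e. the createStaticHandler signature they expect no longer exists at runtime. Below is a minimal diagnostic sketch, assuming only that the Spark jars are on the classpath; the SignatureCheck object is hypothetical, and only the class name comes from the stack trace above:

object SignatureCheck {
  def main(args: Array[String]): Unit = {
    // List every overload of createStaticHandler visible at runtime;
    // if none matches the descriptor in the NoSuchMethodError above,
    // the test classes were built against a different spark-core jar,
    // and a clean rebuild of the affected modules is the usual fix.
    Class.forName("org.apache.spark.ui.JettyUtils$")
      .getMethods
      .filter(_.getName == "createStaticHandler")
      .foreach(println)
  }
}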
The error from the previous email was due to the absence of StreamingContextSuite.scala (a minimal illustration of this follows the quoted thread below).

On Fri, Jun 26, 2015 at 1:27 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> I got the following when running the test suite:
>
> [INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
> [info] Compiling 2 Scala sources and 1 Java source to /home/hbase/spark-1.4.1/streaming/target/scala-2.10/test-classes...
> [error] /home/hbase/spark-1.4.1/streaming/src/test/scala/org/apache/spark/streaming/DStreamClosureSuite.scala:82: not found: type TestException
> [error]       throw new TestException(
> [error]                 ^
> [error] /home/hbase/spark-1.4.1/streaming/src/test/scala/org/apache/spark/streaming/scheduler/JobGeneratorSuite.scala:73: not found: type TestReceiver
> [error]     val inputStream = ssc.receiverStream(new TestReceiver)
> [error]                                               ^
> [error] two errors found
> [error] Compile failed at Jun 25, 2015 5:12:24 PM [1.492s]
>
> Has anyone else seen a similar error?
>
> Thanks
>
> On Tue, Jun 23, 2015 at 10:37 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version 1.4.1!
>>
>> This release fixes a handful of known issues in Spark 1.4.0, listed here:
>> http://s.apache.org/spark-1.4.1
>>
>> The tag to be voted on is v1.4.1-rc1 (commit 60e08e5):
>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=60e08e50751fe3929156de956d62faea79f5b801
>>
>> The release files, including signatures, digests, etc. can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.1-rc1-bin/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>>
>> The staging repository for this release can be found at:
>> [published as version: 1.4.1]
>> https://repository.apache.org/content/repositories/orgapachespark-1118/
>> [published as version: 1.4.1-rc1]
>> https://repository.apache.org/content/repositories/orgapachespark-1119/
>>
>> The documentation corresponding to this release can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.1-rc1-docs/
>>
>> Please vote on releasing this package as Apache Spark 1.4.1!
>>
>> The vote is open until Saturday, June 27, at 06:32 UTC and passes
>> if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 1.4.1
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see
>> http://spark.apache.org/
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>> For additional commands, e-mail: dev-h...@spark.apache.org
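As noted at the top: DStreamClosureSuite and JobGeneratorSuite reference helper types that live in StreamingContextSuite.scala, so compiling the streaming tests without that file produces exactly the "not found: type" errors quoted above. The stand-ins below are a sketch against the Spark 1.4 receiver API, for illustration only, not the actual definitions from the suite:

import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Stand-in for the helper exception the closure tests throw.
class TestException(msg: String) extends Exception(msg)

// Stand-in receiver: a no-op body is enough for the suites to compile.
class TestReceiver extends Receiver[Int](StorageLevel.MEMORY_ONLY) {
  def onStart(): Unit = ()
  def onStop(): Unit = ()
}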