Re: SBT / PR builder builds failing on "include an external JAR in SparkR"

2017-06-12 Thread Felix Cheung
Facepalm

I broke them. I was making changes to test files, and since I was only touching R files, Jenkins ran only the R tests, and everything passed there.

The fix is to change

Seq(sparkHome, "R", "pkg", "inst", "tests",

to

Seq(sparkHome, "R", "pkg", "tests", "fulltests",

There are 2 instances of this.

I'm AFK right now and will push a fix as soon as I can. Sorry for the miss.
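For clarity, a minimal sketch of the corrected path construction: the variable names (`sparkHome`, `rScriptDir`) follow those visible in the SparkSubmitSuite failure below, but the surrounding suite code is assumed, not quoted from the actual source.

```scala
import java.io.File

// Sketch: the R test files moved from R/pkg/inst/tests to
// R/pkg/tests/fulltests, so the suite must join the new path segments.
object RScriptPath {
  def rScriptDir(sparkHome: String): String =
    Seq(sparkHome, "R", "pkg", "tests", "fulltests").mkString(File.separator)
}

// The failing assertion at SparkSubmitSuite.scala:531 checked that this
// directory exists, roughly:
//   assert(new File(rScriptDir).exists())
```

With the old `"inst", "tests"` segments, `rScriptDir` pointed at a directory that no longer exists after the test-file move, which is why `new java.io.File(rScriptDir).exists()` returned false.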

_
From: Sean Owen
Sent: Monday, June 12, 2017 5:56 AM
Subject: SBT / PR builder builds failing on "include an external JAR in SparkR"
To: dev


I noticed the PR builder builds are all failing with:

[info] - correctly builds R packages included in a jar with --packages !!! 
IGNORED !!!
[info] - include an external JAR in SparkR *** FAILED *** (32 milliseconds)
[info]   new java.io.File(rScriptDir).exists() was false 
(SparkSubmitSuite.scala:531)
[info]   org.scalatest.exceptions.TestFailedException:
[info]   at 
org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:500)
[info]   at 
org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1555)
[info]   at 
org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:466)
[info]   at 
org.apache.spark.deploy.SparkSubmitSuite$$anonfun$23.apply$mcV$sp(SparkSubmitSuite.scala:531)
...

It seems to affect only the SBT builds; the Maven builds show this test is 
cancelled because R isn't installed:

- correctly builds R packages included in a jar with --packages !!! IGNORED !!!
- include an external JAR in SparkR !!! CANCELED !!!
  org.apache.spark.api.r.RUtils.isSparkRInstalled was false SparkR is not 
installed in this build. (SparkSubmitSuite.scala:528)

It seems to have started after:

https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/3081/

but I don't see how those changes relate.

Did anything happen to change w.r.t. R tests or the env in the last day?




