See <https://builds.apache.org/job/Geode-spark-connector/80/>
------------------------------------------
[...truncated 1883 lines...]
16/09/26 15:21:01 INFO HttpFileServer: HTTP File server directory is /tmp/spark-7552e30c-bb8f-45c7-be85-d259f3b7fd0c/httpd-6a47a70a-a5c7-4753-a07d-ed3f38c82625
16/09/26 15:21:01 INFO HttpServer: Starting HTTP Server
16/09/26 15:21:01 INFO Utils: Successfully started service 'HTTP file server' on port 36213.
16/09/26 15:21:01 INFO SparkEnv: Registering OutputCommitCoordinator
16/09/26 15:21:06 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
16/09/26 15:21:11 INFO Utils: Successfully started service 'SparkUI' on port 4041.
16/09/26 15:21:11 INFO SparkUI: Started SparkUI at http://localhost:4041
16/09/26 15:21:11 INFO Executor: Starting executor ID <driver> on host localhost
16/09/26 15:21:11 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:46093/user/HeartbeatReceiver
16/09/26 15:21:11 INFO NettyBlockTransferService: Server created on 46091
16/09/26 15:21:11 INFO BlockManagerMaster: Trying to register BlockManager
16/09/26 15:21:11 INFO BlockManagerMasterActor: Registering block manager localhost:46091 with 2.8 GB RAM, BlockManagerId(<driver>, localhost, 46091)
16/09/26 15:21:11 INFO BlockManagerMaster: Registered BlockManager
[info] RetrieveRegionIntegrationTest:
=== GeodeRunner: stop server 1.
=== GeodeRunner: stop server 2.
......
=== GeodeRunner: stop locator ...
Successfully stop Geode locator at port 20287.
=== GeodeRunner: starting locator on port 24079
=== GeodeRunner: waiting for locator on port 24079
....=== GeodeRunner: done waiting for locator on port 24079
=== GeodeRunner: starting server1 with clientPort 29698
=== GeodeRunner: starting server2 with clientPort 21056
=== GeodeRunner: starting server3 with clientPort 29845
=== GeodeRunner: starting server4 with clientPort 26222
...
............................................
Locator in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator on hemera.apache.org[24079] as locator is currently online.
Process ID: 30902
Uptime: 4 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator/locator.log
JVM Arguments: -Dgemfire.enable-cluster-configuration=true -Dgemfire.load-cluster-configuration-from-dir=false -Dgemfire.jmx-manager-http-port=24854 -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

Successfully connected to: JMX Manager [host=hemera.apache.org, port=1099]
Cluster configuration service is up and running.
....................

Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server4 on hemera.apache.org[26222] as server4 is currently online.
Process ID: 31234
Uptime: 9 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server4/server4.log
JVM Arguments: -Dgemfire.locators=localhost[24079] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1 on hemera.apache.org[29698] as server1 is currently online.
Process ID: 31267
Uptime: 9 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1/server1.log
JVM Arguments: -Dgemfire.locators=localhost[24079] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2 on hemera.apache.org[21056] as server2 is currently online.
Process ID: 31302
Uptime: 9 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2/server2.log
JVM Arguments: -Dgemfire.locators=localhost[24079] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server3 on hemera.apache.org[29845] as server3 is currently online.
Process ID: 31358
Uptime: 9 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server3/server3.log
JVM Arguments: -Dgemfire.locators=localhost[24079] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

All WrappedArray(29698, 21056, 29845, 26222).length servers have been started
Deploying:geode-functions_2.10-0.5.0.jar
16/09/26 15:21:38 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243).
The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
ittest.org.apache.geode.spark.connector.JavaApiIntegrationTest.setUpBeforeClass(JavaApiIntegrationTest.java:75)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:497)
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
org.junit.runners.ParentRunner.run(ParentRunner.java:309)
org.junit.runners.Suite.runChild(Suite.java:127)
org.junit.runners.Suite.runChild(Suite.java:26)
org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
org.junit.runners.ParentRunner.run(ParentRunner.java:309)
[info] Exception encountered when attempting to run a suite with class name: ittest.org.apache.geode.spark.connector.RetrieveRegionIntegrationTest *** ABORTED ***
[info]   org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
[info] org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:50)
[info] org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:30)
[info] org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.run(RDDJoinRegionIntegrationTest.scala:30)
[info] org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
[info] org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:294)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:284)
[info] java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] java.lang.Thread.run(Thread.java:745)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1811)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1807)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1807)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1794)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1794)
[info]   at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:1833)
[info]   at org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
[info]   at ittest.org.apache.geode.spark.connector.RetrieveRegionIntegrationTest.beforeAll(RetrieveRegionIntegrationTest.scala:51)
[info]   ...
[info] BasicIntegrationTest:
=== GeodeRunner: stop server 1.
=== GeodeRunner: stop server 2.
=== GeodeRunner: stop server 3.
=== GeodeRunner: stop server 4.
............
=== GeodeRunner: stop locator ...
Successfully stop Geode locator at port 24079.
=== GeodeRunner: starting locator on port 21153
=== GeodeRunner: waiting for locator on port 21153
....=== GeodeRunner: done waiting for locator on port 21153
=== GeodeRunner: starting server1 with clientPort 23625
=== GeodeRunner: starting server2 with clientPort 21090
...
..................
Locator in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator on hemera.apache.org[21153] as locator is currently online.
Process ID: 32554
Uptime: 4 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator/locator.log
JVM Arguments: -Dgemfire.enable-cluster-configuration=true -Dgemfire.load-cluster-configuration-from-dir=false -Dgemfire.jmx-manager-http-port=28144 -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

Successfully connected to: JMX Manager [host=hemera.apache.org, port=1099]
Cluster configuration service is up and running.
........

Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2 on hemera.apache.org[21090] as server2 is currently online.
Process ID: 303
Uptime: 7 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2/server2.log
JVM Arguments: -Dgemfire.locators=localhost[21153] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1 on hemera.apache.org[23625] as server1 is currently online.
Process ID: 300
Uptime: 7 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1/server1.log
JVM Arguments: -Dgemfire.locators=localhost[21153] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

All WrappedArray(23625, 21090).length servers have been started
Deploying:geode-functions_2.10-0.5.0.jar
16/09/26 15:22:04 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243).
The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
ittest.org.apache.geode.spark.connector.JavaApiIntegrationTest.setUpBeforeClass(JavaApiIntegrationTest.java:75)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:497)
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
org.junit.runners.ParentRunner.run(ParentRunner.java:309)
org.junit.runners.Suite.runChild(Suite.java:127)
org.junit.runners.Suite.runChild(Suite.java:26)
org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
org.junit.runners.ParentRunner.run(ParentRunner.java:309)
[info] Exception encountered when attempting to run a suite with class name: ittest.org.apache.geode.spark.connector.BasicIntegrationTest *** ABORTED ***
[info]   org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
[info] org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:50)
[info] org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:30)
[info] org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.run(RDDJoinRegionIntegrationTest.scala:30)
[info] org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
[info] org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:294)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:284)
[info] java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] java.lang.Thread.run(Thread.java:745)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1811)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1807)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1807)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1794)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1794)
[info]   at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:1833)
[info]   at org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
[info]   at ittest.org.apache.geode.spark.connector.BasicIntegrationTest.beforeAll(BasicIntegrationTest.scala:58)
[info]   ...
[info] ScalaTest
[info] Run completed in 1 minute, 58 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 1, aborted 3
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] *** 3 SUITES ABORTED ***
[error] Error: Total 3, Failed 0, Errors 3, Passed 0
[error] Error during tests:
[error]   ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest
[error]   ittest.org.apache.geode.spark.connector.RetrieveRegionIntegrationTest
[error]   ittest.org.apache.geode.spark.connector.BasicIntegrationTest
[error] (geode-spark-connector/it:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 128 s, completed Sep 26, 2016 3:22:05 PM
Build step 'Execute shell' marked build as failure
Recording test results
Skipped archiving because build is not successful
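All three aborted suites fail the same way: a SparkContext created by an earlier suite (the stack traces point at JavaApiIntegrationTest.setUpBeforeClass and RDDJoinRegionIntegrationTest.beforeAll) is still running when the next suite's beforeAll tries to create its own. A minimal sketch of the two ways out, assuming a ScalaTest BeforeAndAfterAll suite like those in the traces; the suite name, app name, and master here are illustrative, not the connector's actual code:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Hypothetical suite illustrating SparkContext lifecycle in tests.
class ExampleIntegrationTest extends FunSuite with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    val conf = new SparkConf()
      .setAppName("example-it")
      .setMaster("local[2]")
      // Workaround quoted by the error message itself (SPARK-2243);
      // this masks the collision rather than fixing it:
      .set("spark.driver.allowMultipleContexts", "true")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    // The actual fix: stop the context so the next suite's beforeAll
    // does not trip "Only one SparkContext may be running in this JVM".
    if (sc != null) sc.stop()
  }

  test("context is usable") {
    assert(sc.parallelize(1 to 4).count() == 4)
  }
}
```

Since sbt forks all three Scala suites plus the JUnit suite into one test JVM here, stopping each suite's context in afterAll (or sharing a single context across suites) is the more robust choice; allowMultipleContexts merely suppresses the assertion.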