See <https://builds.apache.org/job/Geode-spark-connector/80/>

------------------------------------------
[...truncated 1883 lines...]
16/09/26 15:21:01 INFO HttpFileServer: HTTP File server directory is /tmp/spark-7552e30c-bb8f-45c7-be85-d259f3b7fd0c/httpd-6a47a70a-a5c7-4753-a07d-ed3f38c82625
16/09/26 15:21:01 INFO HttpServer: Starting HTTP Server
16/09/26 15:21:01 INFO Utils: Successfully started service 'HTTP file server' on port 36213.
16/09/26 15:21:01 INFO SparkEnv: Registering OutputCommitCoordinator
16/09/26 15:21:06 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
16/09/26 15:21:11 INFO Utils: Successfully started service 'SparkUI' on port 4041.
16/09/26 15:21:11 INFO SparkUI: Started SparkUI at http://localhost:4041
16/09/26 15:21:11 INFO Executor: Starting executor ID <driver> on host localhost
16/09/26 15:21:11 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:46093/user/HeartbeatReceiver
16/09/26 15:21:11 INFO NettyBlockTransferService: Server created on 46091
16/09/26 15:21:11 INFO BlockManagerMaster: Trying to register BlockManager
16/09/26 15:21:11 INFO BlockManagerMasterActor: Registering block manager localhost:46091 with 2.8 GB RAM, BlockManagerId(<driver>, localhost, 46091)
16/09/26 15:21:11 INFO BlockManagerMaster: Registered BlockManager
[info] RetrieveRegionIntegrationTest:
=== GeodeRunner: stop server 1.
=== GeodeRunner: stop server 2.
......

=== GeodeRunner: stop locator
...
Successfully stop Geode locator at port 20287.
=== GeodeRunner: starting locator on port 24079
=== GeodeRunner: waiting for locator on port 24079
....=== GeodeRunner: done waiting for locator on port 24079
=== GeodeRunner: starting server1 with clientPort 29698
=== GeodeRunner: starting server2 with clientPort 21056
=== GeodeRunner: starting server3 with clientPort 29845
=== GeodeRunner: starting server4 with clientPort 26222
...
............................................Locator in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator on hemera.apache.org[24079] as locator is currently online.
Process ID: 30902
Uptime: 4 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator/locator.log
JVM Arguments: -Dgemfire.enable-cluster-configuration=true -Dgemfire.load-cluster-configuration-from-dir=false -Dgemfire.jmx-manager-http-port=24854 -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

Successfully connected to: JMX Manager [host=hemera.apache.org, port=1099]

Cluster configuration service is up and running.

....................
Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server4 on hemera.apache.org[26222] as server4 is currently online.
Process ID: 31234
Uptime: 9 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server4/server4.log
JVM Arguments: -Dgemfire.locators=localhost[24079] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar


Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1 on hemera.apache.org[29698] as server1 is currently online.
Process ID: 31267
Uptime: 9 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1/server1.log
JVM Arguments: -Dgemfire.locators=localhost[24079] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar


Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2 on hemera.apache.org[21056] as server2 is currently online.
Process ID: 31302
Uptime: 9 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2/server2.log
JVM Arguments: -Dgemfire.locators=localhost[24079] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar


Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server3 on hemera.apache.org[29845] as server3 is currently online.
Process ID: 31358
Uptime: 9 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server3/server3.log
JVM Arguments: -Dgemfire.locators=localhost[24079] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

All WrappedArray(29698, 21056, 29845, 26222).length servers have been started
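The message above is printed verbatim by the test harness: it reads "All WrappedArray(...).length servers have been started", which suggests the harness concatenates the message without a Scala string interpolator, so the `.length` expression ends up inside the literal instead of being evaluated. A minimal sketch of the likely fix (the `ports` value here is taken from the log; the message construction is an assumption about the harness code):

```scala
object StartupMessage {
  def main(args: Array[String]): Unit = {
    val ports = Seq(29698, 21056, 29845, 26222)
    // Buggy shape: "All " + ports + ".length servers..." stringifies the
    // whole collection and leaves ".length" as literal text.
    // Fixed shape: the s-interpolator evaluates ${ports.length}.
    println(s"All ${ports.length} servers have been started")
  }
}
```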
Deploying:geode-functions_2.10-0.5.0.jar
16/09/26 15:21:38 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
ittest.org.apache.geode.spark.connector.JavaApiIntegrationTest.setUpBeforeClass(JavaApiIntegrationTest.java:75)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:497)
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
org.junit.runners.ParentRunner.run(ParentRunner.java:309)
org.junit.runners.Suite.runChild(Suite.java:127)
org.junit.runners.Suite.runChild(Suite.java:26)
org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
org.junit.runners.ParentRunner.run(ParentRunner.java:309)
[info] Exception encountered when attempting to run a suite with class name: ittest.org.apache.geode.spark.connector.RetrieveRegionIntegrationTest *** ABORTED ***
[info]   org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
[info] org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:50)
[info] org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:30)
[info] org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.run(RDDJoinRegionIntegrationTest.scala:30)
[info] org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
[info] org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:294)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:284)
[info] java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] java.lang.Thread.run(Thread.java:745)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1811)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1807)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1807)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1794)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1794)
[info]   at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:1833)
[info]   at org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
[info]   at ittest.org.apache.geode.spark.connector.RetrieveRegionIntegrationTest.beforeAll(RetrieveRegionIntegrationTest.scala:51)
[info]   ...
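The abort above is Spark's one-context-per-JVM check: the trace shows the live SparkContext was created in RDDJoinRegionIntegrationTest's beforeAll and was still registered when this suite's beforeAll ran. One conventional remedy is to create the context once and share it across all suites in the forked JVM. This is only a sketch; `SharedSparkContext` is a hypothetical helper, not part of the connector, and the master/app-name settings are assumptions:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical helper: one lazily created SparkContext shared by every
// integration suite in this JVM, so a second beforeAll never trips
// SparkContext's assertNoOtherContextIsRunning check.
object SharedSparkContext {
  lazy val sc: SparkContext = new SparkContext(
    new SparkConf()
      .setMaster("local[2]")
      .setAppName("geode-connector-it")
  )
}
```

The error message also names `spark.driver.allowMultipleContexts = true` as an escape hatch, but that setting only suppresses the check; it does not make two live contexts in one JVM safe.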
[info] BasicIntegrationTest:
=== GeodeRunner: stop server 1.
=== GeodeRunner: stop server 2.
=== GeodeRunner: stop server 3.
=== GeodeRunner: stop server 4.
............



=== GeodeRunner: stop locator
...
Successfully stop Geode locator at port 24079.
=== GeodeRunner: starting locator on port 21153
=== GeodeRunner: waiting for locator on port 21153
....=== GeodeRunner: done waiting for locator on port 21153
=== GeodeRunner: starting server1 with clientPort 23625
=== GeodeRunner: starting server2 with clientPort 21090
...
..................Locator in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator on hemera.apache.org[21153] as locator is currently online.
Process ID: 32554
Uptime: 4 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator/locator.log
JVM Arguments: -Dgemfire.enable-cluster-configuration=true -Dgemfire.load-cluster-configuration-from-dir=false -Dgemfire.jmx-manager-http-port=28144 -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

Successfully connected to: JMX Manager [host=hemera.apache.org, port=1099]

Cluster configuration service is up and running.

........
Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2 on hemera.apache.org[21090] as server2 is currently online.
Process ID: 303
Uptime: 7 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2/server2.log
JVM Arguments: -Dgemfire.locators=localhost[21153] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar


Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1 on hemera.apache.org[23625] as server1 is currently online.
Process ID: 300
Uptime: 7 seconds
GemFire Version: 1.0.0-incubating-SNAPSHOT
Java Version: 1.8.0_66
Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1/server1.log
JVM Arguments: -Dgemfire.locators=localhost[21153] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar

All WrappedArray(23625, 21090).length servers have been started
Deploying:geode-functions_2.10-0.5.0.jar
16/09/26 15:22:04 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
ittest.org.apache.geode.spark.connector.JavaApiIntegrationTest.setUpBeforeClass(JavaApiIntegrationTest.java:75)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:497)
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
org.junit.runners.ParentRunner.run(ParentRunner.java:309)
org.junit.runners.Suite.runChild(Suite.java:127)
org.junit.runners.Suite.runChild(Suite.java:26)
org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
org.junit.runners.ParentRunner.run(ParentRunner.java:309)
[info] Exception encountered when attempting to run a suite with class name: ittest.org.apache.geode.spark.connector.BasicIntegrationTest *** ABORTED ***
[info]   org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
[info] org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:50)
[info] org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:30)
[info] org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
[info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.run(RDDJoinRegionIntegrationTest.scala:30)
[info] org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
[info] org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:294)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:284)
[info] java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] java.lang.Thread.run(Thread.java:745)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1811)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1807)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1807)
[info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1794)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1794)
[info]   at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:1833)
[info]   at org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
[info]   at ittest.org.apache.geode.spark.connector.BasicIntegrationTest.beforeAll(BasicIntegrationTest.scala:58)
[info]   ...
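The complementary fix to sharing a context is hygiene in the suite that owns one: stop the SparkContext in afterAll so the JVM is clean before the next suite's beforeAll runs. A sketch against ScalaTest's BeforeAndAfterAll (the suite name, field name, and conf settings here are illustrative assumptions, not the connector's actual test code):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

class ExampleIntegrationTest extends FunSuite with BeforeAndAfterAll {
  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("example-it"))
  }

  override def afterAll(): Unit = {
    // Releasing the context here is what keeps the next suite's
    // beforeAll from hitting the one-context-per-JVM check.
    if (sc != null) sc.stop()
  }
}
```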
[info] ScalaTest
[info] Run completed in 1 minute, 58 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 1, aborted 3
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] *** 3 SUITES ABORTED ***
[error] Error: Total 3, Failed 0, Errors 3, Passed 0
[error] Error during tests:
[error]        ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest
[error]        ittest.org.apache.geode.spark.connector.RetrieveRegionIntegrationTest
[error]        ittest.org.apache.geode.spark.connector.BasicIntegrationTest
[error] (geode-spark-connector/it:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 128 s, completed Sep 26, 2016 3:22:05 PM
Build step 'Execute shell' marked build as failure
Recording test results
Skipped archiving because build is not successful
