I checked in a fix for the current dependencies on spring-core (thanks Kirk
and Udo). But we need to work on avoiding this issue in the future. Having
"optional" dependencies in the core seems like the main issue; a secondary
issue is that we have no tests of geode-core that run with only the
non-optional geode-core dependencies on the classpath. Actually, we do have
some tests like that in geode-examples, but apparently those aren't running
as part of precheckin! I filed GEODE-1937 for that.
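
To make that concrete, here's the shape of guard test I have in mind. This is
just a sketch (the class name and the exact code path it pokes are my own
invention, not necessarily what GEODE-1937 will turn into), and it only proves
anything when run with just geode-core and its non-optional dependencies on
the classpath:

    // Hypothetical guard test -- run with only geode-core and its
    // non-optional dependencies on the classpath.
    import org.apache.geode.cache.client.ClientCacheFactory;

    public class NonOptionalClasspathCheck {
      public static void main(String[] args) {
        try {
          Class.forName("org.springframework.util.StringUtils");
          throw new AssertionError(
              "spring-core is on the classpath; run this check without it");
        } catch (ClassNotFoundException expected) {
          // good: the "optional" dependency really is absent
        }
        // Now touch a code path that a bare client exercises; with
        // GEODE-1934 unfixed this throws NoClassDefFoundError out of
        // SSLConfigurationFactory (see Anthony's stack trace below).
        new ClientCacheFactory().create().close();
      }
    }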

-Dan

On Fri, Sep 23, 2016 at 8:15 PM, Kirk Lund <kl...@apache.org> wrote:

> org.apache.geode.internal.lang.StringUtils includes isEmpty(String)
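>
> So the fix should mostly be a one-line import swap. A rough sketch (not the
> actual SSLConfigurationFactory source, and assuming the usual null-or-empty
> semantics for isEmpty):
>
>     // was: import org.springframework.util.StringUtils;   (optional jar)
>     import org.apache.geode.internal.lang.StringUtils;  // ships in geode-core
>
>     public class IsEmptyExample {
>       public static void main(String[] args) {
>         String ciphers = System.getProperty("gemfire.ssl-ciphers");
>         System.out.println(StringUtils.isEmpty(ciphers) ? "unset" : ciphers);
>       }
>     }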
>
> -Kirk
>
> On Friday, September 23, 2016, Udo Kohlmeyer <ukohlme...@pivotal.io>
> wrote:
>
> > I can easily fix this.
> >
> > Surely we have a utility lying around in the core that can handle
> > "String.isEmpty".
> >
> > --Udo
> >
> >
> > On 24/09/2016 9:56 AM, Anthony Baker wrote:
> >
> >> Yep, I’m seeing failures on any client app that doesn’t explicitly
> >> include Spring as a dependency.
> >>
> >> Exception in thread "main" java.lang.NoClassDefFoundError: org/springframework/util/StringUtils
> >>         at org.apache.geode.internal.net.SSLConfigurationFactory.configureSSLPropertiesFromSystemProperties(SSLConfigurationFactory.java:274)
> >>         at org.apache.geode.internal.net.SSLConfigurationFactory.configureSSLPropertiesFromSystemProperties(SSLConfigurationFactory.java:270)
> >>         at org.apache.geode.internal.net.SSLConfigurationFactory.createSSLConfigForComponent(SSLConfigurationFactory.java:138)
> >>         at org.apache.geode.internal.net.SSLConfigurationFactory.getSSLConfigForComponent(SSLConfigurationFactory.java:67)
> >>         at org.apache.geode.internal.net.SocketCreatorFactory.getSocketCreatorForComponent(SocketCreatorFactory.java:67)
> >>         at org.apache.geode.distributed.internal.tcpserver.TcpClient.<init>(TcpClient.java:69)
> >>         at org.apache.geode.cache.client.internal.AutoConnectionSourceImpl.<init>(AutoConnectionSourceImpl.java:114)
> >>         at org.apache.geode.cache.client.internal.PoolImpl.getSourceImpl(PoolImpl.java:579)
> >>         at org.apache.geode.cache.client.internal.PoolImpl.<init>(PoolImpl.java:219)
> >>         at org.apache.geode.cache.client.internal.PoolImpl.create(PoolImpl.java:132)
> >>         at org.apache.geode.internal.cache.PoolFactoryImpl.create(PoolFactoryImpl.java:319)
> >>         at org.apache.geode.internal.cache.GemFireCacheImpl.determineDefaultPool(GemFireCacheImpl.java:2943)
> >>         at org.apache.geode.internal.cache.GemFireCacheImpl.initializeDeclarativeCache(GemFireCacheImpl.java:1293)
> >>         at org.apache.geode.internal.cache.GemFireCacheImpl.initialize(GemFireCacheImpl.java:1124)
> >>         at org.apache.geode.internal.cache.GemFireCacheImpl.basicCreate(GemFireCacheImpl.java:765)
> >>         at org.apache.geode.internal.cache.GemFireCacheImpl.createClient(GemFireCacheImpl.java:740)
> >>         at org.apache.geode.cache.client.ClientCacheFactory.basicCreate(ClientCacheFactory.java:235)
> >>         at org.apache.geode.cache.client.ClientCacheFactory.create(ClientCacheFactory.java:189)
> >>         at HelloWorld.main(HelloWorld.java:25)
> >> Caused by: java.lang.ClassNotFoundException: org.springframework.util.StringUtils
> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> >>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> >>         ... 19 more
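> >>
> >> (For anyone trying to reproduce: the client really is minimal, roughly
> >> the following, with only geode-core on the classpath. The locator address
> >> here is made up; create() fails before any connection is attempted.)
> >>
> >>     import org.apache.geode.cache.client.ClientCache;
> >>     import org.apache.geode.cache.client.ClientCacheFactory;
> >>
> >>     public class HelloWorld {
> >>       public static void main(String[] args) {
> >>         ClientCache cache = new ClientCacheFactory()
> >>             .addPoolLocator("localhost", 10334)
> >>             .create(); // dies here, cf. HelloWorld.java:25 in the trace
> >>         cache.close();
> >>       }
> >>     }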
> >>
> >> Anthony
> >>
> >>
> >> On Sep 23, 2016, at 4:34 PM, Dan Smith <dsm...@pivotal.io> wrote:
> >>>
> >>> I created GEODE-1934 for this. It looks like the problem is actually that
> >>> our dependencies for geode-core are messed up. spring-core is marked
> >>> optional, but we're using it in critical places like this
> >>> SSLConfigurationFactory.
> >>>
> >>> In my opinion we shouldn't depend on spring-core at all unless we're
> >>> actually going to use it for things other than StringUtils. I think we've
> >>> accidentally introduced dependencies on it because the gfsh code in the
> >>> core is pulling in a bunch of spring libraries.
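> >>>
> >>> And if part of the core really does need spring-core to stay optional,
> >>> it has to reach for it defensively. A sketch of the pattern (purely
> >>> illustrative, not existing Geode code):
> >>>
> >>>     // Touch the optional dependency only reflectively, so its absence
> >>>     // is detectable instead of blowing up with NoClassDefFoundError.
> >>>     public class OptionalDependencyGuard {
> >>>       static boolean springCorePresent() {
> >>>         try {
> >>>           Class.forName("org.springframework.util.StringUtils");
> >>>           return true;
> >>>         } catch (ClassNotFoundException e) {
> >>>           return false;
> >>>         }
> >>>       }
> >>>
> >>>       public static void main(String[] args) {
> >>>         System.out.println("spring-core present: " + springCorePresent());
> >>>       }
> >>>     }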
> >>>
> >>> -Dan
> >>>
> >>>
> >>> On Fri, Sep 23, 2016 at 9:12 AM, Apache Jenkins Server <jenk...@builds.apache.org> wrote:
> >>>
> >>>> See <https://builds.apache.org/job/Geode-spark-connector/78/changes>
> >>>>
> >>>> Changes:
> >>>>
> >>>> [hkhamesra] GEODE-37 In spark connector we call TcpClient static method to get the
> >>>>
> >>>> [klund] GEODE-1906: fix misspelling of Successfully
> >>>>
> >>>> [upthewaterspout] GEODE-1915: Prevent deadlock registering instantiators with gateways
> >>>>
> >>>> ------------------------------------------
> >>>> [...truncated 1883 lines...]
> >>>> 16/09/23 16:11:05 INFO HttpFileServer: HTTP File server directory is /tmp/spark-f13dac55-087f-4379-aeed-616fbdc7ffac/httpd-02c1fab9-faa0-47f4-b0f3-fd44383eeeb3
> >>>> 16/09/23 16:11:05 INFO HttpServer: Starting HTTP Server
> >>>> 16/09/23 16:11:05 INFO Utils: Successfully started service 'HTTP file server' on port 40135.
> >>>> 16/09/23 16:11:05 INFO SparkEnv: Registering OutputCommitCoordinator
> >>>> 16/09/23 16:11:10 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
> >>>> 16/09/23 16:11:15 INFO Utils: Successfully started service 'SparkUI' on port 4041.
> >>>> 16/09/23 16:11:15 INFO SparkUI: Started SparkUI at http://localhost:4041
> >>>> 16/09/23 16:11:15 INFO Executor: Starting executor ID <driver> on host localhost
> >>>> 16/09/23 16:11:15 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:54872/user/HeartbeatReceiver
> >>>> 16/09/23 16:11:16 INFO NettyBlockTransferService: Server created on 41182
> >>>> 16/09/23 16:11:16 INFO BlockManagerMaster: Trying to register BlockManager
> >>>> 16/09/23 16:11:16 INFO BlockManagerMasterActor: Registering block manager localhost:41182 with 2.8 GB RAM, BlockManagerId(<driver>, localhost, 41182)
> >>>> 16/09/23 16:11:16 INFO BlockManagerMaster: Registered BlockManager
> >>>> === GeodeRunner: stop server 1.
> >>>> === GeodeRunner: stop server 2.
> >>>> [info] RetrieveRegionIntegrationTest:
> >>>> ......
> >>>>
> >>>> === GeodeRunner: stop locator
> >>>> ...
> >>>> Successfully stop Geode locator at port 27662.
> >>>> === GeodeRunner: starting locator on port 23825
> >>>> === GeodeRunner: waiting for locator on port 23825
> >>>> ....=== GeodeRunner: done waiting for locator on port 23825
> >>>> === GeodeRunner: starting server1 with clientPort 28993
> >>>> === GeodeRunner: starting server2 with clientPort 26318
> >>>> === GeodeRunner: starting server3 with clientPort 29777
> >>>> === GeodeRunner: starting server4 with clientPort 22946
> >>>> ....
> >>>> ............................................Locator in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator on hemera.apache.org[23825] as locator is currently online.
> >>>> Process ID: 1860
> >>>> Uptime: 4 seconds
> >>>> GemFire Version: 1.0.0-incubating-SNAPSHOT
> >>>> Java Version: 1.8.0_66
> >>>> Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator/locator.log
> >>>> JVM Arguments: -Dgemfire.enable-cluster-configuration=true -Dgemfire.load-cluster-configuration-from-dir=false -Dgemfire.jmx-manager-http-port=29684 -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
> >>>> Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar
> >>>>
> >>>> Successfully connected to: JMX Manager [host=hemera.apache.org, port=1099]
> >>>>
> >>>> Cluster configuration service is up and running.
> >>>>
> >>>> ................
> >>>> Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server4 on hemera.apache.org[22946] as server4 is currently online.
> >>>> Process ID: 2204
> >>>> Uptime: 8 seconds
> >>>> GemFire Version: 1.0.0-incubating-SNAPSHOT
> >>>> Java Version: 1.8.0_66
> >>>> Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server4/server4.log
> >>>> JVM Arguments: -Dgemfire.locators=localhost[23825] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
> >>>> Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar
> >>>>
> >>>> ..
> >>>> Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1 on hemera.apache.org[28993] as server1 is currently online.
> >>>> Process ID: 2199
> >>>> Uptime: 8 seconds
> >>>> GemFire Version: 1.0.0-incubating-SNAPSHOT
> >>>> Java Version: 1.8.0_66
> >>>> Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1/server1.log
> >>>> JVM Arguments: -Dgemfire.locators=localhost[23825] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
> >>>> Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar
> >>>>
> >>>>
> >>>>
> >>>> Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2 on hemera.apache.org[26318] as server2 is currently online.
> >>>> Process ID: 2153
> >>>> Uptime: 9 seconds
> >>>> GemFire Version: 1.0.0-incubating-SNAPSHOT
> >>>> Java Version: 1.8.0_66
> >>>> Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2/server2.log
> >>>> JVM Arguments: -Dgemfire.locators=localhost[23825] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
> >>>> Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar
> >>>>
> >>>> Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server3 on hemera.apache.org[29777] as server3 is currently online.
> >>>> Process ID: 2175
> >>>> Uptime: 9 seconds
> >>>> GemFire Version: 1.0.0-incubating-SNAPSHOT
> >>>> Java Version: 1.8.0_66
> >>>> Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server3/server3.log
> >>>> JVM Arguments: -Dgemfire.locators=localhost[23825] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-retrieve-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
> >>>> Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar
> >>>>
> >>>> All WrappedArray(28993, 26318, 29777, 22946).length servers have been started
> >>>> Deploying:geode-functions_2.10-0.5.0.jar
> >>>> 16/09/23 16:11:43 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
> >>>> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
> >>>> ittest.org.apache.geode.spark.connector.JavaApiIntegrationTest.setUpBeforeClass(JavaApiIntegrationTest.java:75)
> >>>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>>> java.lang.reflect.Method.invoke(Method.java:497)
> >>>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> >>>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> >>>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> >>>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
> >>>> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
> >>>> org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >>>> org.junit.runners.Suite.runChild(Suite.java:127)
> >>>> org.junit.runners.Suite.runChild(Suite.java:26)
> >>>> org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >>>> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >>>> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >>>> org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >>>> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >>>> org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >>>> [info] Exception encountered when attempting to run a suite with class name: ittest.org.apache.geode.spark.connector.RetrieveRegionIntegrationTest *** ABORTED ***
> >>>> [info]   org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
> >>>> [info] org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
> >>>> [info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:50)
> >>>> [info] org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
> >>>> [info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:30)
> >>>> [info] org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
> >>>> [info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.run(RDDJoinRegionIntegrationTest.scala:30)
> >>>> [info] org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
> >>>> [info] org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
> >>>> [info] sbt.ForkMain$Run$2.call(ForkMain.java:294)
> >>>> [info] sbt.ForkMain$Run$2.call(ForkMain.java:284)
> >>>> [info] java.util.concurrent.FutureTask.run(FutureTask.java:266)
> >>>> [info] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >>>> [info] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >>>> [info] java.lang.Thread.run(Thread.java:745)
> >>>> [info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1811)
> >>>> [info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1807)
> >>>> [info]   at scala.Option.foreach(Option.scala:236)
> >>>> [info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1807)
> >>>> [info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1794)
> >>>> [info]   at scala.Option.foreach(Option.scala:236)
> >>>> [info]   at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1794)
> >>>> [info]   at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:1833)
> >>>> [info]   at org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
> >>>> [info]   at ittest.org.apache.geode.spark.connector.RetrieveRegionIntegrationTest.beforeAll(RetrieveRegionIntegrationTest.scala:51)
> >>>> [info]   ...
> >>>> [info] BasicIntegrationTest:
> >>>> === GeodeRunner: stop server 1.
> >>>> === GeodeRunner: stop server 2.
> >>>> === GeodeRunner: stop server 3.
> >>>> === GeodeRunner: stop server 4.
> >>>> ............
> >>>>
> >>>>
> >>>>
> >>>> === GeodeRunner: stop locator
> >>>> ...
> >>>> Successfully stop Geode locator at port 23825.
> >>>> === GeodeRunner: starting locator on port 23573
> >>>> === GeodeRunner: waiting for locator on port 23573
> >>>> ....=== GeodeRunner: done waiting for locator on port 23573
> >>>> === GeodeRunner: starting server1 with clientPort 27897
> >>>> === GeodeRunner: starting server2 with clientPort 20289
> >>>> ....
> >>>> ....................Locator in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator on hemera.apache.org[23573] as locator is currently online.
> >>>> Process ID: 3273
> >>>> Uptime: 4 seconds
> >>>> GemFire Version: 1.0.0-incubating-SNAPSHOT
> >>>> Java Version: 1.8.0_66
> >>>> Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/locator/locator.log
> >>>> JVM Arguments: -Dgemfire.enable-cluster-configuration=true -Dgemfire.load-cluster-configuration-from-dir=false -Dgemfire.jmx-manager-http-port=23053 -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
> >>>> Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar
> >>>>
> >>>> Successfully connected to: JMX Manager [host=hemera.apache.org, port=1099]
> >>>>
> >>>> Cluster configuration service is up and running.
> >>>>
> >>>> ........
> >>>> Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2 on hemera.apache.org[20289] as server2 is currently online.
> >>>> Process ID: 3465
> >>>> Uptime: 7 seconds
> >>>> GemFire Version: 1.0.0-incubating-SNAPSHOT
> >>>> Java Version: 1.8.0_66
> >>>> Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server2/server2.log
> >>>> JVM Arguments: -Dgemfire.locators=localhost[23573] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
> >>>> Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar
> >>>>
> >>>>
> >>>> Server in /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1 on hemera.apache.org[27897] as server1 is currently online.
> >>>> Process ID: 3505
> >>>> Uptime: 7 seconds
> >>>> GemFire Version: 1.0.0-incubating-SNAPSHOT
> >>>> Java Version: 1.8.0_66
> >>>> Log File: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/testgeode/server1/server1.log
> >>>> JVM Arguments: -Dgemfire.locators=localhost[23573] -Dgemfire.use-cluster-configuration=true -Dgemfire.bind-address=localhost -Dgemfire.cache-xml-file=/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/resources/test-regions.xml -Dgemfire.http-service-port=8080 -Dgemfire.start-dev-rest-api=false -XX:OnOutOfMemoryError=kill -KILL %p -Dgemfire.launcher.registerSignalHandlers=true -Djava.awt.headless=true -Dsun.rmi.dgc.server.gcInterval=9223372036854775806
> >>>> Class-Path: /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-core-1.0.0-incubating-SNAPSHOT.jar:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/./target/scala-2.10/it-classes:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-assembly/build/install/apache-geode/lib/geode-dependencies.jar
> >>>>
> >>>> All WrappedArray(27897, 20289).length servers have been started
> >>>> Deploying:geode-functions_2.10-0.5.0.jar
> >>>> 16/09/23 16:12:09 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
> >>>> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
> >>>> ittest.org.apache.geode.spark.connector.JavaApiIntegrationTest.setUpBeforeClass(JavaApiIntegrationTest.java:75)
> >>>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>>> java.lang.reflect.Method.invoke(Method.java:497)
> >>>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> >>>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> >>>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> >>>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
> >>>> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
> >>>> org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >>>> org.junit.runners.Suite.runChild(Suite.java:127)
> >>>> org.junit.runners.Suite.runChild(Suite.java:26)
> >>>> org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >>>> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >>>> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >>>> org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >>>> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >>>> org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >>>> [info] Exception encountered when attempting to run a suite with class name: ittest.org.apache.geode.spark.connector.BasicIntegrationTest *** ABORTED ***
> >>>> [info]   org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
> >>>> [info] org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
> >>>> [info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:50)
> >>>> [info] org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
> >>>> [info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.beforeAll(RDDJoinRegionIntegrationTest.scala:30)
> >>>> [info] org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
> >>>> [info] ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest.run(RDDJoinRegionIntegrationTest.scala:30)
> >>>> [info] org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
> >>>> [info] org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
> >>>> [info] sbt.ForkMain$Run$2.call(ForkMain.java:294)
> >>>> [info] sbt.ForkMain$Run$2.call(ForkMain.java:284)
> >>>> [info] java.util.concurrent.FutureTask.run(FutureTask.java:266)
> >>>> [info] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >>>> [info] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >>>> [info] java.lang.Thread.run(Thread.java:745)
> >>>> [info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1811)
> >>>> [info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1807)
> >>>> [info]   at scala.Option.foreach(Option.scala:236)
> >>>> [info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1807)
> >>>> [info]   at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1794)
> >>>> [info]   at scala.Option.foreach(Option.scala:236)
> >>>> [info]   at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1794)
> >>>> [info]   at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:1833)
> >>>> [info]   at org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
> >>>> [info]   at ittest.org.apache.geode.spark.connector.BasicIntegrationTest.beforeAll(BasicIntegrationTest.scala:58)
> >>>> [info]   ...
> >>>> [info] ScalaTest
> >>>> [info] Run completed in 1 minute, 59 seconds.
> >>>> [info] Total number of tests run: 0
> >>>> [info] Suites: completed 1, aborted 3
> >>>> [info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
> >>>> [info] *** 3 SUITES ABORTED ***
> >>>> [error] Error: Total 3, Failed 0, Errors 3, Passed 0
> >>>> [error] Error during tests:
> >>>> [error]        ittest.org.apache.geode.spark.connector.RDDJoinRegionIntegrationTest
> >>>> [error]        ittest.org.apache.geode.spark.connector.RetrieveRegionIntegrationTest
> >>>> [error]        ittest.org.apache.geode.spark.connector.BasicIntegrationTest
> >>>> [error] (geode-spark-connector/it:test) sbt.TestsFailedException: Tests unsuccessful
> >>>> [error] Total time: 128 s, completed Sep 23, 2016 4:12:09 PM
> >>>> Build step 'Execute shell' marked build as failure
> >>>> Recording test results
> >>>> Skipped archiving because build is not successful
> >>>>
> >>>>
> >
>
