See <https://builds.apache.org/job/Geode-spark-connector/77/changes>
Changes:
[hkhamesra] GEODE-1908 DS.connect call not configuring p2p server with SSL.
[hkhamesra] GEODE-37 re-merge commit 6a109a7f57a5402af458d593d93c2e328bd3281b
[klund] GEODE-1905: Address UnknownPropertyException when running tests with
[ukohlmeyer] GEODE-1524: Updating swagger annotations to 1.3.13 Removing of escape
[ukohlmeyer] GEODE-1524: Reverting Swagger version
[ukohlmeyer] GEODE-420: Renaming SSLConfigurationFactoryTest.java to
[dschneider] GEODE-1860: change unit test to wait longer
[hkhamesra] GEODE-37 changed sequence.gemfire to org.apache.geode
[hkhamesra] GEODE-37 renamed pulse package to geode
[hkhamesra] GEODE-37 changed package name in comment
[hkhamesra] GEODE-37 import change in pulse module
[bschuchardt] fixing broken swizzling code in DataSerializer
[ukohlmeyer] GEODE-420: removed JUnit4DistributedTestCase from Test to make it JUnit
[hkhamesra] GEODE-37 change package name from io.pivotal.geode (for
[hkhamesra] GEODE-37 change package name from io.pivotal.geode (for
[hkhamesra] GEODE-37 change package name from io.pivotal.geode (for
[hkhamesra] GEODE-37 change package name from io.pivotal.geode (for
[hkhamesra] GEODE-37 change package name from io.pivotal.geode (for
[hkhamesra] GEODE-37 change package name from io.pivotal.geode (for
[hkhamesra] GEODE-37 change package name from io.pivotal.geode (for
[hkhamesra] GEODE-37 change package name from io.pivotal.geode (for
[hkhamesra] GEODE-37 changed package name in spark-connector
[hkhamesra] GEODE-37 changed md file to point org.apache
[jiliao] Marked test guarenteed to take at least 1 minute as an integration test
[abaker] GEODE-1513: Remove duplicate jars from war files
[abaker] GEODE-1791: update mbean src headers
[abaker] GEODE-1791: Update LICENSE
[nnag] GEODE-1925 : Copy the resources directory inside build directory
[dschneider] GEODE-1128: Add missing regions to the missing-disk-stores command
[wmarkito] GEODE-1903: Exchange old events for new
[wmarkito] Close #241 GEODE-1903: Correct typo
[huynhja] GEODE-1864: Remove old keys correctly from Compact Map Range Index
[huynhja] GEODE-1675: Fixed reevaluation with map range index non tuples
[jiliao] GEODE-1909: add authorization in GMSAuthenticator
------------------------------------------
[...truncated 990 lines...]
[info] Resolving org.apache.hadoop#hadoop-mapreduce-client;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-common;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-common;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-api;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-api;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn;2.2.0 ...
[info] Resolving org.slf4j#slf4j-log4j12;1.7.5 ...
[info] Resolving com.google.inject#guice;3.0 ...
[info] Resolving javax.inject#javax.inject;1 ...
[info] Resolving aopalliance#aopalliance;1.0 ...
[info] Resolving org.sonatype.sisu.inject#cglib;2.2.1-v20090111 ...
[info] Resolving asm#asm;3.2 ...
[info] Resolving com.sun.jersey.jersey-test-framework#jersey-test-framework-grizzly2;1.9 ...
[info] Resolving com.sun.jersey#jersey-server;1.9 ...
[info] Resolving com.sun.jersey#jersey-json;1.9 ...
[info] Resolving org.codehaus.jettison#jettison;1.1 ...
[info] Resolving stax#stax-api;1.0.1 ...
[info] Resolving com.sun.xml.bind#jaxb-impl;2.2.3-1 ...
[info] Resolving javax.xml.bind#jaxb-api;2.2.2 ...
[info] Resolving javax.activation#activation;1.1 ...
[info] Resolving org.codehaus.jackson#jackson-jaxrs;1.8.8 ...
[info] Resolving org.codehaus.jackson#jackson-xc;1.8.8 ...
[info] Resolving com.sun.jersey.contribs#jersey-guice;1.9 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-client;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-client;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-mapreduce-client-core;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-mapreduce-client-core;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-mapreduce-client;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server-common;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server-common;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-mapreduce-client-shuffle;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-mapreduce-client-shuffle;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-mapreduce-client;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-mapreduce-client;2.2.0 ...
[info] Resolving org.apache.spark#spark-network-common_2.10;1.3.0 ...
[info] Resolving io.netty#netty-all;4.0.23.Final ...
[info] Resolving org.spark-project.spark#unused;1.0.0 ...
[info] Resolving org.apache.spark#spark-network-shuffle_2.10;1.3.0 ...
[info] Resolving net.java.dev.jets3t#jets3t;0.7.1 ...
[info] Resolving org.apache.curator#curator-recipes;2.4.0 ...
[info] Resolving org.apache.curator#curator-framework;2.4.0 ...
[info] Resolving org.apache.curator#curator-client;2.4.0 ...
[info] Resolving org.apache.zookeeper#zookeeper;3.4.5 ...
[info] Resolving jline#jline;0.9.94 ...
[info] Resolving com.google.guava#guava;14.0.1 ...
[info] Resolving org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016 ...
[info] Resolving org.apache.commons#commons-lang3;3.3.2 ...
[info] Resolving org.apache.commons#commons-math3;3.1.1 ...
[info] Resolving org.slf4j#slf4j-api;1.7.10 ...
[info] Resolving org.slf4j#jul-to-slf4j;1.7.10 ...
[info] Resolving org.slf4j#jcl-over-slf4j;1.7.10 ...
[info] Resolving org.slf4j#slf4j-log4j12;1.7.10 ...
[info] Resolving com.ning#compress-lzf;1.0.0 ...
[info] Resolving org.xerial.snappy#snappy-java;1.1.1.6 ...
[info] Resolving net.jpountz.lz4#lz4;1.2.0 ...
[info] Resolving org.roaringbitmap#RoaringBitmap;0.4.5 ...
[info] Resolving commons-net#commons-net;2.2 ...
[info] Resolving org.spark-project.akka#akka-remote_2.10;2.3.4-spark ...
[info] Resolving org.spark-project.akka#akka-actor_2.10;2.3.4-spark ...
[info] Resolving com.typesafe#config;1.2.1 ...
[info] Resolving io.netty#netty;3.8.0.Final ...
[info] Resolving org.spark-project.protobuf#protobuf-java;2.5.0-spark ...
[info] Resolving org.uncommons.maths#uncommons-maths;1.2.2a ...
[info] Resolving org.spark-project.akka#akka-slf4j_2.10;2.3.4-spark ...
[info] Resolving org.json4s#json4s-jackson_2.10;3.2.10 ...
[info] Resolving org.json4s#json4s-core_2.10;3.2.10 ...
[info] Resolving org.json4s#json4s-ast_2.10;3.2.10 ...
[info] Resolving com.thoughtworks.paranamer#paranamer;2.6 ...
[info] Resolving org.scala-lang#scalap;2.10.0 ...
[info] Resolving org.scala-lang#scala-compiler;2.10.0 ...
[info] Resolving org.scala-lang#scala-reflect;2.10.0 ...
[info] Resolving org.apache.mesos#mesos;0.21.0 ...
[info] Resolving com.clearspring.analytics#stream;2.7.0 ...
[info] Resolving io.dropwizard.metrics#metrics-core;3.1.0 ...
[info] Resolving io.dropwizard.metrics#metrics-jvm;3.1.0 ...
[info] Resolving io.dropwizard.metrics#metrics-json;3.1.0 ...
[info] Resolving io.dropwizard.metrics#metrics-graphite;3.1.0 ...
[info] Resolving com.fasterxml.jackson.core#jackson-databind;2.4.4 ...
[info] Resolving com.fasterxml.jackson.core#jackson-annotations;2.4.0 ...
[info] Resolving com.fasterxml.jackson.core#jackson-core;2.4.4 ...
[info] Resolving com.fasterxml.jackson.module#jackson-module-scala_2.10;2.4.4 ...
[info] Resolving org.scala-lang#scala-reflect;2.10.4 ...
[info] Resolving com.fasterxml.jackson.core#jackson-annotations;2.4.4 ...
[info] Resolving org.apache.ivy#ivy;2.4.0 ...
[info] Resolving oro#oro;2.0.8 ...
[info] Resolving org.tachyonproject#tachyon-client;0.5.0 ...
[info] Resolving org.tachyonproject#tachyon;0.5.0 ...
[info] Resolving commons-io#commons-io;2.4 ...
[info] Resolving org.spark-project#pyrolite;2.0.1 ...
[info] Resolving net.sf.py4j#py4j;0.8.2.1 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server-nodemanager;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server-nodemanager;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server;2.2.0 ...
[info] Resolving org.apache.spark#spark-sql_2.10;1.3.0 ...
[info] Resolving org.apache.spark#spark-catalyst_2.10;1.3.0 ...
[info] Resolving org.scala-lang#scala-compiler;2.10.4 ...
[info] Resolving org.scalamacros#quasiquotes_2.10;2.0.1 ...
[info] Resolving com.twitter#parquet-column;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-common;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-encoding;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-generator;1.6.0rc3 ...
[info] Resolving commons-codec#commons-codec;1.5 ...
[info] Resolving com.twitter#parquet-hadoop;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-format;2.2.0-rc1 ...
[info] Resolving com.twitter#parquet-jackson;1.6.0rc3 ...
[info] Resolving org.codehaus.jackson#jackson-mapper-asl;1.9.11 ...
[info] Resolving org.codehaus.jackson#jackson-core-asl;1.9.11 ...
[info] Resolving org.jodd#jodd-core;3.6.3 ...
[info] Resolving commons-net#commons-net;3.1 ...
[info] Resolving org.slf4j#slf4j-api;1.7.5 ...
[info] Resolving org.scoverage#scalac-scoverage-runtime_2.10;1.0.4 ...
[info] Resolving org.scoverage#scalac-scoverage-plugin_2.10;1.0.4 ...
[info] Resolving org.scala-lang#jline;2.10.4 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[error] (geode-spark-connector/compile:compile) Compilation failed
[error] Total time: 60 s, completed Sep 22, 2016 4:20:29 PM
Build step 'Execute shell' marked build as failure
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Skipped archiving because build is not successful