See <https://builds.apache.org/job/Flume-trunk-hbase-1/188/>
------------------------------------------
Started by user mpercy
[EnvInject] - Loading node environment variables.
Building remotely on H11 (docker Ubuntu ubuntu yahoo-not-h2) in workspace <https://builds.apache.org/job/Flume-trunk-hbase-1/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url http://git-wip-us.apache.org/repos/asf/flume.git # timeout=10
Fetching upstream changes from http://git-wip-us.apache.org/repos/asf/flume.git
 > git --version # timeout=10
 > git -c core.askpass=true fetch --tags --progress http://git-wip-us.apache.org/repos/asf/flume.git +refs/heads/*:refs/remotes/origin/*
 > git rev-parse origin/trunk^{commit} # timeout=10
Checking out Revision 988ede948ffaf6526c226323a6808922f38b625c (origin/trunk)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 988ede948ffaf6526c226323a6808922f38b625c
 > git rev-list 988ede948ffaf6526c226323a6808922f38b625c # timeout=10
[Flume-trunk-hbase-1] $ /bin/bash -xe /tmp/hudson1645740365635198175.sh
+ OLD_ES_DATA=<https://builds.apache.org/job/Flume-trunk-hbase-1/ws/flume-ng-sinks/flume-ng-elasticsearch-sink/data>
+ '[' -d <https://builds.apache.org/job/Flume-trunk-hbase-1/ws/flume-ng-sinks/flume-ng-elasticsearch-sink/data> ']'
+ OLD_HIVE_DATA=<https://builds.apache.org/job/Flume-trunk-hbase-1/ws/flume-ng-sinks/flume-hive-sink/metastore_db>
+ '[' -d <https://builds.apache.org/job/Flume-trunk-hbase-1/ws/flume-ng-sinks/flume-hive-sink/metastore_db> ']'
Parsing POMs
Established TCP socket on 41171
maven32-agent.jar already up to date
maven32-interceptor.jar already up to date
maven3-interceptor-commons.jar already up to date
[Flume-trunk-hbase-1] $ /home/jenkins/tools/java/latest1.8/bin/java -Xms512m -Xmx1024m -XX:PermSize=256m -XX:MaxPermSize=512m -Xrs -cp /home/jenkins/jenkins-slave/maven32-agent.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3/boot/plexus-classworlds-2.5.2.jar:/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3/conf/logging jenkins.maven3.agent.Maven32Main /home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3 /home/jenkins/jenkins-slave/slave.jar /home/jenkins/jenkins-slave/maven32-interceptor.jar /home/jenkins/jenkins-slave/maven3-interceptor-commons.jar 41171
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
Exception in thread "main" java.lang.ClassNotFoundException: hudson.remoting.Launcher
	at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:50)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:271)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:247)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:239)
	at jenkins.maven3.agent.Maven32Main.main(Maven32Main.java:143)
	at jenkins.maven3.agent.Maven32Main.main(Maven32Main.java:74)
ERROR: Failed to parse POMs
java.io.EOFException: unexpected stream termination
	at hudson.remoting.ChannelBuilder.negotiate(ChannelBuilder.java:365)
	at hudson.remoting.ChannelBuilder.build(ChannelBuilder.java:310)
	at hudson.slaves.Channels.forProcess(Channels.java:115)
	at hudson.maven.AbstractMavenProcessFactory.newProcess(AbstractMavenProcessFactory.java:297)
	at hudson.maven.ProcessCache.get(ProcessCache.java:236)
	at hudson.maven.MavenModuleSetBuild$MavenModuleSetBuildExecution.doRun(MavenModuleSetBuild.java:778)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:534)
	at hudson.model.Run.execute(Run.java:1738)
	at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:531)
	at hudson.model.ResourceController.execute(ResourceController.java:98)
	at hudson.model.Executor.run(Executor.java:410)
[WARNINGS] Skipping publisher since build result is FAILURE
Archiving artifacts
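A side note on the two HotSpot warnings in the log: Java 8 removed the permanent generation, so the `-XX:PermSize`/`-XX:MaxPermSize` flags in the job's JVM options are ignored. They are not what broke this build (the failure is the `ClassNotFoundException` in the Maven agent), but they should be cleaned up. A sketch of an updated options line, assuming the job wants equivalent Metaspace sizing (the flag names are standard HotSpot options; the sizes are simply carried over from the command line above):

```shell
# Java 8 replaced PermGen with Metaspace; the corresponding flags are
# -XX:MetaspaceSize / -XX:MaxMetaspaceSize. Hypothetical replacement
# value for the job's "JVM Options" / MAVEN_OPTS setting:
MAVEN_OPTS="-Xms512m -Xmx1024m -XX:MetaspaceSize=256m -XX:MaxMetaspaceSize=512m -Xrs"
```

Note that unlike PermGen, Metaspace is unbounded by default, so dropping the two flags entirely is also a reasonable option unless metaspace growth needs to be capped on the build slaves.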
