See <https://builds.apache.org/job/kafka-trunk-jdk11/1480/display/redirect?page=changes>
Changes:
[github] KAFKA-9859 / kafka-streams-application-reset tool doesn't take into
------------------------------------------
[...truncated 3.04 MB...]
> Task :streams:upgrade-system-tests-11:testClasses
> Task :streams:streams-scala:spotbugsMain
> Task :streams:upgrade-system-tests-11:checkstyleTest
> Task :streams:test-utils:spotbugsMain
> Task :streams:streams-scala:test
org.apache.kafka.streams.scala.kstream.SuppressedTest > Suppressed.untilWindowCloses should produce the correct suppression STARTED
org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced should create a Produced with Serdes STARTED
org.apache.kafka.streams.scala.kstream.SuppressedTest > Suppressed.untilWindowCloses should produce the correct suppression PASSED
org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced should create a Produced with Serdes PASSED
org.apache.kafka.streams.scala.kstream.SuppressedTest > Suppressed.untilTimeLimit should produce the correct suppression STARTED
org.apache.kafka.streams.scala.kstream.SuppressedTest > Suppressed.untilTimeLimit should produce the correct suppression PASSED
org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.maxRecords should produce the correct buffer config STARTED
org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.maxRecords should produce the correct buffer config PASSED
org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.maxBytes should produce the correct buffer config STARTED
org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.maxBytes should produce the correct buffer config PASSED
org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.unbounded should produce the correct buffer config STARTED
org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.unbounded should produce the correct buffer config PASSED
org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig should support very long chains of factory methods STARTED
org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig should support very long chains of factory methods PASSED
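For context, the SuppressedTest cases above exercise suppression configuration for the Streams DSL. A minimal sketch of such a configuration, written against the Java org.apache.kafka.streams.kstream.Suppressed API from Scala (the Scala wrapper under test delegates to it); the value names and the windowedCounts table mentioned in the last comment are placeholders:

    import java.time.Duration
    import org.apache.kafka.streams.kstream.Suppressed
    import org.apache.kafka.streams.kstream.Suppressed.BufferConfig

    // Hold every update for a window until the window closes, then emit one final result per key.
    val finalResultsOnly = Suppressed.untilWindowCloses(BufferConfig.unbounded())

    // Rate-limit updates: suppress intermediate results for up to 10 seconds per key,
    // buffering at most 1000 records before emitting early.
    val rateLimited =
      Suppressed.untilTimeLimit(Duration.ofSeconds(10), BufferConfig.maxRecords(1000L))

    // Buffer configs compose by chaining factory methods, which the last test above exercises at length.
    val boundedBuffer = BufferConfig.maxRecords(1000L).withMaxBytes(1024L * 1024L)

    // Typical usage on a windowed aggregation (windowedCounts is a placeholder KTable):
    // windowedCounts.suppress(finalResultsOnly)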
org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy STARTED
org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy PASSED
org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes STARTED
org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes PASSED
org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes and repartition topic name STARTED
org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes and repartition topic name PASSED
org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialized should create a Materialized with Serdes STARTED
org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialized should create a Materialized with Serdes PASSED
org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed should create a Consumed with Serdes STARTED
org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a store name should create a Materialized with Serdes and a store name STARTED
org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed should create a Consumed with Serdes PASSED
org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy STARTED
org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a store name should create a Materialized with Serdes and a store name PASSED
org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a window store supplier should create a Materialized with Serdes and a store supplier STARTED
org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a window store supplier should create a Materialized with Serdes and a store supplier PASSED
org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a key value store supplier should create a Materialized with Serdes and a store supplier STARTED
org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a key value store supplier should create a Materialized with Serdes and a store supplier PASSED
org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a session store supplier should create a Materialized with Serdes and a store supplier STARTED
org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a session store supplier should create a Materialized with Serdes and a store supplier PASSED
org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped should create a Grouped with Serdes STARTED
org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped should create a Grouped with Serdes PASSED
org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped with repartition topic name should create a Grouped with Serdes, and repartition topic name STARTED
org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped with repartition topic name should create a Grouped with Serdes, and repartition topic name PASSED
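The ConsumedTest, ProducedTest, JoinedTest, MaterializedTest, and GroupedTest cases above verify that the Scala wrapper classes build the corresponding DSL configuration objects from implicit Serdes. As a rough illustration of what those objects are, here is a sketch using the plain Java factory methods, which the Scala wrappers ultimately delegate to; the topic name and serde choices are arbitrary:

    import org.apache.kafka.common.serialization.Serdes
    import org.apache.kafka.streams.kstream.{Consumed, Grouped, Joined, Produced}

    // How a topic is read: key/value Serdes (plus optional timestamp extractor / reset policy).
    val consumed = Consumed.`with`(Serdes.String(), Serdes.Long())

    // How a topic is written.
    val produced = Produced.`with`(Serdes.String(), Serdes.Long())

    // How a grouping repartitions data, including an explicit repartition topic name.
    val grouped = Grouped.`with`("words-grouped", Serdes.String(), Serdes.Long())

    // Serdes for both sides of a join.
    val joined = Joined.`with`(Serdes.String(), Serdes.Long(), Serdes.Double())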
org.apache.kafka.streams.scala.kstream.KTableTest > filter a KTable should filter records satisfying the predicate STARTED
org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy PASSED
org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor should create a Consumed with Serdes and timestampExtractor STARTED
org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor should create a Consumed with Serdes and timestampExtractor PASSED
org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with resetPolicy should create a Consumed with Serdes and resetPolicy STARTED
org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with resetPolicy should create a Consumed with Serdes and resetPolicy PASSED
org.apache.kafka.streams.scala.kstream.KStreamTest > filter a KStream should filter records satisfying the predicate STARTED
org.apache.kafka.streams.scala.kstream.KStreamTest > filter a KStream should filter records satisfying the predicate PASSED
org.apache.kafka.streams.scala.kstream.KStreamTest > filterNot a KStream should filter records not satisfying the predicate STARTED
org.apache.kafka.streams.scala.kstream.KStreamTest > filterNot a KStream should filter records not satisfying the predicate PASSED
org.apache.kafka.streams.scala.kstream.KStreamTest > foreach a KStream should run foreach actions on records STARTED
org.apache.kafka.streams.scala.kstream.KStreamTest > foreach a KStream should run foreach actions on records PASSED
org.apache.kafka.streams.scala.kstream.KStreamTest > peek a KStream should run peek actions on records STARTED
org.apache.kafka.streams.scala.kstream.KStreamTest > peek a KStream should run peek actions on records PASSED
org.apache.kafka.streams.scala.kstream.KStreamTest > selectKey a KStream should select a new key STARTED
org.apache.kafka.streams.scala.kstream.KStreamTest > selectKey a KStream should select a new key PASSED
org.apache.kafka.streams.scala.kstream.KStreamTest > join 2 KStreams should join correctly records STARTED
org.apache.kafka.streams.scala.kstream.KTableTest > filter a KTable should filter records satisfying the predicate PASSED
org.apache.kafka.streams.scala.kstream.KTableTest > filterNot a KTable should filter records not satisfying the predicate STARTED
org.apache.kafka.streams.scala.kstream.KStreamTest > join 2 KStreams should join correctly records PASSED
org.apache.kafka.streams.scala.kstream.KStreamTest > transform a KStream should transform correctly records STARTED
org.apache.kafka.streams.scala.kstream.KStreamTest > transform a KStream should transform correctly records PASSED
org.apache.kafka.streams.scala.kstream.KStreamTest > flatTransform a KStream should flatTransform correctly records STARTED
org.apache.kafka.streams.scala.kstream.KStreamTest > flatTransform a KStream should flatTransform correctly records PASSED
org.apache.kafka.streams.scala.kstream.KStreamTest > flatTransformValues a KStream should correctly flatTransform values in records STARTED
org.apache.kafka.streams.scala.kstream.KStreamTest > flatTransformValues a KStream should correctly flatTransform values in records PASSED
org.apache.kafka.streams.scala.kstream.KStreamTest > flatTransformValues with key in a KStream should correctly flatTransformValues in records STARTED
org.apache.kafka.streams.scala.kstream.KStreamTest > flatTransformValues with key in a KStream should correctly flatTransformValues in records PASSED
org.apache.kafka.streams.scala.kstream.KStreamTest > join 2 KStreamToTables should join correctly records STARTED
org.apache.kafka.streams.scala.kstream.KTableTest > filterNot a KTable should filter records not satisfying the predicate PASSED
org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables should join correctly records STARTED
org.apache.kafka.streams.scala.kstream.KStreamTest > join 2 KStreamToTables should join correctly records PASSED
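The KStreamTest and KTableTest cases above cover the basic operators of the Scala DSL (filter, filterNot, foreach, peek, selectKey, transform, joins). A minimal sketch of that DSL in use, assuming the streams-scala imports on this branch (org.apache.kafka.streams.scala.Serdes moved to a serialization subpackage in later releases); the topic names are placeholders:

    import org.apache.kafka.streams.scala.ImplicitConversions._
    import org.apache.kafka.streams.scala.Serdes._
    import org.apache.kafka.streams.scala.StreamsBuilder

    val builder = new StreamsBuilder

    builder
      .stream[String, String]("input-topic")             // Consumed comes from the implicits above
      .filter((_, value) => value.nonEmpty)               // keep records with non-empty values
      .peek((key, value) => println(s"$key -> $value"))   // side effect without changing the stream
      .selectKey((_, value) => value)                     // re-key each record by its value
      .to("output-topic")                                 // Produced also comes from the implicits

    val topology = builder.build()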
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaJoin STARTED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaJoin PASSED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaCogroup STARTED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaCogroup PASSED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaSimple STARTED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaSimple PASSED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaCogroupSimple STARTED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaCogroupSimple PASSED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaAggregate STARTED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaAggregate PASSED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaProperties STARTED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaProperties PASSED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaTransform STARTED
org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaTransform PASSED
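The TopologyTest cases above check Java/Scala parity: the same pipeline built with both DSLs should describe an identical topology. A rough sketch of that kind of comparison, under the same import assumptions as the previous sketch; the topic names and the filter predicate are placeholders:

    import org.apache.kafka.common.serialization.{Serdes => SerdesJ}
    import org.apache.kafka.streams.kstream.{Consumed => ConsumedJ, Predicate, Produced => ProducedJ}
    import org.apache.kafka.streams.scala.ImplicitConversions._
    import org.apache.kafka.streams.scala.Serdes._
    import org.apache.kafka.streams.scala.StreamsBuilder
    import org.apache.kafka.streams.{StreamsBuilder => StreamsBuilderJ}

    // Scala DSL: Serdes are picked up implicitly.
    val scalaBuilder = new StreamsBuilder
    scalaBuilder
      .stream[String, String]("source-topic")
      .filter((_, value) => value.nonEmpty)
      .to("sink-topic")

    // Java DSL: the same source -> filter -> sink pipeline with explicit Serdes.
    val javaBuilder = new StreamsBuilderJ
    javaBuilder
      .stream("source-topic", ConsumedJ.`with`(SerdesJ.String(), SerdesJ.String()))
      .filter(new Predicate[String, String] {
        override def test(key: String, value: String): Boolean = value.nonEmpty
      })
      .to("sink-topic", ProducedJ.`with`(SerdesJ.String(), SerdesJ.String()))

    // Both builders should render the same topology description.
    assert(scalaBuilder.build().describe().toString == javaBuilder.build().describe().toString)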
org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsMaterialized STARTED
org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables should join correctly records PASSED
org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables with a Materialized should join correctly records and state store STARTED
org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables with a Materialized should join correctly records and state store PASSED
org.apache.kafka.streams.scala.kstream.KTableTest > windowed KTable#suppress should correctly suppress results using Suppressed.untilTimeLimit STARTED
FATAL: command execution failed
java.io.EOFException
    at java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2681)
    at java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:3156)
    at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:862)
    at java.io.ObjectInputStream.<init>(ObjectInputStream.java:358)
    at hudson.remoting.ObjectInputStreamEx.<init>(ObjectInputStreamEx.java:49)
    at hudson.remoting.Command.readFrom(Command.java:140)
    at hudson.remoting.Command.readFrom(Command.java:126)
    at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:36)
    at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:63)
Caused: java.io.IOException: Unexpected termination of the channel
    at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:77)
Caused: java.io.IOException: Backing channel 'H30' is disconnected.
    at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:214)
    at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:283)
    at com.sun.proxy.$Proxy141.isAlive(Unknown Source)
    at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1150)
    at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1142)
    at hudson.tasks.CommandInterpreter.join(CommandInterpreter.java:155)
    at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:109)
    at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:66)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:741)
    at hudson.model.Build$BuildExecution.build(Build.java:206)
    at hudson.model.Build$BuildExecution.doRun(Build.java:163)
    at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
    at hudson.model.Run.execute(Run.java:1815)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
    at hudson.model.ResourceController.execute(ResourceController.java:97)
    at hudson.model.Executor.run(Executor.java:429)
FATAL: Unable to delete script file /tmp/jenkins6264076022663478920.sh
java.io.EOFException
    at java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2681)
    at java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:3156)
    at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:862)
    at java.io.ObjectInputStream.<init>(ObjectInputStream.java:358)
    at hudson.remoting.ObjectInputStreamEx.<init>(ObjectInputStreamEx.java:49)
    at hudson.remoting.Command.readFrom(Command.java:140)
    at hudson.remoting.Command.readFrom(Command.java:126)
    at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:36)
    at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:63)
Caused: java.io.IOException: Unexpected termination of the channel
    at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:77)
Caused: hudson.remoting.ChannelClosedException: Channel "unknown": Remote call on H30 failed. The channel is closing down or has closed down
    at hudson.remoting.Channel.call(Channel.java:950)
    at hudson.FilePath.act(FilePath.java:1072)
    at hudson.FilePath.act(FilePath.java:1061)
    at hudson.FilePath.delete(FilePath.java:1542)
    at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:123)
    at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:66)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:741)
    at hudson.model.Build$BuildExecution.build(Build.java:206)
    at hudson.model.Build$BuildExecution.doRun(Build.java:163)
    at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
    at hudson.model.Run.execute(Run.java:1815)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
    at hudson.model.ResourceController.execute(ResourceController.java:97)
    at hudson.model.Executor.run(Executor.java:429)
Build step 'Execute shell' marked build as failure
ERROR: Step 'Publish JUnit test result report' failed: no workspace for kafka-trunk-jdk11 #1480
ERROR: H30 is offline; cannot locate JDK 11 (latest)
ERROR: No tool found matching GRADLE_4_10_2_HOME
ERROR: H30 is offline; cannot locate Gradle 4.10.3
Setting GRADLE_4_10_3_HOME=/home/jenkins/tools/gradle/4.10.3
Not sending mail to unregistered user [email protected]