See <https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/3/display/redirect>

------------------------------------------
[...truncated 78.84 KB...]
        at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
        at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalStateException: Not a valid TFRecord. Fewer than 12 bytes.
        at org.apache.beam.sdk.repackaged.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
        at org.apache.beam.sdk.io.TFRecordIO$TFRecordCodec.read(TFRecordIO.java:631)
        at org.apache.beam.sdk.io.TFRecordIO$TFRecordSource$TFRecordReader.readNextRecord(TFRecordIO.java:526)
        at org.apache.beam.sdk.io.CompressedSource$CompressedReader.readNextRecord(CompressedSource.java:426)
        at org.apache.beam.sdk.io.FileBasedSource$FileBasedReader.advanceImpl(FileBasedSource.java:473)
        at org.apache.beam.sdk.io.OffsetBasedSource$OffsetBasedReader.advance(OffsetBasedSource.java:267)
        at com.google.cloud.dataflow.worker.WorkerCustomSources$BoundedReaderIterator.advance(WorkerCustomSources.java:602)
        ... 14 more
java.io.IOException: Failed to advance reader of source: hdfs://146.148.62.9:9000/TEXTIO_IT__1521892046445-00001-of-00003.tfrecord range [0, 19333314)
        at com.google.cloud.dataflow.worker.WorkerCustomSources$BoundedReaderIterator.advance(WorkerCustomSources.java:605)
        at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation$SynchronizedReaderIterator.advance(ReadOperation.java:379)
        at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:185)
        at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:150)
        at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:74)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:381)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:353)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
        at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
        at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
        at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalStateException: Invalid data
        at org.apache.beam.sdk.repackaged.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
        at org.apache.beam.sdk.io.TFRecordIO$TFRecordCodec.read(TFRecordIO.java:642)
        at org.apache.beam.sdk.io.TFRecordIO$TFRecordSource$TFRecordReader.readNextRecord(TFRecordIO.java:526)
        at org.apache.beam.sdk.io.CompressedSource$CompressedReader.readNextRecord(CompressedSource.java:426)
        at org.apache.beam.sdk.io.FileBasedSource$FileBasedReader.advanceImpl(FileBasedSource.java:473)
        at org.apache.beam.sdk.io.OffsetBasedSource$OffsetBasedReader.advance(OffsetBasedSource.java:267)
        at com.google.cloud.dataflow.worker.WorkerCustomSources$BoundedReaderIterator.advance(WorkerCustomSources.java:602)
        ... 14 more
java.io.IOException: Failed to advance reader of source: hdfs://146.148.62.9:9000/TEXTIO_IT__1521892046445-00001-of-00003.tfrecord range [0, 19333314)
        at com.google.cloud.dataflow.worker.WorkerCustomSources$BoundedReaderIterator.advance(WorkerCustomSources.java:605)
        at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation$SynchronizedReaderIterator.advance(ReadOperation.java:379)
        at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:185)
        at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:150)
        at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:74)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:381)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:353)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
        at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
        at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
        at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalStateException: Invalid data
        at org.apache.beam.sdk.repackaged.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
        at org.apache.beam.sdk.io.TFRecordIO$TFRecordCodec.read(TFRecordIO.java:642)
        at org.apache.beam.sdk.io.TFRecordIO$TFRecordSource$TFRecordReader.readNextRecord(TFRecordIO.java:526)
        at org.apache.beam.sdk.io.CompressedSource$CompressedReader.readNextRecord(CompressedSource.java:426)
        at org.apache.beam.sdk.io.FileBasedSource$FileBasedReader.advanceImpl(FileBasedSource.java:473)
        at org.apache.beam.sdk.io.OffsetBasedSource$OffsetBasedReader.advance(OffsetBasedSource.java:267)
        at com.google.cloud.dataflow.worker.WorkerCustomSources$BoundedReaderIterator.advance(WorkerCustomSources.java:602)
        ... 14 more
java.io.IOException: Failed to advance reader of source: hdfs://146.148.62.9:9000/TEXTIO_IT__1521892046445-00000-of-00003.tfrecord range [0, 19222204)
        at com.google.cloud.dataflow.worker.WorkerCustomSources$BoundedReaderIterator.advance(WorkerCustomSources.java:605)
        at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation$SynchronizedReaderIterator.advance(ReadOperation.java:379)
        at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:185)
        at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:150)
        at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:74)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:381)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:353)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
        at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
        at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
        at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalStateException: Not a valid TFRecord. Fewer than 12 bytes.
        at org.apache.beam.sdk.repackaged.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
        at org.apache.beam.sdk.io.TFRecordIO$TFRecordCodec.read(TFRecordIO.java:631)
        at org.apache.beam.sdk.io.TFRecordIO$TFRecordSource$TFRecordReader.readNextRecord(TFRecordIO.java:526)
        at org.apache.beam.sdk.io.CompressedSource$CompressedReader.readNextRecord(CompressedSource.java:426)
        at org.apache.beam.sdk.io.FileBasedSource$FileBasedReader.advanceImpl(FileBasedSource.java:473)
        at org.apache.beam.sdk.io.OffsetBasedSource$OffsetBasedReader.advance(OffsetBasedSource.java:267)
        at com.google.cloud.dataflow.worker.WorkerCustomSources$BoundedReaderIterator.advance(WorkerCustomSources.java:602)
        ... 14 more
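
The TFRecordCodec failures above come down to the standard TFRecord framing: each record starts with a 12-byte header, an 8-byte little-endian payload length plus a 4-byte masked CRC of that length, and is followed by a 4-byte masked CRC of the payload. "Not a valid TFRecord. Fewer than 12 bytes." (TFRecordIO.java:631) means the reader reached end of stream before a complete header was available; "Invalid data" (TFRecordIO.java:642) is a later consistency check in the same codec. A minimal sketch of the header layout, assuming only the public TFRecord format and a placeholder local file path (this is not Beam's implementation):

    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    // Hypothetical standalone check of the TFRecord header layout (not Beam's code).
    // Record layout: uint64 payload length (little-endian) | uint32 masked CRC of the
    // length | payload bytes | uint32 masked CRC of the payload, so at least 12 header
    // bytes must be readable before a record can be decoded.
    public class TFRecordHeaderSketch {
      public static void main(String[] args) throws Exception {
        byte[] bytes = Files.readAllBytes(Paths.get(args[0])); // e.g. a local *.tfrecord shard
        if (bytes.length < 12) {
          throw new IllegalStateException("Not a valid TFRecord. Fewer than 12 bytes.");
        }
        ByteBuffer header = ByteBuffer.wrap(bytes, 0, 12).order(ByteOrder.LITTLE_ENDIAN);
        long payloadLength = header.getLong();  // payload length of the first record
        int maskedLengthCrc = header.getInt();  // masked CRC guarding the length field
        System.out.printf("first record: payload=%d bytes, masked length CRC=0x%08x%n",
            payloadLength, maskedLengthCrc);
      }
    }
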
Workflow failed. Causes: S03:TFRecordIO.Read/Read+Transform bytes to strings/Map+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on:
  tfrecordioit0writethenrea-03240451-eb64-harness-3ctb,
  tfrecordioit0writethenrea-03240451-eb64-harness-h06g,
  tfrecordioit0writethenrea-03240451-eb64-harness-48v0,
  tfrecordioit0writethenrea-03240451-eb64-harness-3ctb
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
        at org.apache.beam.sdk.io.tfrecord.TFRecordIOIT.writeThenReadAll(TFRecordIOIT.java:127)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
        at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
        at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
        at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
        at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
        at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
        at org.junit.rules.RunRules.evaluate(RunRules.java:20)
        at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
        at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR]   TFRecordIOIT.writeThenReadAll:127  Runtime java.io.IOException: Failed to adv...
[INFO] 
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ beam-sdks-java-io-file-based-io-tests ---
[INFO] No dependency problems found
[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:verify (default) @ beam-sdks-java-io-file-based-io-tests ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/runs/9fb5402c/beam/sdks/java/io/file-based-io-tests/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 07:27 min
[INFO] Finished at: 2018-03-24T11:54:34Z
[INFO] Final Memory: 91M/882M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-file-based-io-tests: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/runs/9fb5402c/beam/sdks/java/io/file-based-io-tests/target/failsafe-reports> for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-file-based-io-tests: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/runs/9fb5402c/beam/sdks/java/io/file-based-io-tests/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/runs/9fb5402c/beam/sdks/java/io/file-based-io-tests/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.plugin.surefire.SurefireHelper.throwException (SurefireHelper.java:240)
    at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution (SurefireHelper.java:112)
    at org.apache.maven.plugin.failsafe.VerifyMojo.execute (VerifyMojo.java:192)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

STDERR: 
2018-03-24 11:54:34,402 9fb5402c MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 624, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-03-24 11:54:34,403 9fb5402c MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-03-24 11:54:34,404 9fb5402c MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/config-filebasedioithdfs-1521890089911> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-24 11:54:34,858 9fb5402c MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/config-filebasedioithdfs-1521890089911> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
2018-03-24 11:54:35,013 9fb5402c MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 758, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 624, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-03-24 11:54:35,014 9fb5402c MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-03-24 11:54:35,014 9fb5402c MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-03-24 11:54:35,014 9fb5402c MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/runs/9fb5402c/pkb.log>
2018-03-24 11:54:35,014 9fb5402c MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TFRecordIOIT_HDFS/ws/runs/9fb5402c/completion_statuses.json>
Build step 'Execute shell' marked build as failure
