See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/669/display/redirect?page=changes>
Changes:

[szewinho] [BEAM-3214] Add integration test for HBaseIO.

------------------------------------------
[...truncated 26.96 MB...]
[Executor task launch worker for task 443] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 547.0 (TID 443)
[Executor task launch worker for task 442] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 547.0 (TID 442)
[Executor task launch worker for task 444] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 547.0 (TID 444)
[Executor task launch worker for task 441] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 443] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 444] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 443] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 441] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 444] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 441] INFO org.apache.spark.storage.BlockManager - Found block rdd_2441_0 locally
[Executor task launch worker for task 443] INFO org.apache.spark.storage.BlockManager - Found block rdd_2441_2 locally
[Executor task launch worker for task 444] INFO org.apache.spark.storage.BlockManager - Found block rdd_2441_3 locally
[Executor task launch worker for task 444] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2756_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 443] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2756_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_2 in memory on localhost:43745 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 441] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2756_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_3 in memory on localhost:43745 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_0 in memory on localhost:43745 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 442] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 442] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 442] INFO org.apache.spark.storage.BlockManager - Found block rdd_2441_1 locally
[Executor task launch worker for task 442] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2756_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_1 in memory on localhost:43745 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 441] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 547.0 (TID 441). 59881 bytes result sent to driver
[Executor task launch worker for task 444] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 547.0 (TID 444). 59881 bytes result sent to driver
[Executor task launch worker for task 443] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 547.0 (TID 443). 59881 bytes result sent to driver
[Executor task launch worker for task 442] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 547.0 (TID 442). 59881 bytes result sent to driver
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 547.0 (TID 444) in 16 ms on localhost (executor driver) (1/4)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 547.0 (TID 441) in 17 ms on localhost (executor driver) (2/4)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 547.0 (TID 443) in 17 ms on localhost (executor driver) (3/4)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 547.0 (TID 442) in 18 ms on localhost (executor driver) (4/4)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 547.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 547 (foreach at UnboundedDataset.java:81) finished in 0.032 s
[streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 34 finished: foreach at UnboundedDataset.java:81, took 0.126339 s
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1528316053500 ms.2 from job set of time 1528316053500 ms
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Starting job streaming job 1528316053500 ms.3 from job set of time 1528316053500 ms
[streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting job: foreach at UnboundedDataset.java:81
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2787 (mapToPair at GroupCombineFunctions.java:59)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2815 (mapToPair at GroupCombineFunctions.java:59)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 35 (foreach at UnboundedDataset.java:81) with 4 output partitions
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: ResultStage 575 (foreach at UnboundedDataset.java:81)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List(ShuffleMapStage 574, ShuffleMapStage 571, ShuffleMapStage 560, ShuffleMapStage 561, ShuffleMapStage 554, ShuffleMapStage 569, ShuffleMapStage 566, ShuffleMapStage 570, ShuffleMapStage 559)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List(ShuffleMapStage 574, ShuffleMapStage 569)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 568 (MapPartitionsRDD[2472] at mapToPair at GroupCombineFunctions.java:59), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_103 stored as values in memory (estimated size 137.2 KB, free 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_103_piece0 stored as bytes in memory (estimated size 31.0 KB, free 13.5 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_103_piece0 in memory on localhost:43745 (size: 31.0 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 103 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 2 missing tasks from ShuffleMapStage 568 (MapPartitionsRDD[2472] at mapToPair at GroupCombineFunctions.java:59) (first 15 tasks are for partitions Vector(0, 2))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 568.0 with 2 tasks
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 568.0 (TID 445, localhost, executor driver, partition 0, PROCESS_LOCAL, 8237 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 568.0 (TID 446, localhost, executor driver, partition 2, PROCESS_LOCAL, 8237 bytes)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 573 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:59), which has no missing parents
[Executor task launch worker for task 445] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 568.0 (TID 445)
[Executor task launch worker for task 446] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 568.0 (TID 446)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_104 stored as values in memory (estimated size 139.5 KB, free 13.5 GB)
[Executor task launch worker for task 446] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_2 locally
[Executor task launch worker for task 445] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_0 locally
[Executor task launch worker for task 445] ERROR org.apache.spark.executor.Executor - Exception in task 0.0 in stage 568.0 (TID 445)
[Executor task launch worker for task 446] ERROR org.apache.spark.executor.Executor - Exception in task 1.0 in stage 568.0 (TID 446)
java.io.FileNotFoundException: /tmp/blockmgr-da76bc6f-bdc0-4167-88b2-d5707bc88ece/26/shuffle_71_0_0.index.1bac79d0-7a96-45f9-8fc6-15bfca889676 (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:162)
	at org.apache.spark.shuffle.IndexShuffleBlockResolver.writeIndexFileAndCommit(IndexShuffleBlockResolver.scala:144)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:127)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
java.io.FileNotFoundException: /tmp/blockmgr-da76bc6f-bdc0-4167-88b2-d5707bc88ece/18/shuffle_71_2_0.index.1c1f4de3-a7f4-4761-980e-bca9a3121610 (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:162)
	at org.apache.spark.shuffle.IndexShuffleBlockResolver.writeIndexFileAndCommit(IndexShuffleBlockResolver.scala:144)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:127)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_104_piece0 stored as bytes in memory (estimated size 31.6 KB, free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_104_piece0 in memory on localhost:43745 (size: 31.6 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 104 from broadcast at DAGScheduler.scala:1039
[task-result-getter-0] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 0.0 in stage 568.0 (TID 445, localhost, executor driver): java.io.FileNotFoundException: /tmp/blockmgr-da76bc6f-bdc0-4167-88b2-d5707bc88ece/26/shuffle_71_0_0.index.1bac79d0-7a96-45f9-8fc6-15bfca889676 (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:162)
	at org.apache.spark.shuffle.IndexShuffleBlockResolver.writeIndexFileAndCommit(IndexShuffleBlockResolver.scala:144)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:127)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[task-result-getter-0] ERROR org.apache.spark.scheduler.TaskSetManager - Task 0 in stage 568.0 failed 1 times; aborting job
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 568.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ShuffleMapStage 573 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:59) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 573.0 with 4 tasks
[task-result-getter-2] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 1.0 in stage 568.0 (TID 446, localhost, executor driver): java.io.FileNotFoundException: /tmp/blockmgr-da76bc6f-bdc0-4167-88b2-d5707bc88ece/18/shuffle_71_2_0.index.1c1f4de3-a7f4-4761-980e-bca9a3121610 (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:162)
	at org.apache.spark.shuffle.IndexShuffleBlockResolver.writeIndexFileAndCommit(IndexShuffleBlockResolver.scala:144)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:127)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 568.0, whose tasks have all completed, from pool
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 573.0 (TID 447, localhost, executor driver, partition 0, PROCESS_LOCAL, 8297 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 573.0 (TID 448, localhost, executor driver, partition 1, PROCESS_LOCAL, 8297 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 573.0 (TID 449, localhost, executor driver, partition 2, PROCESS_LOCAL, 8297 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 573.0 (TID 450, localhost, executor driver, partition 3, PROCESS_LOCAL, 8297 bytes)
[Executor task launch worker for task 447] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 573.0 (TID 447)
[Executor task launch worker for task 448] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 573.0 (TID 448)
[Executor task launch worker for task 449] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 573.0 (TID 449)
[Executor task launch worker for task 450] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 573.0 (TID 450)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Cancelling stage 573
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Stage 573 was cancelled
[dispatcher-event-loop-3] INFO org.apache.spark.executor.Executor - Executor is trying to kill task 1.0 in stage 573.0 (TID 448), reason: Stage cancelled
[dispatcher-event-loop-3] INFO org.apache.spark.executor.Executor - Executor is trying to kill task 2.0 in stage 573.0 (TID 449), reason: Stage cancelled
[dispatcher-event-loop-3] INFO org.apache.spark.executor.Executor - Executor is trying to kill task 3.0 in stage 573.0 (TID 450), reason: Stage cancelled
[dispatcher-event-loop-3] INFO org.apache.spark.executor.Executor - Executor is trying to kill task 0.0 in stage 573.0 (TID 447), reason: Stage cancelled
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 573 (mapToPair at GroupCombineFunctions.java:59) failed in 0.009 s due to Job aborted due to stage failure: Task 0 in stage 568.0 failed 1 times, most recent failure: Lost task 0.0 in stage 568.0 (TID 445, localhost, executor driver): java.io.FileNotFoundException: /tmp/blockmgr-da76bc6f-bdc0-4167-88b2-d5707bc88ece/26/shuffle_71_0_0.index.1bac79d0-7a96-45f9-8fc6-15bfca889676 (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:162)
	at org.apache.spark.shuffle.IndexShuffleBlockResolver.writeIndexFileAndCommit(IndexShuffleBlockResolver.scala:144)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:127)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Cancelling stage 568
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 568 (mapToPair at GroupCombineFunctions.java:59) failed in 0.020 s due to Job aborted due to stage failure: Task 0 in stage 568.0 failed 1 times, most recent failure: Lost task 0.0 in stage 568.0 (TID 445, localhost, executor driver): java.io.FileNotFoundException: /tmp/blockmgr-da76bc6f-bdc0-4167-88b2-d5707bc88ece/26/shuffle_71_0_0.index.1bac79d0-7a96-45f9-8fc6-15bfca889676 (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:162)
	at org.apache.spark.shuffle.IndexShuffleBlockResolver.writeIndexFileAndCommit(IndexShuffleBlockResolver.scala:144)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:127)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
[Executor task launch worker for task 450] INFO org.apache.spark.executor.Executor - Executor killed task 3.0 in stage 573.0 (TID 450), reason: Stage cancelled
[task-result-getter-1] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 3.0 in stage 573.0 (TID 450, localhost, executor driver): TaskKilled (Stage cancelled)
[streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 35 failed: foreach at UnboundedDataset.java:81, took 0.027860 s
[Executor task launch worker for task 447] INFO org.apache.spark.executor.Executor - Executor killed task 0.0 in stage 573.0 (TID 447), reason: Stage cancelled
[task-result-getter-3] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 0.0 in stage 573.0 (TID 447, localhost, executor driver): TaskKilled (Stage cancelled)
[Executor task launch worker for task 448] INFO org.apache.spark.executor.Executor - Executor killed task 1.0 in stage 573.0 (TID 448), reason: Stage cancelled
[Executor task launch worker for task 449] INFO org.apache.spark.executor.Executor - Executor killed task 2.0 in stage 573.0 (TID 449), reason: Stage cancelled
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1528316053500 ms.3 from job set of time 1528316053500 ms
[task-result-getter-0] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 1.0 in stage 573.0 (TID 448, localhost, executor driver): TaskKilled (Stage cancelled)
[task-result-getter-2] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 2.0 in stage 573.0 (TID 449, localhost, executor driver): TaskKilled (Stage cancelled)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 573.0, whose tasks have all completed, from pool
[Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@6e43492e{/streaming,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@1635feb5{/streaming/batch,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@6ecbd2ed{/static/streaming,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped successfully
[Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@47ad551e{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
[Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
[dispatcher-event-loop-3] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
[Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
[Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
[Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
[Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
Gradle Test Executor 277 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming FAILED
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-7db6ebf4-dacb-41af-99bd-0a93a8f65338

org.apache.beam.runners.spark.translation.streaming.CreateStreamTest > testDiscardingMode FAILED
    org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.FileNotFoundException: /tmp/blockmgr-da76bc6f-bdc0-4167-88b2-d5707bc88ece/16/shuffle_75_0_0.index.49d58328-05f9-4659-b8ac-64285d95d806 (No such file or directory)
        at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:68)
        at org.apache.beam.runners.spark.SparkPipelineResult.access$000(SparkPipelineResult.java:41)
        at org.apache.beam.runners.spark.SparkPipelineResult$StreamingMode.stop(SparkPipelineResult.java:163)
        at org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:125)
        at org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:348)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:329)
        at org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testDiscardingMode(CreateStreamTest.java:203)

        Caused by: java.io.FileNotFoundException: /tmp/blockmgr-da76bc6f-bdc0-4167-88b2-d5707bc88ece/16/shuffle_75_0_0.index.49d58328-05f9-4659-b8ac-64285d95d806 (No such file or directory)

13 tests completed, 1 failed
Finished generating test XML results (0.094 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.091 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 1 mins 18.854 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerStreaming'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 11s
39 actionable tasks: 35 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/5bkyxlz6vlqka

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
Not sending mail to unregistered user szewi...@gmail.com