See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/2181/display/redirect>
------------------------------------------
[...truncated 29.72 MB...]
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2472 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2500 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2509 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 31 (foreach at UnboundedDataset.java:79) with 4 output partitions
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: ResultStage 667 (foreach at UnboundedDataset.java:79)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List(ShuffleMapStage 658, ShuffleMapStage 662, ShuffleMapStage 651, ShuffleMapStage 666, ShuffleMapStage 660, ShuffleMapStage 637, ShuffleMapStage 664, ShuffleMapStage 653)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List(ShuffleMapStage 658)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 656 (MapPartitionsRDD[2472] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_127 stored as values in memory (estimated size 177.5 KB, free 13.4 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_127_piece0 stored as bytes in memory (estimated size 54.7 KB, free 13.4 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_127_piece0 in memory on localhost:33319 (size: 54.7 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 127 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ShuffleMapStage 656 (MapPartitionsRDD[2472] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 656.0 with 4 tasks
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 656.0 (TID 570, localhost, executor driver, partition 0, PROCESS_LOCAL, 8127 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 656.0 (TID 571, localhost, executor driver, partition 1, PROCESS_LOCAL, 8127 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 656.0 (TID 572, localhost, executor driver, partition 2, PROCESS_LOCAL, 8127 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 656.0 (TID 573, localhost, executor driver, partition 3, PROCESS_LOCAL, 8127 bytes)
[Executor task launch worker for task 571] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 656.0 (TID 571)
[Executor task launch worker for task 572] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 656.0 (TID 572)
[Executor task launch worker for task 570] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 656.0 (TID 570)
[Executor task launch worker for task 573] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 656.0 (TID 573)
[Executor task launch worker for task 571] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_1 locally
[Executor task launch worker for task 573] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_3 locally
[Executor task launch worker for task 572] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_2 locally
[Executor task launch worker for task 570] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_0 locally
[Executor task launch worker for task 571] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 656.0 (TID 571). 59509 bytes result sent to driver
[Executor task launch worker for task 573] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 656.0 (TID 573). 59509 bytes result sent to driver
[Executor task launch worker for task 572] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 656.0 (TID 572). 59509 bytes result sent to driver
[Executor task launch worker for task 570] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 656.0 (TID 570). 59509 bytes result sent to driver
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 656.0 (TID 571) in 13 ms on localhost (executor driver) (1/4)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 656.0 (TID 573) in 13 ms on localhost (executor driver) (2/4)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 656.0 (TID 572) in 13 ms on localhost (executor driver) (3/4)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 656.0 (TID 570) in 14 ms on localhost (executor driver) (4/4)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 656.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 656 (mapToPair at GroupCombineFunctions.java:57) finished in 0.022 s
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 658, ResultStage 667, ShuffleMapStage 657)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 657 (MapPartitionsRDD[2500] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_128 stored as values in memory (estimated size 211.7 KB, free 13.4 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_128_piece0 stored as bytes in memory (estimated size 63.3 KB, free 13.4 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_128_piece0 in memory on localhost:33319 (size: 63.3 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 128 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 657 (MapPartitionsRDD[2500] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 657.0 with 5 tasks
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 657.0 (TID 574, localhost, executor driver, partition 0, PROCESS_LOCAL, 8376 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 657.0 (TID 575, localhost, executor driver, partition 1, PROCESS_LOCAL, 8376 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 657.0 (TID 576, localhost, executor driver, partition 2, PROCESS_LOCAL, 8376 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 657.0 (TID 577, localhost, executor driver, partition 3, PROCESS_LOCAL, 8376 bytes)
[Executor task launch worker for task 577] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 657.0 (TID 577)
[Executor task launch worker for task 575] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 657.0 (TID 575)
[Executor task launch worker for task 574] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 657.0 (TID 574)
[Executor task launch worker for task 576] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 657.0 (TID 576)
[Executor task launch worker for task 577] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 576] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 577] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 576] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 576] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_2 locally
[Executor task launch worker for task 577] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_3 locally
[Executor task launch worker for task 577] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_3 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
[Executor task launch worker for task 576] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_2 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_3 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_2 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 575] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 575] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 574] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 574] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 574] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_0 locally
[Executor task launch worker for task 575] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_1 locally
[Executor task launch worker for task 575] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_1 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
[Executor task launch worker for task 574] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_0 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_1 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_0 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 577] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 657.0 (TID 577). 59940 bytes result sent to driver
[Executor task launch worker for task 576] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 657.0 (TID 576). 59940 bytes result sent to driver
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 657.0 (TID 578, localhost, executor driver, partition 4, PROCESS_LOCAL, 7968 bytes)
[Executor task launch worker for task 578] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 657.0 (TID 578)
[Executor task launch worker for task 574] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 657.0 (TID 574). 59940 bytes result sent to driver
[Executor task launch worker for task 575] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 657.0 (TID 575). 59940 bytes result sent to driver
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 657.0 (TID 577) in 16 ms on localhost (executor driver) (1/5)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 657.0 (TID 576) in 16 ms on localhost (executor driver) (2/5)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 657.0 (TID 574) in 17 ms on localhost (executor driver) (3/5)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 657.0 (TID 575) in 18 ms on localhost (executor driver) (4/5)
[Executor task launch worker for task 578] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 657.0 (TID 578). 59467 bytes result sent to driver
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 657.0 (TID 578) in 16 ms on localhost (executor driver) (5/5)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 657.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 657 (mapToPair at GroupCombineFunctions.java:57) finished in 0.039 s
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 658, ResultStage 667)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 658 (MapPartitionsRDD[2509] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_129 stored as values in memory (estimated size 213.0 KB, free 13.4 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_129_piece0 stored as bytes in memory (estimated size 63.2 KB, free 13.4 GB)
[dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_129_piece0 in memory on localhost:33319 (size: 63.2 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 129 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 658 (MapPartitionsRDD[2509] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 658.0 with 5 tasks
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 658.0 (TID 579, localhost, executor driver, partition 0, PROCESS_LOCAL, 7638 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 658.0 (TID 580, localhost, executor driver, partition 1, PROCESS_LOCAL, 7638 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 658.0 (TID 581, localhost, executor driver, partition 2, PROCESS_LOCAL, 7638 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 658.0 (TID 582, localhost, executor driver, partition 3, PROCESS_LOCAL, 7638 bytes)
[Executor task launch worker for task 579] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 658.0 (TID 579)
[Executor task launch worker for task 580] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 658.0 (TID 580)
[Executor task launch worker for task 581] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 658.0 (TID 581)
[Executor task launch worker for task 582] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 658.0 (TID 582)
[Executor task launch worker for task 579] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 582] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 579] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 582] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 580] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 581] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 580] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 581] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 582] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 658.0 (TID 582). 59896 bytes result sent to driver
[Executor task launch worker for task 579] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 658.0 (TID 579). 59896 bytes result sent to driver
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 658.0 (TID 583, localhost, executor driver, partition 4, PROCESS_LOCAL, 7638 bytes)
[Executor task launch worker for task 583] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 658.0 (TID 583)
[Executor task launch worker for task 580] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 658.0 (TID 580). 59896 bytes result sent to driver
[Executor task launch worker for task 581] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 658.0 (TID 581). 59896 bytes result sent to driver
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 658.0 (TID 579) in 14 ms on localhost (executor driver) (1/5)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 658.0 (TID 582) in 14 ms on localhost (executor driver) (2/5)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 658.0 (TID 580) in 15 ms on localhost (executor driver) (3/5)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 658.0 (TID 581) in 15 ms on localhost (executor driver) (4/5)
[Executor task launch worker for task 583] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 583] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 583] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 658.0 (TID 583). 59853 bytes result sent to driver
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 658.0 (TID 583) in 13 ms on localhost (executor driver) (5/5)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 658.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 658 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) finished in 0.034 s
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage 667)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage 667 (MapPartitionsRDD[2529] at map at TranslationUtils.java:128), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_130 stored as values in memory (estimated size 187.9 KB, free 13.4 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_130_piece0 stored as bytes in memory (estimated size 57.8 KB, free 13.4 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_130_piece0 in memory on localhost:33319 (size: 57.8 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 130 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ResultStage 667 (MapPartitionsRDD[2529] at map at TranslationUtils.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 667.0 with 4 tasks
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 667.0 (TID 584, localhost, executor driver, partition 0, PROCESS_LOCAL, 8094 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 667.0 (TID 585, localhost, executor driver, partition 1, PROCESS_LOCAL, 8094 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 667.0 (TID 586, localhost, executor driver, partition 2, PROCESS_LOCAL, 8094 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 667.0 (TID 587, localhost, executor driver, partition 3, PROCESS_LOCAL, 8094 bytes)
[Executor task launch worker for task 586] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 667.0 (TID 586)
[Executor task launch worker for task 585] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 667.0 (TID 585)
[Executor task launch worker for task 587] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 667.0 (TID 587)
[Executor task launch worker for task 584] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 667.0 (TID 584)
[Executor task launch worker for task 587] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 584] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 584] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 587] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 587] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_3 locally
[Executor task launch worker for task 584] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_0 locally
[Executor task launch worker for task 584] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_0 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
[Executor task launch worker for task 587] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_3 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
[Executor task launch worker for task 586] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 585] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 586] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 585] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_3 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 586] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_2 locally
[Executor task launch worker for task 585] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_1 locally
[dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_0 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 585] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_1 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
[Executor task launch worker for task 586] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_2 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_2 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_1 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 587] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 667.0 (TID 587). 59881 bytes result sent to driver
[Executor task launch worker for task 584] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 667.0 (TID 584). 59881 bytes result sent to driver
[Executor task launch worker for task 586] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 667.0 (TID 586). 59881 bytes result sent to driver
[Executor task launch worker for task 585] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 667.0 (TID 585). 59881 bytes result sent to driver
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 667.0 (TID 584) in 14 ms on localhost (executor driver) (1/4)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 667.0 (TID 587) in 14 ms on localhost (executor driver) (2/4)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 667.0 (TID 585) in 15 ms on localhost (executor driver) (3/4)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 667.0 (TID 586) in 15 ms on localhost (executor driver) (4/4)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 667.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 667 (foreach at UnboundedDataset.java:79) finished in 0.024 s
[streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 31 finished: foreach at UnboundedDataset.java:79, took 0.127969 s
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1542067610500 ms.3 from job set of time 1542067610500 ms
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Total delay: 7.188 s for time 1542067610500 ms (execution: 0.540 s)
[Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@3fd82786{/streaming,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@57c784fc{/streaming/batch,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@5f05e8ae{/static/streaming,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped successfully
[Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@1a70216f{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
[Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
[dispatcher-event-loop-3] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
[Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
[Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
[Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
[Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
Gradle Test Executor 289 finished executing tests.
> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-cd34cceb-8cf0-4168-afcb-d53a903a9e82

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest > testUnboundedSourceMetrics STANDARD_ERROR
[Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@15cde176{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
[Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4041
[dispatcher-event-loop-2] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
[Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
[Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
[Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
[Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
Gradle Test Executor 293 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-2c9064c1-82f3-4ab4-bdd0-0fb09fa77c8c
Finished generating test XML results (0.165 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.196 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
:beam-runners-spark:validatesRunnerStreaming (Thread[Daemon worker,5,main]) completed. Took 10 mins 28.155 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at:
> file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>

* Try:
Run with --stacktrace option to get the stack trace.
Run with --debug option to get more log output.
Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 5s
43 actionable tasks: 39 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/q5zum2na76mmy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org