See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/2183/display/redirect?page=changes>
Changes: [github] Clarify in docstrings that we expect TFRecord values to be bytes
------------------------------------------
[...truncated 29.36 MB...]
[streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting job: foreach at UnboundedDataset.java:79
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2787 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2815 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2824 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 35 (foreach at UnboundedDataset.java:79) with 4 output partitions
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: ResultStage 839 (foreach at UnboundedDataset.java:79)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List(ShuffleMapStage 828, ShuffleMapStage 832, ShuffleMapStage 826, ShuffleMapStage 836, ShuffleMapStage 830, ShuffleMapStage 834, ShuffleMapStage 838, ShuffleMapStage 821, ShuffleMapStage 810)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List(ShuffleMapStage 826)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 824 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_143 stored as values in memory (estimated size 177.8 KB, free 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_143_piece0 stored as bytes in memory (estimated size 54.7 KB, free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_143_piece0 in memory on localhost:36139 (size: 54.7 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 143 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ShuffleMapStage 824 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 824.0 with 4 tasks
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 824.0 (TID 642, localhost, executor driver, partition 0, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 824.0 (TID 643, localhost, executor driver, partition 1, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 824.0 (TID 644, localhost, executor driver, partition 2, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 824.0 (TID 645, localhost, executor driver, partition 3, PROCESS_LOCAL, 8165 bytes)
[Executor task launch worker for task 642] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 824.0 (TID 642)
[Executor task launch worker for task 644] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 824.0 (TID 644)
[Executor task launch worker for task 643] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 824.0 (TID 643)
[Executor task launch worker for task 645] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 824.0 (TID 645)
[Executor task launch worker for task 643] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_1 locally
[Executor task launch worker for task 645] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_3 locally
[Executor task launch worker for task 642] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_0 locally
[Executor task launch worker for task 644] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_2 locally
[Executor task launch worker for task 645] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 824.0 (TID 645). 59509 bytes result sent to driver
[Executor task launch worker for task 643] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 824.0 (TID 643). 59509 bytes result sent to driver
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 824.0 (TID 645) in 11 ms on localhost (executor driver) (1/4)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 824.0 (TID 643) in 11 ms on localhost (executor driver) (2/4)
[Executor task launch worker for task 644] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 824.0 (TID 644). 59509 bytes result sent to driver
[Executor task launch worker for task 642] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 824.0 (TID 642). 59509 bytes result sent to driver
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 824.0 (TID 644) in 15 ms on localhost (executor driver) (3/4)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 824.0 (TID 642) in 15 ms on localhost (executor driver) (4/4)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 824.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 824 (mapToPair at GroupCombineFunctions.java:57) finished in 0.022 s
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 825, ShuffleMapStage 826, ResultStage 839)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 825 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_144 stored as values in memory (estimated size 216.3 KB, free 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_144_piece0 stored as bytes in memory (estimated size 64.1 KB, free 13.5 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_144_piece0 in memory on localhost:36139 (size: 64.1 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 144 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 825 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 825.0 with 5 tasks
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 825.0 (TID 646, localhost, executor driver, partition 0, PROCESS_LOCAL, 8436 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 825.0 (TID 647, localhost, executor driver, partition 1, PROCESS_LOCAL, 8436 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 825.0 (TID 648, localhost, executor driver, partition 2, PROCESS_LOCAL, 8436 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 825.0 (TID 649, localhost, executor driver, partition 3, PROCESS_LOCAL, 8436 bytes)
[Executor task launch worker for task 646] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 825.0 (TID 646)
[Executor task launch worker for task 648] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 825.0 (TID 648)
[Executor task launch worker for task 647] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 825.0 (TID 647)
[Executor task launch worker for task 649] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 825.0 (TID 649)
[Executor task launch worker for task 648] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 648] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 648] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_2 locally
[Executor task launch worker for task 649] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 649] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 649] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_3 locally
[Executor task launch worker for task 646] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 646] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 646] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_0 locally
[Executor task launch worker for task 648] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 649] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_2 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 646] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_3 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_0 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 647] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 647] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 647] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_1 locally
[Executor task launch worker for task 647] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_1 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 649] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 825.0 (TID 649). 59940 bytes result sent to driver
[Executor task launch worker for task 646] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 825.0 (TID 646). 59940 bytes result sent to driver
[Executor task launch worker for task 648] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 825.0 (TID 648). 59940 bytes result sent to driver
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 825.0 (TID 650, localhost, executor driver, partition 4, PROCESS_LOCAL, 7968 bytes)
[Executor task launch worker for task 650] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 825.0 (TID 650)
[Executor task launch worker for task 647] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 825.0 (TID 647). 59940 bytes result sent to driver
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 825.0 (TID 649) in 13 ms on localhost (executor driver) (1/5)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 825.0 (TID 647) in 14 ms on localhost (executor driver) (2/5)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 825.0 (TID 646) in 15 ms on localhost (executor driver) (3/5)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 825.0 (TID 648) in 14 ms on localhost (executor driver) (4/5)
[Executor task launch worker for task 650] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 825.0 (TID 650). 59424 bytes result sent to driver
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 825.0 (TID 650) in 14 ms on localhost (executor driver) (5/5)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 825.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 825 (mapToPair at GroupCombineFunctions.java:57) finished in 0.034 s
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 826, ResultStage 839)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 826 (MapPartitionsRDD[2824] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_145 stored as values in memory (estimated size 217.5 KB, free 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_145_piece0 stored as bytes in memory (estimated size 64.1 KB, free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_145_piece0 in memory on localhost:36139 (size: 64.1 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 145 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 826 (MapPartitionsRDD[2824] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 826.0 with 5 tasks
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 826.0 (TID 651, localhost, executor driver, partition 0, PROCESS_LOCAL, 7638 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 826.0 (TID 652, localhost, executor driver, partition 1, PROCESS_LOCAL, 7638 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 826.0 (TID 653, localhost, executor driver, partition 2, PROCESS_LOCAL, 7638 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 826.0 (TID 654, localhost, executor driver, partition 3, PROCESS_LOCAL, 7638 bytes)
[Executor task launch worker for task 651] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 826.0 (TID 651)
[Executor task launch worker for task 653] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 826.0 (TID 653)
[Executor task launch worker for task 654] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 826.0 (TID 654)
[Executor task launch worker for task 652] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 826.0 (TID 652)
[Executor task launch worker for task 653] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 654] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 651] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 654] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 651] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 653] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 652] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 652] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 654] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 826.0 (TID 654). 59896 bytes result sent to driver
[Executor task launch worker for task 652] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 826.0 (TID 652). 59853 bytes result sent to driver
[Executor task launch worker for task 651] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 826.0 (TID 651). 59896 bytes result sent to driver
[Executor task launch worker for task 653] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 826.0 (TID 653). 59896 bytes result sent to driver
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 826.0 (TID 655, localhost, executor driver, partition 4, PROCESS_LOCAL, 7638 bytes)
[Executor task launch worker for task 655] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 826.0 (TID 655)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 826.0 (TID 654) in 13 ms on localhost (executor driver) (1/5)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 826.0 (TID 653) in 14 ms on localhost (executor driver) (2/5)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 826.0 (TID 652) in 14 ms on localhost (executor driver) (3/5)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 826.0 (TID 651) in 14 ms on localhost (executor driver) (4/5)
[Executor task launch worker for task 655] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 655] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 655] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 826.0 (TID 655). 59853 bytes result sent to driver
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 826.0 (TID 655) in 12 ms on localhost (executor driver) (5/5)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 826.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 826 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) finished in 0.031 s
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage 839)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage 839 (MapPartitionsRDD[2844] at map at TranslationUtils.java:128), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_146 stored as values in memory (estimated size 188.2 KB, free 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_146_piece0 stored as bytes in memory (estimated size 58.1 KB, free 13.5 GB)
[dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_146_piece0 in memory on localhost:36139 (size: 58.1 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 146 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ResultStage 839 (MapPartitionsRDD[2844] at map at TranslationUtils.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 839.0 with 4 tasks
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 839.0 (TID 656, localhost, executor driver, partition 0, PROCESS_LOCAL, 8132 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 839.0 (TID 657, localhost, executor driver, partition 1, PROCESS_LOCAL, 8132 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 839.0 (TID 658, localhost, executor driver, partition 2, PROCESS_LOCAL, 8132 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 839.0 (TID 659, localhost, executor driver, partition 3, PROCESS_LOCAL, 8132 bytes)
[Executor task launch worker for task 656] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 839.0 (TID 656)
[Executor task launch worker for task 659] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 839.0 (TID 659)
[Executor task launch worker for task 657] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 839.0 (TID 657)
[Executor task launch worker for task 658] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 839.0 (TID 658)
[Executor task launch worker for task 657] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 656] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 656] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 657] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 657] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_1 locally
[Executor task launch worker for task 656] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_0 locally
[Executor task launch worker for task 659] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 659] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 659] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_3 locally
[Executor task launch worker for task 657] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 656] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 659] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_1 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_0 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_3 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 658] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 658] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 658] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_2 locally
[Executor task launch worker for task 658] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_2 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 657] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 839.0 (TID 657). 59881 bytes result sent to driver
[Executor task launch worker for task 659] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 839.0 (TID 659). 59881 bytes result sent to driver
[Executor task launch worker for task 656] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 839.0 (TID 656). 59881 bytes result sent to driver
[Executor task launch worker for task 658] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 839.0 (TID 658). 59881 bytes result sent to driver
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 839.0 (TID 657) in 13 ms on localhost (executor driver) (1/4)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 839.0 (TID 656) in 13 ms on localhost (executor driver) (2/4)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 839.0 (TID 659) in 14 ms on localhost (executor driver) (3/4)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 839.0 (TID 658) in 15 ms on localhost (executor driver) (4/4)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 839.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 839 (foreach at UnboundedDataset.java:79) finished in 0.022 s
[streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 35 finished: foreach at UnboundedDataset.java:79, took 0.118111 s
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1542073210000 ms.3 from job set of time 1542073210000 ms
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Total delay: 6.488 s for time 1542073210000 ms (execution: 0.562 s)
[Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@69069e4c{/streaming,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@f992f72{/streaming/batch,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@2a4f5f5f{/static/streaming,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped successfully
[Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@4201964f{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
[Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
[dispatcher-event-loop-2] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
[Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
[Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
[Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
[Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
Gradle Test Executor 289 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-8f3b7e02-00a4-47b8-9070-48ce47b9a720

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest > testUnboundedSourceMetrics STANDARD_ERROR
[Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@58809b2e{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
[Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4041
[dispatcher-event-loop-3] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
[Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
[Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
[Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
[Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
Gradle Test Executor 293 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-debc74ed-8618-4819-bebb-c70ae93286bb
Finished generating test XML results (0.124 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.115 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 2,5,main]) completed. Took 10 mins 27.965 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at:
> file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 15m 24s
43 actionable tasks: 39 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/v2kh6yu6iac6a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org