See
<https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/167/display/redirect?page=changes>
Changes:
[mairbek] [BEAM-3932] Adds handling of array null values to
MutationSizeEstimator
[lcwik] [BEAM-3326] Abstract away closing the inbound receiver, waiting for the
[aromanenko.dev] [BEAM-3819] Add withRequestRecordsLimit() option to KinesisIO
[altay] [BEAM-3738] Add more flake8 tests to run_pylint.sh
[lcwik] [BEAM-3326] Address additional comments from PR/4963.
[lcwik] [BEAM-3104] Set up state interfaces, wire into SDK harness client.
------------------------------------------
[...truncated 2.76 MB...]
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 2.0 (TID 9) in 53 ms on localhost (executor
driver) (1/8)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 2.0 (TID 11) in 55 ms on localhost (executor
driver) (2/8)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 2.0 (TID 8) in 63 ms on localhost (executor
driver) (3/8)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 5.0 in stage 2.0 (TID 13)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 2.0 (TID 10). 13240 bytes result sent to driver
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 7.0 in stage 2.0 (TID 15, localhost, executor driver,
partition 7, PROCESS_LOCAL, 4730 bytes)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 7.0 in stage 2.0 (TID 15)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 2.0 (TID 10) in 66 ms on localhost (executor
driver) (4/8)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 4.0 in stage 2.0 (TID 12)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 6 ms
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_4 stored as values in memory (estimated size 16.0 B, free
1825.1 MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_4 in memory on 127.0.0.1:37833 (size: 16.0 B, free: 1825.4
MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_6 stored as values in memory (estimated size 16.0 B, free
1825.1 MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_6 in memory on 127.0.0.1:37833 (size: 16.0 B, free: 1825.4
MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 4.0 in stage 2.0 (TID 12). 11942 bytes result sent to driver
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 6.0 in stage 2.0 (TID 14). 11942 bytes result sent to driver
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_7 stored as values in memory (estimated size 16.0 B, free
1825.1 MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 4.0 in stage 2.0 (TID 12) in 79 ms on localhost (executor
driver) (5/8)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_7 in memory on 127.0.0.1:37833 (size: 16.0 B, free: 1825.4
MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 6.0 in stage 2.0 (TID 14) in 82 ms on localhost (executor
driver) (6/8)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_5 stored as values in memory (estimated size 16.0 B, free
1825.1 MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 7.0 in stage 2.0 (TID 15). 11985 bytes result sent to driver
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_5 in memory on 127.0.0.1:37833 (size: 16.0 B, free: 1825.4
MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 7.0 in stage 2.0 (TID 15) in 78 ms on localhost (executor
driver) (7/8)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 5.0 in stage 2.0 (TID 13). 11942 bytes result sent to driver
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 5.0 in stage 2.0 (TID 13) in 100 ms on localhost (executor
driver) (8/8)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 2.0, whose tasks have all completed, from pool
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 2 (collect at BoundedDataset.java:87) finished in 0.156 s
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 0 finished: collect at BoundedDataset.java:87, took 5.442581 s
Mar 29, 2018 11:04:50 AM org.apache.beam.runners.spark.SparkRunner$Evaluator
enterCompositeTransform
INFO: Entering directly-translatable composite transform:
'WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values'
Mar 29, 2018 11:04:50 AM org.apache.beam.runners.spark.SparkRunner$Evaluator
doVisitTransform
INFO: Evaluating Create.Values
Mar 29, 2018 11:04:50 AM org.apache.beam.runners.spark.SparkRunner$Evaluator
doVisitTransform
INFO: Evaluating org.apache.beam.sdk.transforms.Reify$ReifyView$1@6a53b247
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_3 stored as values in memory (estimated size 672.0 B,
free 1825.1 MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_3_piece0 stored as bytes in memory (estimated size 363.0
B, free 1825.1 MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_3_piece0 in memory on 127.0.0.1:37833 (size: 363.0 B,
free: 1825.4 MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 3 from broadcast at SideInputBroadcast.java:59
Mar 29, 2018 11:04:50 AM org.apache.beam.runners.spark.SparkRunner$Evaluator
doVisitTransform
INFO: Evaluating org.apache.beam.sdk.transforms.MapElements$1@365cc925
Mar 29, 2018 11:04:50 AM org.apache.beam.runners.spark.SparkRunner$Evaluator
doVisitTransform
INFO: Evaluating
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn@79b10126
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting job: foreach at BoundedDataset.java:117
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Got job 1 (foreach at BoundedDataset.java:117) with 4 output partitions
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Final stage: ResultStage 3 (foreach at BoundedDataset.java:117)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Parents of final stage: List()
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Missing parents: List()
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 3
(WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize).output
MapPartitionsRDD[53] at values at TransformTranslator.java:400), which has no
missing parents
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_4 stored as values in memory (estimated size 83.9 KB,
free 1825.0 MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_4_piece0 stored as bytes in memory (estimated size 22.3
KB, free 1825.0 MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_4_piece0 in memory on 127.0.0.1:37833 (size: 22.3 KB,
free: 1825.3 MB)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 4 from broadcast at DAGScheduler.scala:1006
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 3
(WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize).output
MapPartitionsRDD[53] at values at TransformTranslator.java:400) (first 15
tasks are for partitions Vector(0, 1, 2, 3))
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 3.0 with 4 tasks
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 3.0 (TID 16, localhost, executor driver,
partition 0, PROCESS_LOCAL, 4827 bytes)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 3.0 (TID 17, localhost, executor driver,
partition 1, PROCESS_LOCAL, 4827 bytes)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 3.0 (TID 18, localhost, executor driver,
partition 2, PROCESS_LOCAL, 4827 bytes)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 3.0 (TID 19, localhost, executor driver,
partition 3, PROCESS_LOCAL, 4837 bytes)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 3.0 (TID 16)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 3.0 (TID 17)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 3.0 (TID 18)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 3.0 (TID 19)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 3.0 (TID 16). 12184 bytes result sent to driver
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 3.0 (TID 16) in 63 ms on localhost (executor
driver) (1/4)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 3.0 (TID 18). 12184 bytes result sent to driver
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 3.0 (TID 17). 12184 bytes result sent to driver
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 3.0 (TID 18) in 66 ms on localhost (executor
driver) (2/4)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 3.0 (TID 17) in 67 ms on localhost (executor
driver) (3/4)
Mar 29, 2018 11:04:50 AM
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Mar 29, 2018 11:04:50 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Mar 29, 2018 11:04:50 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/.temp-beam-2018-03-29_11-04-41-0/9418c1e0-6640-4832-b292-6b205269feea,
shard=0,
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4fb7a209,
paneInfo=PaneInfo.NO_FIRING} to final location
/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/counts-00000-of-00004
Mar 29, 2018 11:04:50 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/.temp-beam-2018-03-29_11-04-41-0/a78b7b30-dee9-46b0-9849-da342cd6d62f,
shard=1,
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4fb7a209,
paneInfo=PaneInfo.NO_FIRING} to final location
/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/counts-00001-of-00004
Mar 29, 2018 11:04:50 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/.temp-beam-2018-03-29_11-04-41-0/308ce35f-ee2e-4c1e-bc12-3de3af1a5ae5,
shard=2,
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4fb7a209,
paneInfo=PaneInfo.NO_FIRING} to final location
/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/counts-00002-of-00004
Mar 29, 2018 11:04:50 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/.temp-beam-2018-03-29_11-04-41-0/7e16f2fd-6b15-4ddf-83a1-3d7944de9d34,
shard=3,
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4fb7a209,
paneInfo=PaneInfo.NO_FIRING} to final location
/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/counts-00003-of-00004
Mar 29, 2018 11:04:50 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/.temp-beam-2018-03-29_11-04-41-0/a78b7b30-dee9-46b0-9849-da342cd6d62f
Mar 29, 2018 11:04:50 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/.temp-beam-2018-03-29_11-04-41-0/308ce35f-ee2e-4c1e-bc12-3de3af1a5ae5
Mar 29, 2018 11:04:50 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/.temp-beam-2018-03-29_11-04-41-0/7e16f2fd-6b15-4ddf-83a1-3d7944de9d34
Mar 29, 2018 11:04:50 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-6392609454069710736-tmpdir/word-count-beam/.temp-beam-2018-03-29_11-04-41-0/9418c1e0-6640-4832-b292-6b205269feea
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 3.0 (TID 19). 15889 bytes result sent to driver
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 3.0 (TID 19) in 121 ms on localhost (executor
driver) (4/4)
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 3.0, whose tasks have all completed, from pool
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 3 (foreach at BoundedDataset.java:117) finished in 0.127 s
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:117, took 0.156819 s
Mar 29, 2018 11:04:50 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Mar 29, 2018 11:04:50 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@59c44ba6{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Mar 29, 2018 11:04:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:54 min
[INFO] Finished at: 2018-03-29T11:04:50Z
[INFO] Final Memory: 98M/1384M
[INFO] ------------------------------------------------------------------------
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]
Mar 29, 2018 12:07:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2018-03-29T12:07:22.014Z: Workflow failed. Causes: The Dataflow appears
to be stuck. You can get help with Cloud Dataflow at
https://cloud.google.com/dataflow/support.
Mar 29, 2018 12:07:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-29T12:07:22.121Z: Cancel request is committed for workflow job:
2018-03-29_04_02_45-10177233672216086838.
Mar 29, 2018 12:07:23 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-29T12:07:22.242Z: Cleaning up.
Mar 29, 2018 12:07:23 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-29T12:07:22.382Z: Stopping worker pool...
Mar 29, 2018 12:08:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-29T12:08:47.646Z: Autoscaling: Reduced the number of workers to 0
based on the rate of progress in the currently running step(s).
Mar 29, 2018 12:08:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob
waitUntilFinish
INFO: Job 2018-03-29_04_02_45-10177233672216086838 failed with status FAILED.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:07 h
[INFO] Finished at: 2018-03-29T12:08:56Z
[INFO] Final Memory: 40M/576M
[INFO] ------------------------------------------------------------------------
gsutil cat gs://temp-storage-for-release-validation-tests/quickstart/count* |
grep Montague:
CommandException: No URLs matched:
gs://temp-storage-for-release-validation-tests/quickstart/count*
[ERROR] Failed command
:runners:google-cloud-dataflow-java:runQuickstartJavaDataflow FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1h 8m 48s
6 actionable tasks: 6 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]