See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/168/display/redirect?page=changes>

Changes:

[altay] Update streaming wordcount example and align with the batch example.

[github] Fix linter error in typehints.

[wcn] Remove include directives for proto well-known-types.

------------------------------------------
[...truncated 2.76 MB...]
INFO: Running task 6.0 in stage 2.0 (TID 14)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 2.0 (TID 8) in 79 ms on localhost (executor driver) (2/8)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 2.0 (TID 11) in 74 ms on localhost (executor driver) (3/8)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_2 stored as values in memory (estimated size 1280.0 B, free 1825.1 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_2 in memory on 127.0.0.1:46087 (size: 1280.0 B, free: 1825.4 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 2.0 (TID 10). 13283 bytes result sent to driver
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 7.0 in stage 2.0 (TID 15, localhost, executor driver, partition 7, PROCESS_LOCAL, 4730 bytes)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 2.0 (TID 10) in 97 ms on localhost (executor driver) (4/8)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 7.0 in stage 2.0 (TID 15)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_4 stored as values in memory (estimated size 16.0 B, free 1825.1 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_4 in memory on 127.0.0.1:46087 (size: 16.0 B, free: 1825.4 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_6 stored as values in memory (estimated size 16.0 B, free 1825.1 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_7 stored as values in memory (estimated size 16.0 B, free 1825.1 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_6 in memory on 127.0.0.1:46087 (size: 16.0 B, free: 1825.4 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_7 in memory on 127.0.0.1:46087 (size: 16.0 B, free: 1825.4 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 7.0 in stage 2.0 (TID 15). 11942 bytes result sent to driver
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 6.0 in stage 2.0 (TID 14). 11942 bytes result sent to driver
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_5 stored as values in memory (estimated size 16.0 B, free 1825.1 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_5 in memory on 127.0.0.1:46087 (size: 16.0 B, free: 1825.4 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 6.0 in stage 2.0 (TID 14) in 92 ms on localhost (executor driver) (5/8)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 7.0 in stage 2.0 (TID 15) in 68 ms on localhost (executor driver) (6/8)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 5.0 in stage 2.0 (TID 13). 11942 bytes result sent to driver
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 4.0 in stage 2.0 (TID 12). 11942 bytes result sent to driver
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 5.0 in stage 2.0 (TID 13) in 99 ms on localhost (executor driver) (7/8)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 4.0 in stage 2.0 (TID 12) in 107 ms on localhost (executor driver) (8/8)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 2.0, whose tasks have all completed, from pool
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 2 (collect at BoundedDataset.java:87) finished in 0.173 s
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 0 finished: collect at BoundedDataset.java:87, took 4.016823 s
Mar 30, 2018 11:04:51 AM org.apache.beam.runners.spark.SparkRunner$Evaluator enterCompositeTransform
INFO: Entering directly-translatable composite transform: 'WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values'
Mar 30, 2018 11:04:51 AM org.apache.beam.runners.spark.SparkRunner$Evaluator doVisitTransform
INFO: Evaluating Create.Values
Mar 30, 2018 11:04:51 AM org.apache.beam.runners.spark.SparkRunner$Evaluator doVisitTransform
INFO: Evaluating org.apache.beam.sdk.transforms.Reify$ReifyView$1@32e0ce6c
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_3 stored as values in memory (estimated size 672.0 B, free 1825.1 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_3_piece0 stored as bytes in memory (estimated size 362.0 B, free 1825.1 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_3_piece0 in memory on 127.0.0.1:46087 (size: 362.0 B, free: 1825.4 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 3 from broadcast at SideInputBroadcast.java:59
Mar 30, 2018 11:04:51 AM org.apache.beam.runners.spark.SparkRunner$Evaluator doVisitTransform
INFO: Evaluating org.apache.beam.sdk.transforms.MapElements$1@5abf2f11
Mar 30, 2018 11:04:51 AM org.apache.beam.runners.spark.SparkRunner$Evaluator doVisitTransform
INFO: Evaluating org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn@4292b1d9
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting job: foreach at BoundedDataset.java:117
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Got job 1 (foreach at BoundedDataset.java:117) with 4 output partitions
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Final stage: ResultStage 3 (foreach at BoundedDataset.java:117)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Parents of final stage: List()
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Missing parents: List()
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 3 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize).output MapPartitionsRDD[53] at values at TransformTranslator.java:400), which has no missing parents
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_4 stored as values in memory (estimated size 83.9 KB, free 1825.0 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_4_piece0 stored as bytes in memory (estimated size 22.3 KB, free 1825.0 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_4_piece0 in memory on 127.0.0.1:46087 (size: 22.3 KB, free: 1825.3 MB)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 4 from broadcast at DAGScheduler.scala:1006
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 3 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize).output MapPartitionsRDD[53] at values at TransformTranslator.java:400) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 3.0 with 4 tasks
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 3.0 (TID 16, localhost, executor driver, partition 0, PROCESS_LOCAL, 4827 bytes)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 3.0 (TID 17, localhost, executor driver, partition 1, PROCESS_LOCAL, 4827 bytes)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 3.0 (TID 18, localhost, executor driver, partition 2, PROCESS_LOCAL, 4827 bytes)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 3.0 (TID 19, localhost, executor driver, partition 3, PROCESS_LOCAL, 4837 bytes)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 3.0 (TID 17)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 3.0 (TID 16)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 3.0 (TID 19)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 3.0 (TID 18)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 3.0 (TID 17). 12184 bytes result sent to driver
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 3.0 (TID 16). 12141 bytes result sent to driver
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 3.0 (TID 17) in 72 ms on localhost (executor driver) (1/4)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 3.0 (TID 16) in 79 ms on localhost (executor driver) (2/4)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 3.0 (TID 18). 12141 bytes result sent to driver
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 3.0 (TID 18) in 77 ms on localhost (executor driver) (3/4)
Mar 30, 2018 11:04:51 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Mar 30, 2018 11:04:51 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Mar 30, 2018 11:04:51 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/.temp-beam-2018-03-30_11-04-43-0/503d39b2-5ff8-4c85-955d-d5aa6d609401, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@89137d7, paneInfo=PaneInfo.NO_FIRING} to final location /tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/counts-00000-of-00004
Mar 30, 2018 11:04:51 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/.temp-beam-2018-03-30_11-04-43-0/36d5bec0-a255-4790-9cc1-aa1b3fd9e6a3, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@89137d7, paneInfo=PaneInfo.NO_FIRING} to final location /tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/counts-00001-of-00004
Mar 30, 2018 11:04:51 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/.temp-beam-2018-03-30_11-04-43-0/56aa1ef2-c5de-43d7-91ec-14ef0b78b727, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@89137d7, paneInfo=PaneInfo.NO_FIRING} to final location /tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/counts-00002-of-00004
Mar 30, 2018 11:04:51 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/.temp-beam-2018-03-30_11-04-43-0/3d737ad8-e965-47ba-a9ab-dcb259d3fe9b, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@89137d7, paneInfo=PaneInfo.NO_FIRING} to final location /tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/counts-00003-of-00004
Mar 30, 2018 11:04:51 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/.temp-beam-2018-03-30_11-04-43-0/56aa1ef2-c5de-43d7-91ec-14ef0b78b727
Mar 30, 2018 11:04:51 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/.temp-beam-2018-03-30_11-04-43-0/503d39b2-5ff8-4c85-955d-d5aa6d609401
Mar 30, 2018 11:04:51 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/.temp-beam-2018-03-30_11-04-43-0/3d737ad8-e965-47ba-a9ab-dcb259d3fe9b
Mar 30, 2018 11:04:51 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2823552157073012210-tmpdir/word-count-beam/.temp-beam-2018-03-30_11-04-43-0/36d5bec0-a255-4790-9cc1-aa1b3fd9e6a3
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 3.0 (TID 19). 15889 bytes result sent to driver
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 3.0 (TID 19) in 125 ms on localhost (executor driver) (4/4)
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 3.0, whose tasks have all completed, from pool
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 3 (foreach at BoundedDataset.java:117) finished in 0.132 s
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:117, took 0.161148 s
Mar 30, 2018 11:04:51 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Mar 30, 2018 11:04:51 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@a2118be{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Mar 30, 2018 11:04:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:48 min
[INFO] Finished at: 2018-03-30T11:04:51Z
[INFO] Final Memory: 90M/806M
[INFO] ------------------------------------------------------------------------
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]
Mar 30, 2018 12:07:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2018-03-30T12:07:13.005Z: Workflow failed. Causes: The Dataflow appears to be stuck. You can get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
Mar 30, 2018 12:07:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-30T12:07:13.128Z: Cancel request is committed for workflow job: 2018-03-30_04_02_42-8928514858622684642.
Mar 30, 2018 12:07:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-30T12:07:13.242Z: Cleaning up.
Mar 30, 2018 12:07:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-30T12:07:13.288Z: Stopping worker pool...
Mar 30, 2018 12:08:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-30T12:08:36.315Z: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
Mar 30, 2018 12:08:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-03-30_04_02_42-8928514858622684642 failed with status FAILED.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:07 h
[INFO] Finished at: 2018-03-30T12:08:46Z
[INFO] Final Memory: 38M/534M
[INFO] ------------------------------------------------------------------------
gsutil cat gs://temp-storage-for-release-validation-tests/quickstart/count* | grep Montague:
CommandException: No URLs matched: gs://temp-storage-for-release-validation-tests/quickstart/count*
CommandException: No URLs matched: gs://temp-storage-for-release-validation-tests/quickstart/count*
[ERROR] Failed command
:runners:google-cloud-dataflow-java:runQuickstartJavaDataflow FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1h 8m 35s
6 actionable tasks: 6 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]