Could this failure be related? https://ci-beam.apache.org/job/beam_PreCommit_Java_Phrase/3745/
Though, only one test fails there.

> On 17 Jun 2021, at 18:50, Alex Amato <[email protected]> wrote:
> 
> Hmm, perhaps it only happens sometimes. The other half of the time, when I trigger "Run 
> Java Precommit" on this PR, I hit this different failure:
> 
> The connection is not obvious to me, if it's related to my PR at all: 
> https://github.com/apache/beam/pull/14804
> I only added some Precondition checks, but I don't see those failing anywhere.
> (Unless something indirect is causing it and the stack trace for it is not 
> printed, e.g. because it happens in a subprocess.)
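> 
> For reference, the checks I added are along these lines (a hypothetical sketch with 
> made-up names, not the exact code from the PR). A failed check would throw an 
> IllegalArgumentException or NullPointerException, which should normally show up in 
> the test output:
> 
>     import static com.google.common.base.Preconditions.checkArgument;
>     import static com.google.common.base.Preconditions.checkNotNull;
> 
>     // Hypothetical example only; the class, field, and messages are made up.
>     class WriteConfig {
>       private final int maxBatchSize;
> 
>       WriteConfig(Integer maxBatchSize) {
>         checkNotNull(maxBatchSize, "maxBatchSize must not be null");
>         checkArgument(maxBatchSize > 0, "maxBatchSize must be positive, got %s", maxBatchSize);
>         this.maxBatchSize = maxBatchSize;
>       }
>     }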
> 
> Any ideas? Are these tests known to be failing right now?
> ----
> https://ci-beam.apache.org/job/beam_PreCommit_Java_Phrase/3742/#showFailuresLink
> 
>  Test Result (32 failures / +32)
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteScriptedUpsert
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testReadWithMetadata
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithIndexFn
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testMaxParallelRequestsPerWindow
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteRetryValidRequest
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithMaxBatchSize
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteRetry
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testReadWithQueryString
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testSizes
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithMaxBatchSizeBytes
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithDocVersion
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithAllowableErrors
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithTypeFn
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteScriptedUpsert
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testReadWithQueryValueProvider
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testSplit
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteRetryValidRequest
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithDocVersion
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testSizes
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testMaxParallelRequestsPerWindow
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testReadWithQueryString
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWritePartialUpdate
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithMaxBatchSizeBytes
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testDefaultRetryPredicate
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithIndexFn
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithRouting
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteRetry
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testReadWithMetadata
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteFullAddressing
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithMaxBatchSize
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWriteWithIsDeleteFn
> org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOTest.testWrite
> 
> On Wed, Jun 16, 2021 at 5:24 PM Robert Burke <[email protected]> wrote:
> Very odd, as those paths do resolve now, redirecting to their pkg.go.dev 
> paths. This feels transient, but it's not clear why that would return a 404 
> vs. some other error.
> 
> On Wed, 16 Jun 2021 at 15:39, Kyle Weaver <[email protected]> wrote:
> For tasks without structured JUnit output, we have to scroll up / ctrl-f / 
> grep for more logs. In this case it looks like it was probably a server-side 
> issue. These links work for me, so I'm assuming the problem has been resolved.
> 
> 11:31:04 > Task :release:go-licenses:java:dockerRun
> 11:31:04 package google.golang.org/protobuf/reflect/protoreflect: unrecognized import path "google.golang.org/protobuf/reflect/protoreflect": reading https://google.golang.org/protobuf/reflect/protoreflect?go-get=1: 404 Not Found
> 11:31:04 package google.golang.org/protobuf/runtime/protoimpl: unrecognized import path "google.golang.org/protobuf/runtime/protoimpl": reading https://google.golang.org/protobuf/runtime/protoimpl?go-get=1: 404 Not Found
> 11:31:04 package google.golang.org/protobuf/types/descriptorpb: unrecognized import path "google.golang.org/protobuf/types/descriptorpb": reading https://google.golang.org/protobuf/types/descriptorpb?go-get=1: 404 Not Found
> 11:31:04 package google.golang.org/protobuf/types/known/durationpb: unrecognized import path "google.golang.org/protobuf/types/known/durationpb": reading https://google.golang.org/protobuf/types/known/durationpb?go-get=1: 404 Not Found
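> 
> For context (as I understand it), the go tool resolves an import path like 
> google.golang.org/protobuf/... by fetching the path with ?go-get=1 and reading a 
> <meta name="go-import"> tag from the returned HTML, so a transient 404 from that 
> endpoint makes every dependent package unresolvable inside the dockerRun step. A 
> rough standalone sketch to check whether the endpoint is serving the metadata again 
> (not part of the build; the URL is just the first one from the log above):
> 
>     import java.net.URI;
>     import java.net.http.HttpClient;
>     import java.net.http.HttpRequest;
>     import java.net.http.HttpResponse;
> 
>     // Fetch the ?go-get=1 metadata page that the go tool reads and report the
>     // HTTP status plus whether the expected go-import meta tag is present.
>     public class GoGetCheck {
>       public static void main(String[] args) throws Exception {
>         String url = "https://google.golang.org/protobuf/reflect/protoreflect?go-get=1";
>         HttpClient client = HttpClient.newHttpClient();
>         HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
>         HttpResponse<String> response =
>             client.send(request, HttpResponse.BodyHandlers.ofString());
>         System.out.println("HTTP " + response.statusCode() + " for " + url);
>         System.out.println("go-import meta tag present: " + response.body().contains("go-import"));
>       }
>     }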
> 
> 
> On Wed, Jun 16, 2021 at 2:35 PM Alex Amato <[email protected]> wrote:
> For PR: https://github.com/apache/beam/pull/14804
> 
> Is something wrong with this machine that is preventing it from running Docker? It 
> seems to keep happening even after retriggering the run a few times.
> 
> Anything I can do here to move my PR forward and get it merged?
> 
> https://ci-beam.apache.org/job/beam_PreCommit_Java_Phrase/3735/consoleFull
> 
> 11:36:42 > Task :sdks:java:core:buildDependents
> 11:36:42 
> 11:36:42 FAILURE: Build failed with an exception.
> 11:36:42 
> 11:36:42 * What went wrong:
> 11:36:42 Execution failed for task ':release:go-licenses:java:dockerRun'.
> 11:36:42 > Process 'command 'docker'' finished with non-zero exit value 1
> 11:36:42 
> 11:36:42 * Try:
> 11:36:42 Run with --stacktrace option to get the stack trace. Run with --info 
> or --debug option to get more log output. Run with --scan to get full 
> insights.
> 11:36:42 
> 11:36:42 * Get more help at https://help.gradle.org
> 11:36:42 
> 11:36:42 Deprecated Gradle features were used in this build, making it 
> incompatible with Gradle 7.0.
> 11:36:42 Use '--warning-mode all' to show the individual deprecation warnings.
> 11:36:42 See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
> 11:36:42 
> 11:36:42 BUILD FAILED in 7m 20s
> 11:36:42 1134 actionable tasks: 420 executed, 712 from cache, 2 up-to-date
> 11:36:43 
> 11:36:43 Publishing build scan...
> 11:36:43 https://gradle.com/s/3yfusnsnfll62
