See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/613/display/redirect>

Changes:


------------------------------------------
[...truncated 458.61 KB...]
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:860 Created job 
with id: [2024-01-23_12_55_57-11400971046939635074]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:861 Submitted job: 
2024-01-23_12_55_57-11400971046939635074
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:862 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2024-01-23_12_55_57-11400971046939635074?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2024-01-23_12_55_57-11400971046939635074?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:151 Job 
2024-01-23_12_55_57-11400971046939635074 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T20:56:00.898Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-a.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T20:56:03.473Z: JOB_MESSAGE_BASIC: Executing operation 
Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T20:56:03.532Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-a...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T20:56:03.904Z: JOB_MESSAGE_BASIC: Finished operation 
Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T20:56:12.672Z: JOB_MESSAGE_BASIC: Executing operation 
Generate/Impulse+Generate/FlatMap(<lambda at 
core.py:3774>)+Generate/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T20:56:22.388Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T22:10:52.237Z: JOB_MESSAGE_BASIC: Cancel request is committed for 
workflow job: 2024-01-23_12_55_57-11400971046939635074.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T22:10:52.258Z: JOB_MESSAGE_BASIC: Finished operation 
Generate/Impulse+Generate/FlatMap(<lambda at 
core.py:3774>)+Generate/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T22:10:52.492Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:151 Job 
2024-01-23_12_55_57-11400971046939635074 is in state JOB_STATE_CANCELLING
__________ CrossLanguageKafkaIOTest.test_hosted_kafkaio_populated_key 
__________

self = <apache_beam.io.external.xlang_kafkaio_it_test.CrossLanguageKafkaIOTest 
testMethod=test_hosted_kafkaio_populated_key>

    @pytest.mark.uses_io_java_expansion_service
    @unittest.skipUnless(
        os.environ.get('EXPANSION_PORT'),
        "EXPANSION_PORT environment var is not provided.")
    @unittest.skipUnless(
        os.environ.get('KAFKA_BOOTSTRAP_SERVER'),
        "KAFKA_BOOTSTRAP_SERVER environment var is not provided.")
    def test_hosted_kafkaio_populated_key(self):
      kafka_topic = 'xlang_kafkaio_test_populated_key_{}'.format(uuid.uuid4())
      bootstrap_servers = os.environ.get('KAFKA_BOOTSTRAP_SERVER')
      pipeline_creator = CrossLanguageKafkaIO(
          bootstrap_servers,
          kafka_topic,
          False,
          'localhost:%s' % os.environ.get('EXPANSION_PORT'))
    
>     self.run_kafka_write(pipeline_creator)

apache_beam/io/external/xlang_kafkaio_it_test.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/io/external/xlang_kafkaio_it_test.py:187: in run_kafka_write
    pipeline_creator.build_write_pipeline(pipeline)
apache_beam/pipeline.py:612: in __exit__
    self.result = self.run()
apache_beam/testing/test_pipeline.py:112: in run
    result = super().run(
apache_beam/pipeline.py:586: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:66: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <DataflowPipelineResult <Job
 clientRequestId: '20240123221335907403-6044'
 createTime: '2024-01-23T22:13:37.983101Z'
...024-01-23T22:13:37.983101Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7fa2d039b7c0>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
        consoleUrl = (
            "Console URL: https://console.cloud.google.com/";
            f"dataflow/jobs/<RegionId>/{self.job_id()}"
            "?project=<ProjectId>")
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
>         time.sleep(5.0)
E         Failed: Timeout >4500.0s

apache_beam/runners/dataflow/dataflow_runner.py:765: Failed
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/build/apache_beam-2.54.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl>" 
to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:314 Using provided Python SDK container 
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest
INFO     root:environments.py:321 Python SDK container image set to 
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7fa2a01ec9d0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7fa2a01f31f0> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:677 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0123221335-906352-fvm9ee07.1706048015.906517/beam-sdks-java-io-expansion-service-2.54.0-SNAPSHOT-YSoXcc4-uUKsiJNOxeWC8_0WbKOWyGs9WAep2ZkEmM8.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:687 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0123221335-906352-fvm9ee07.1706048015.906517/beam-sdks-java-io-expansion-service-2.54.0-SNAPSHOT-YSoXcc4-uUKsiJNOxeWC8_0WbKOWyGs9WAep2ZkEmM8.jar
 in 1 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:677 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0123221335-906352-fvm9ee07.1706048015.906517/apache_beam-2.54.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:687 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0123221335-906352-fvm9ee07.1706048015.906517/apache_beam-2.54.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:677 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0123221335-906352-fvm9ee07.1706048015.906517/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:687 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0123221335-906352-fvm9ee07.1706048015.906517/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:858 Create job: 
<Job
 clientRequestId: '20240123221335907403-6044'
 createTime: '2024-01-23T22:13:37.983101Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2024-01-23_14_13_37-12592457088122517646'
 location: 'us-central1'
 name: 'beamapp-jenkins-0123221335-906352-fvm9ee07'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2024-01-23T22:13:37.983101Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:860 Created job 
with id: [2024-01-23_14_13_37-12592457088122517646]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:861 Submitted job: 
2024-01-23_14_13_37-12592457088122517646
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:862 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2024-01-23_14_13_37-12592457088122517646?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2024-01-23_14_13_37-12592457088122517646?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:151 Job 
2024-01-23_14_13_37-12592457088122517646 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T22:13:43.741Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T22:13:45.897Z: JOB_MESSAGE_BASIC: Executing operation 
Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T22:13:45.957Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T22:13:46.505Z: JOB_MESSAGE_BASIC: Finished operation 
Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T22:13:55.100Z: JOB_MESSAGE_BASIC: Executing operation 
Generate/Impulse+Generate/FlatMap(<lambda at 
core.py:3774>)+Generate/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T22:13:57.708Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T23:28:33.972Z: JOB_MESSAGE_BASIC: Cancel request is committed for 
workflow job: 2024-01-23_14_13_37-12592457088122517646.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T23:28:33.995Z: JOB_MESSAGE_BASIC: Finished operation 
Generate/Impulse+Generate/FlatMap(<lambda at 
core.py:3774>)+Generate/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:201 
2024-01-23T23:28:34.135Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:151 Job 
2024-01-23_14_13_37-12592457088122517646 is in state JOB_STATE_CANCELLING
=============================== warnings summary 
===============================
apache_beam/io/gcp/bigquery.py:2603
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2603:
 DeprecationWarning: invalid escape sequence \#
    """Read data from BigQuery.

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/pytest_ioCrossLanguage.xml>
 -
=========================== short test summary info 
============================
FAILED 
apache_beam/io/external/xlang_kafkaio_it_test.py::CrossLanguageKafkaIOTest::test_hosted_kafkaio_null_key
 - Failed: Timeout >4500.0s
FAILED 
apache_beam/io/external/xlang_kafkaio_it_test.py::CrossLanguageKafkaIOTest::test_hosted_kafkaio_populated_key
 - Failed: Timeout >4500.0s
= 2 failed, 2 passed, 17 skipped, 7309 
deselected, 1 warning in 9376.83s (2:36:16) =

+++++++++++++++++++++++++++++++++++ Timeout ++++++++++++++++++++++++++++++++++++

~~~~~~~~~~~~~~~~~~~~~ Stack of Thread-2 (140336877094656) ~~~~~~~~~~~~~~~~~~~~~~
  File "/usr/lib/python3.8/threading.py", line 890, in _bootstrap
    self._bootstrap_inner()
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 144, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py>", line 298, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 929, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 747, in Get
    return self._RunMethod(config, request, global_params=global_params)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py>", line 728, in _RunMethod
    http_response = http_wrapper.MakeRequest(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/http_wrapper.py>", line 348, in MakeRequest
    return _MakeRequestNoRetry(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/http_wrapper.py>", line 397, in _MakeRequestNoRetry
    info, content = http.request(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/google_auth_httplib2.py>", line 218, in request
    response, content = self.http.request(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/httplib2/__init__.py>", line 1724, in request
    (response, content) = self._request(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/httplib2/__init__.py>", line 1444, in _request
    (response, content) = self._conn_request(conn, request_uri, method, body, headers)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/httplib2/__init__.py>", line 1396, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python3.8/http/client.py", line 1348, in getresponse
    response.begin()
  File "/usr/lib/python3.8/http/client.py", line 316, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.8/http/client.py", line 277, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib/python3.8/socket.py", line 669, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.8/ssl.py", line 1270, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.8/ssl.py", line 1128, in read
    return self._sslobj.read(len, buffer)

+++++++++++++++++++++++++++++++++++ Timeout ++++++++++++++++++++++++++++++++++++

+++++++++++++++++++++++++++++++++++ Timeout ++++++++++++++++++++++++++++++++++++

~~~~~~~~~~~~~~~~~~~~~ Stack of Thread-4 (140337011328768) ~~~~~~~~~~~~~~~~~~~~~~
  File "/usr/lib/python3.8/threading.py", line 890, in _bootstrap
    self._bootstrap_inner()
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 172, in poll_for_job_completion
    time.sleep(sleep_secs)

+++++++++++++++++++++++++++++++++++ Timeout ++++++++++++++++++++++++++++++++++++

> Task :sdks:python:test-suites:dataflow:py38:ioCrossLanguagePythonUsingJava FAILED

> Task :sdks:python:test-suites:dataflow:py38:ioCrossLanguageCleanup
Stopping expansion service pid: 992130.
Skipping invalid pid: 992131.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 989313

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py311:ioCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py38:ioCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

For more on this, please refer to 
https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings
 in the Gradle documentation.

BUILD FAILED in 2h 53m 17s
108 actionable tasks: 95 executed, 11 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/5xv4pptfyxjj4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org
