See <https://builds.apache.org/job/beam_PostCommit_Python36/1123/display/redirect?page=changes>
Changes:

[sunjincheng121] [BEAM-8733] Handle the registration request synchronously in the Python

------------------------------------------
[...truncated 557.95 KB...]
            ],
            "is_wrapper": true
          },
          "format": "bigquery",
          "parallel_input": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "s1"
          },
          "schema": "{\"fields\": [{\"name\": \"fruit\", \"type\": \"STRING\", \"mode\": \"NULLABLE\"}]}",
          "table": "output_table",
          "user_name": "write/WriteToBigQuery/NativeWrite",
          "write_disposition": "WRITE_EMPTY"
        }
      }
    ],
    "type": "JOB_TYPE_BATCH"
  }
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2019-12-02T21:12:16.197630Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-12-02_13_12_14-9308632473352110995'
 location: 'us-central1'
 name: 'beamapp-jenkins-1202211205-040454'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-12-02T21:12:16.197630Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-12-02_13_12_14-9308632473352110995]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_12_14-9308632473352110995?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-02_13_12_14-9308632473352110995 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:14.055Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-12-02_13_12_14-9308632473352110995.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:14.055Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-12-02_13_12_14-9308632473352110995. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:18.451Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:19.225Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:20.119Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:20.317Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:20.426Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:20.614Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:21.313Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:21.571Z: JOB_MESSAGE_DETAILED: Fusing consumer write/WriteToBigQuery/NativeWrite into read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:21.672Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:21.742Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:21.805Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:21.864Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:22.236Z: JOB_MESSAGE_DEBUG: Executing wait step start3
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:22.305Z: JOB_MESSAGE_BASIC: Executing operation read+write/WriteToBigQuery/NativeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:22.351Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:22.383Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:24.855Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_2349317521065527605". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_2349317521065527605".
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:12:47.927Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:30.127Z: JOB_MESSAGE_BASIC: BigQuery query completed, job : "dataflow_job_2349317521065527605"
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:30.907Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_5523625681605120835" started. You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_5523625681605120835".
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:01.262Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_5523625681605120835" observed total of 1 exported files thus far.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:01.285Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_5523625681605120835"
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:02.613Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-f failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 1250.0 in region us-central1.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:02.649Z: JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:02.700Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_2349317521065529015". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_2349317521065529015".
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:03.630Z: JOB_MESSAGE_WARNING: S01:read+write/WriteToBigQuery/NativeWrite failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:03.666Z: JOB_MESSAGE_BASIC: Finished operation read+write/WriteToBigQuery/NativeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:03.777Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:04.131Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:04.165Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:23.498Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:14:23.527Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-02_13_12_14-9308632473352110995 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_02_45-15146059021751424153?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_23_06-3663239130926173531?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_31_45-7648118231018033516?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_40_36-14685110054899291515?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_02_47-1840969837137482263?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_15_01-11994529837062384138?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_24_20-15955406810939498654?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_31_54-15786643903200055410?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_02_45-14830812329102796621?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_21_24-8663789958778345739?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_39_08-17149874882234151477?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_48_05-11409312535594028669?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_02_45-18134582669203543330?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_12_14-9308632473352110995?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_14_47-7829035981133461364?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_23_03-5454402782233033550?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_31_27-13752996596709287664?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_39_28-17632037571658061990?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_02_47-11963255821928776210?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_12_22-2667695930255845389?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_14_35-9931022524169043902?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_24_25-17102230595379914042?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_32_07-12052984817026574972?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_02_44-16323105184679290742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_11_45-317118225600620312?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_20_31-2732891175942890995?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_29_53-15231053432953929741?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_39_11-7402661621308202136?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_02_46-4721009526151551131?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_12_10-2912941504561268204?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_20_08-17637059656142053197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_28_26-12894970421678752870?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_37_24-12747407210346543883?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_45_27-16349972905175589782?project=apache-beam-testing
======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/nose/plugins/multiprocess.py>", line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/nose/case.py>", line 46, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/nose/case.py>", line 134, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/nose/case.py>", line 152, in runTest
    test(result)
  File "/usr/lib/python3.6/unittest/case.py", line 653, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python3.6/unittest/case.py", line 605, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>", line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py>", line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py>", line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py>", line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 94, in wait_until_in_state
    job_state = self.result.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/http_wrapper.py>", line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/http_wrapper.py>", line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/oauth2client/transport.py>", line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/oauth2client/transport.py>", line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/httplib2/__init__.py>", line 1924, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/httplib2/__init__.py>", line 1595, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/httplib2/__init__.py>", line 1533, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python3.6/http/client.py", line 1331, in getresponse
    response.begin()
  File "/usr/lib/python3.6/http/client.py", line 297, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.6/http/client.py", line 258, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib/python3.6/socket.py", line 586, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.6/ssl.py", line 1012, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.6/ssl.py", line 874, in read
    return self._sslobj.read(len, buffer)
  File "/usr/lib/python3.6/ssl.py", line 631, in read
    v = self._sslobj.read(len, buffer)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/nose/plugins/multiprocess.py>", line 276, in signalhandler
    raise TimedOutException()
nose.plugins.multiprocess.TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
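For context on the TimedOutException above: nose's multiprocess plugin enforces its per-test timeout with a SIGALRM handler (the `signalhandler` frame at multiprocess.py:276), which interrupts the hung test — here one blocked in an SSL read while polling the Dataflow job state. A minimal stdlib-only sketch of that alarm-based timeout mechanism (the handler and exception names mirror the traceback; the blocked call is simulated with `sleep`, and this is an illustration, not the plugin's actual code):

```python
import signal
import time

class TimedOutException(Exception):
    """Stand-in for nose.plugins.multiprocess.TimedOutException."""

def signalhandler(signum, frame):
    # Like the plugin's handler: abort whatever the process is blocked on
    # by raising inside the interrupted frame.
    raise TimedOutException()

signal.signal(signal.SIGALRM, signalhandler)  # POSIX-only, like the plugin
signal.alarm(1)  # deadline: 1 second (nose takes this from --process-timeout)

try:
    time.sleep(5)  # stands in for the test stuck in a blocking network read
    result = "finished"
except TimedOutException:
    result = "timed out"
finally:
    signal.alarm(0)  # cancel any pending alarm

print(result)  # -> timed out
```

Because the alarm fires mid-`sleep`, the exception surfaces from deep inside the blocking call, which is exactly why the traceback above bottoms out in `ssl.py` rather than in the test itself.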
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5397.349s

FAILED (SKIP=7, errors=2)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 30m 55s

83 actionable tasks: 62 executed, 21 from cache

Publishing build scan...
https://scans.gradle.com/s/xzrldjf4gj6ms

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org