All of these are great suggestions. I think what I really need, though, is
a way to cleanly install (or reinstall) everything required to run all of
these commands: tox, yapf, and so on.

As it stands, I keep getting errors, try to install a dependency I think I am
missing, rinse and repeat, and never quite get to a state where I have
reliable tooling.
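
Concretely, I am picturing something like the steps below (rough sketch only:
the virtualenv path is just an example, I am guessing at the package list, and
yapf is pinned to the 0.29.0 version the tox suite installs). The yapf
traceback further down suggests my current yapf is installed under Python 2.7,
so I would start from a fresh Python 3 virtualenv and recreate the stale .tox
environments:

# fresh Python 3 virtualenv (the path here is arbitrary)
python3.8 -m venv ~/beam-tooling
source ~/beam-tooling/bin/activate

# avoid stale pip/setuptools inside the new environment
pip install --upgrade pip setuptools

# tooling used in this thread; yapf pinned to match the tox config
pip install tox virtualenv pre-commit yapf==0.29.0

# then, from beam/sdks/python, -r recreates the .tox virtualenv from scratch
tox -r -e py3-yapf

Does that sound roughly right, or is there a recommended way to get a clean
setup that I am missing?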

On Mon, Nov 2, 2020 at 12:40 PM Sam Rohde <[email protected]> wrote:

> I personally run `tox -e py37-lint` and `tox -e py3-yapf` from the
> root/sdks/python directory and that catches most stuff. If you are adding
> type annotations then also running `tox -e py37-mypy` is a good choice.
> Note that tox supports tab completion, so you can see all the different
> options by double-pressing tab with `tox -e` in the root/sdks/python
> directory.
>
> On Wed, Oct 28, 2020 at 8:52 PM Alex Amato <[email protected]> wrote:
>
>> Thanks Chad, this was helpful. :)
>>
>> Btw, I think this helps format my PR somewhat, but some additional checks
>> are run when I push the PR that are not covered by this tool.
>>
>> My PR is running more checks under
>> *:sdks:python:test-suites:tox:py37:mypyPy37*
>>
>> I am curious if anyone knows a good command line to try before pushing
>> PRs to catch these issues locally first? (I had one in the past, but I
>> think it's outdated.)
>>
>>
>>
>> On Wed, Oct 28, 2020 at 8:41 PM Pablo Estrada <[email protected]> wrote:
>>
>>> woah I didn't know about this tool at all Chad. It looks nice : )
>>> FWIW, if you feel up to it, I've given you edit access to the Beam wiki (
>>> https://cwiki.apache.org/confluence/display/BEAM) in case you'd like to
>>> add the tip.
>>> Thanks!
>>> -P.
>>>
>>> On Wed, Oct 28, 2020 at 8:09 PM Chad Dombrova <[email protected]> wrote:
>>>
>>>> I would like to edit it!  I have an Apache account and I am a committer,
>>>> but IIRC I could not edit it with my normal credentials.
>>>>
>>>>
>>>> On Wed, Oct 28, 2020 at 8:02 PM Robert Burke <[email protected]>
>>>> wrote:
>>>>
>>>>> (it's a wiki, so anyone who requests an account can improve it)
>>>>>
>>>>> On Wed, Oct 28, 2020, 7:45 PM Chad Dombrova <[email protected]> wrote:
>>>>>
>>>>>> It’s unfortunate that those instructions don’t include pre-commit,
>>>>>> which is by far the easiest way to do this.
>>>>>>
>>>>>> To set it up:
>>>>>>
>>>>>> pip install pre-commit
>>>>>> pre-commit install
>>>>>>
>>>>>> The install step sets up git pre-commit hooks so that yapf and pylint run
>>>>>> on changed files every time you commit (you’ll need python3.7; I think it
>>>>>> should be possible to loosen this requirement, as it has been an annoyance
>>>>>> for me).
>>>>>>
>>>>>> To skip running the check on commit add -n:
>>>>>>
>>>>>> git commit -nm "blah blah"
>>>>>>
>>>>>> Alternatively, to run the check manually on changed files (pre-commit
>>>>>> install is not required to run it this way):
>>>>>>
>>>>>> pre-commit run yapf
>>>>>>
>>>>>> Or on all files:
>>>>>>
>>>>>> pre-commit run -a yapf
>>>>>>
>>>>>> More info here: https://pre-commit.com/#config-language_version
>>>>>>
>>>>>> On Wed, Oct 28, 2020 at 6:46 PM Alex Amato <[email protected]>
>>>>>> wrote:
>>>>>>
>>>>>>> I tried both the tox and yapf instructions on the python tips page
>>>>>>> <https://cwiki.apache.org/confluence/display/BEAM/Python+Tips#PythonTips-Formatting>.
>>>>>>> I also tried the gradle target, which failed on PR precommit. I am
>>>>>>> wondering if there is something additional I need to set up.
>>>>>>>
>>>>>>> Here is the output from all three approaches I attempted.
>>>>>>> Any ideas on how to get this working?
>>>>>>>
>>>>>>> *(ajamato_env2) ajamato@ajamato-linux0:~/beam/sdks/python$ git diff
>>>>>>> --name-only --relative bigquery_python_sdk origin/master | xargs yapf
>>>>>>> --in-place*
>>>>>>> Traceback (most recent call last):
>>>>>>>   File "/usr/local/google/home/ajamato/.local/bin/yapf", line 8, in
>>>>>>> <module>
>>>>>>>     sys.exit(run_main())
>>>>>>>   File
>>>>>>> "/usr/local/google/home/ajamato/.local/lib/python2.7/site-packages/yapf/__init__.py",
>>>>>>> line 365, in run_main
>>>>>>>     sys.exit(main(sys.argv))
>>>>>>>   File
>>>>>>> "/usr/local/google/home/ajamato/.local/lib/python2.7/site-packages/yapf/__init__.py",
>>>>>>> line 135, in main
>>>>>>>     verbose=args.verbose)
>>>>>>>   File
>>>>>>> "/usr/local/google/home/ajamato/.local/lib/python2.7/site-packages/yapf/__init__.py",
>>>>>>> line 204, in FormatFiles
>>>>>>>     in_place, print_diff, verify, quiet, verbose)
>>>>>>>   File
>>>>>>> "/usr/local/google/home/ajamato/.local/lib/python2.7/site-packages/yapf/__init__.py",
>>>>>>> line 233, in _FormatFile
>>>>>>>     logger=logging.warning)
>>>>>>>   File
>>>>>>> "/usr/local/google/home/ajamato/.local/lib/python2.7/site-packages/yapf/yapflib/yapf_api.py",
>>>>>>> line 100, in FormatFile
>>>>>>>     verify=verify)
>>>>>>>   File
>>>>>>> "/usr/local/google/home/ajamato/.local/lib/python2.7/site-packages/yapf/yapflib/yapf_api.py",
>>>>>>> line 147, in FormatCode
>>>>>>>     tree = pytree_utils.ParseCodeToTree(unformatted_source)
>>>>>>>   File
>>>>>>> "/usr/local/google/home/ajamato/.local/lib/python2.7/site-packages/yapf/yapflib/pytree_utils.py",
>>>>>>> line 127, in ParseCodeToTree
>>>>>>>     raise e
>>>>>>>   File "apache_beam/metrics/execution.pxd", line 18
>>>>>>>     cimport cython
>>>>>>>                  ^
>>>>>>> SyntaxError: invalid syntax
>>>>>>>
>>>>>>> *(ajamato_env2) ajamato@ajamato-linux0:~/beam/sdks/python$ tox -e
>>>>>>> py3-yapf*
>>>>>>> GLOB sdist-make:
>>>>>>> /usr/local/google/home/ajamato/beam/sdks/python/setup.py
>>>>>>> py3-yapf create:
>>>>>>> /usr/local/google/home/ajamato/beam/sdks/python/target/.tox/py3-yapf
>>>>>>> ERROR: invocation failed (exit code 1), logfile:
>>>>>>> /usr/local/google/home/ajamato/beam/sdks/python/target/.tox/py3-yapf/log/py3-yapf-0.log
>>>>>>> ===============================================================================================
>>>>>>> log start
>>>>>>> ================================================================================================
>>>>>>> RuntimeError: failed to build image pkg_resources because:
>>>>>>> Traceback (most recent call last):
>>>>>>>   File
>>>>>>> "/usr/lib/python3/dist-packages/virtualenv/seed/embed/via_app_data/via_app_data.py",
>>>>>>> line 60, in _install
>>>>>>>     installer.install(creator.interpreter.version_info)
>>>>>>>   File
>>>>>>> "/usr/lib/python3/dist-packages/virtualenv/seed/embed/via_app_data/pip_install/base.py",
>>>>>>> line 42, in install
>>>>>>>     self._sync(filename, into)
>>>>>>>   File
>>>>>>> "/usr/lib/python3/dist-packages/virtualenv/seed/embed/via_app_data/pip_install/copy.py",
>>>>>>> line 13, in _sync
>>>>>>>     copy(src, dst)
>>>>>>>   File
>>>>>>> "/usr/lib/python3/dist-packages/virtualenv/util/path/_sync.py", line 
>>>>>>> 53, in
>>>>>>> copy
>>>>>>>     method(norm(src), norm(dest))
>>>>>>>   File
>>>>>>> "/usr/lib/python3/dist-packages/virtualenv/util/path/_sync.py", line 
>>>>>>> 64, in
>>>>>>> copytree
>>>>>>>     shutil.copy(src_f, dest_f)
>>>>>>>   File "/usr/lib/python3.8/shutil.py", line 415, in copy
>>>>>>>     copyfile(src, dst, follow_symlinks=follow_symlinks)
>>>>>>>   File "/usr/lib/python3.8/shutil.py", line 261, in copyfile
>>>>>>>     with open(src, 'rb') as fsrc, open(dst, 'wb') as fdst:
>>>>>>> FileNotFoundError: [Errno 2] No such file or directory:
>>>>>>> '/usr/local/google/home/ajamato/beam/sdks/python/target/.tox/py3-yapf/lib/python3.8/site-packages/pkg_resources/_vendor/packaging/__init__.py'
>>>>>>>
>>>>>>>
>>>>>>> ================================================================================================
>>>>>>> log end
>>>>>>> =================================================================================================
>>>>>>> ERROR: InvocationError for command /usr/bin/python3 -m virtualenv
>>>>>>> --no-download --python /usr/bin/python3 py3-yapf (exited with code 1)
>>>>>>> ________________________________________________________________________________________________
>>>>>>> summary
>>>>>>> _________________________________________________________________________________________________
>>>>>>> ERROR:   py3-yapf: InvocationError for command /usr/bin/python3 -m
>>>>>>> virtualenv --no-download --python /usr/bin/python3 py3-yapf (exited with
>>>>>>> code 1)
>>>>>>> (ajamato_env2) ajamato@ajamato-linux0:~/beam/sdks/python$
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> *ajamato@ajamato-linux0:~/beam$ ./gradlew
>>>>>>> :sdks:python:test-suites:tox:py38:formatter*
>>>>>>> To honour the JVM settings for this build a new JVM will be forked.
>>>>>>> Please consider using the daemon:
>>>>>>> https://docs.gradle.org/6.6.1/userguide/gradle_daemon.html.
>>>>>>> Daemon will be stopped at the end of the build stopping after
>>>>>>> processing
>>>>>>> Configuration on demand is an incubating feature.
>>>>>>>
>>>>>>> > Task :sdks:python:test-suites:tox:py38:formatter
>>>>>>> GLOB sdist-make:
>>>>>>> /usr/local/google/home/ajamato/beam/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/setup.py
>>>>>>> py3-yapf-check recreate:
>>>>>>> /usr/local/google/home/ajamato/beam/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py3-yapf-check/py3-yapf-check
>>>>>>> py3-yapf-check installdeps: yapf==0.29.0
>>>>>>> py3-yapf-check inst:
>>>>>>> /usr/local/google/home/ajamato/beam/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py3-yapf-check/.tmp/package/1/apache-beam-2.26.0.dev0.zip
>>>>>>> py3-yapf-check installed: apache-beam @
>>>>>>> file:///usr/local/google/home/ajamato/beam/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py3-yapf-check/.tmp/package/1/apache-beam-2.26.0.dev0.zip,apipkg==1.5,atomicwrites==1.4.0,attrs==20.2.0,avro-python3==1.9.2.1,blindspin==2.0.1,certifi==2020.6.20,chardet==3.0.4,colorama==0.4.4,crayons==0.4.0,crcmod==1.7,deprecation==2.1.0,dill==0.3.1.1,docker==4.3.1,docopt==0.6.2,execnet==1.7.1,fastavro==1.0.0.post1,freezegun==1.0.0,future==0.18.2,grpcio==1.33.2,hdfs==2.5.8,httplib2==0.17.4,idna==2.10,mock==2.0.0,more-itertools==8.5.0,nose==1.3.7,nose-xunitmp==0.4.1,numpy==1.19.3,oauth2client==4.1.3,packaging==20.4,pandas==1.1.3,parameterized==0.7.4,pbr==5.5.1,pluggy==0.13.1,protobuf==3.13.0,psycopg2-binary==2.8.6,py==1.9.0,pyarrow==0.17.1,pyasn1==0.4.8,pyasn1-modules==0.2.8,pydot==1.4.1,PyHamcrest==1.10.1,pymongo==3.11.0,pyparsing==2.4.7,pytest==4.6.11,pytest-forked==1.3.0,pytest-timeout==1.4.2,pytest-xdist==1.34.0,python-dateutil==2.8.1,pytz==2020.1,PyYAML==5.3.1,requests==2.24.0,requests-mock==1.8.0,rsa==4.6,six==1.15.0,SQLAlchemy==1.3.20,tenacity==5.1.5,testcontainers==3.1.0,typing-extensions==3.7.4.3,urllib3==1.25.11,wcwidth==0.2.5,websocket-client==0.57.0,wrapt==1.12.1,yapf==0.29.0
>>>>>>> py3-yapf-check run-test-pre: PYTHONHASHSEED='2074298265'
>>>>>>> py3-yapf-check run-test-pre: commands[0] | python --version
>>>>>>> Python 3.8.5
>>>>>>> py3-yapf-check run-test-pre: commands[1] | pip --version
>>>>>>> pip 20.2.4 from
>>>>>>> /usr/local/google/home/ajamato/beam/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py3-yapf-check/py3-yapf-check/lib/python3.8/site-packages/pip
>>>>>>> (python 3.8)
>>>>>>> py3-yapf-check run-test-pre: commands[2] | pip check
>>>>>>> No broken requirements found.
>>>>>>> py3-yapf-check run-test-pre: commands[3] | bash
>>>>>>> /usr/local/google/home/ajamato/beam/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh
>>>>>>> py3-yapf-check run-test: commands[0] | yapf --version
>>>>>>> yapf 0.29.0
>>>>>>> py3-yapf-check run-test: commands[1] | time yapf --diff --parallel
>>>>>>> --recursive apache_beam
>>>>>>> --- apache_beam/runners/worker/sdk_worker.py    (original)
>>>>>>> +++ apache_beam/runners/worker/sdk_worker.py    (reformatted)
>>>>>>> @@ -332,7 +332,7 @@
>>>>>>>    def _request_harness_monitoring_infos(self, request):
>>>>>>>      # type: (beam_fn_api_pb2.InstructionRequest) -> None
>>>>>>>      process_wide_monitoring_infos =
>>>>>>> MetricsEnvironment.process_wide_container(
>>>>>>> -        ).to_runner_api_monitoring_infos(None).values()
>>>>>>> +    ).to_runner_api_monitoring_infos(None).values()
>>>>>>>      self._execute(
>>>>>>>          lambda: beam_fn_api_pb2.InstructionResponse(
>>>>>>>              instruction_id=request.instruction_id,
>>>>>>> @@ -341,16 +341,17 @@
>>>>>>>                      monitoring_data={
>>>>>>>                          SHORT_ID_CACHE.getShortId(info):
>>>>>>> info.payload
>>>>>>>                          for info in process_wide_monitoring_infos
>>>>>>> -                    }))), request)
>>>>>>> +                    }))),
>>>>>>> +        request)
>>>>>>>
>>>>>>>    def _request_monitoring_infos(self, request):
>>>>>>>      # type: (beam_fn_api_pb2.InstructionRequest) -> None
>>>>>>>      self._execute(
>>>>>>>          lambda: beam_fn_api_pb2.InstructionResponse(
>>>>>>> -        instruction_id=request.instruction_id,
>>>>>>> -
>>>>>>>  monitoring_infos=beam_fn_api_pb2.MonitoringInfosMetadataResponse(
>>>>>>> -            monitoring_info=SHORT_ID_CACHE.getInfos(
>>>>>>> -                request.monitoring_infos.monitoring_info_id))),
>>>>>>> +            instruction_id=request.instruction_id,
>>>>>>> +
>>>>>>>  monitoring_infos=beam_fn_api_pb2.MonitoringInfosMetadataResponse(
>>>>>>> +                monitoring_info=SHORT_ID_CACHE.getInfos(
>>>>>>> +                    request.monitoring_infos.monitoring_info_id))),
>>>>>>>          request)
>>>>>>>
>>>>>>>    def _request_execute(self, request):
>>>>>>> --- apache_beam/metrics/execution.py    (original)
>>>>>>> +++ apache_beam/metrics/execution.py    (reformatted)
>>>>>>> @@ -150,7 +150,6 @@
>>>>>>>      return self.committed if self.committed else self.attempted
>>>>>>>
>>>>>>>
>>>>>>> -
>>>>>>>  class _MetricsEnvironment(object):
>>>>>>>    """Holds the MetricsContainer for every thread and other metric
>>>>>>> information.
>>>>>>>
>>>>>>> @@ -246,7 +245,6 @@
>>>>>>>
>>>>>>>    Or the metrics associated with the process/SDK harness. I.e.
>>>>>>> memory usage.
>>>>>>>    """
>>>>>>> -
>>>>>>>    def __init__(self, step_name):
>>>>>>>      self.step_name = step_name
>>>>>>>      self.metrics = dict()  # type: Dict[_TypedMetricName,
>>>>>>> MetricCell]
>>>>>>> @@ -315,7 +313,8 @@
>>>>>>>      """Returns a list of MonitoringInfos for the metrics in this
>>>>>>> container."""
>>>>>>>      all_metrics = [
>>>>>>>          cell.to_runner_api_monitoring_info(key.metric_name,
>>>>>>> transform_id)
>>>>>>> -        for key, cell in self.metrics.items()
>>>>>>> +        for key,
>>>>>>> +        cell in self.metrics.items()
>>>>>>>      ]
>>>>>>>      return {
>>>>>>>          monitoring_infos.to_key(mi): mi
>>>>>>> @@ -332,6 +331,7 @@
>>>>>>>
>>>>>>>
>>>>>>>  PROCESS_WIDE_METRICS_CONTAINER = MetricsContainer(None)
>>>>>>> +
>>>>>>>
>>>>>>>  class MetricUpdates(object):
>>>>>>>    """Contains updates for several metrics.
>>>>>>> --- apache_beam/runners/worker/sdk_worker_test.py       (original)
>>>>>>> +++ apache_beam/runners/worker/sdk_worker_test.py       (reformatted)
>>>>>>> @@ -51,8 +51,8 @@
>>>>>>>  from apache_beam.utils import thread_pool_executor
>>>>>>>  from apache_beam.utils.counters import CounterName
>>>>>>>
>>>>>>> -
>>>>>>>  _LOGGER = logging.getLogger(__name__)
>>>>>>> +
>>>>>>>
>>>>>>>  class
>>>>>>> BeamFnControlServicer(beam_fn_api_pb2_grpc.BeamFnControlServicer):
>>>>>>>    def __init__(self, requests, raise_errors=True):
>>>>>>> @@ -227,54 +227,42 @@
>>>>>>>    def test_harness_monitoring_infos_and_metadata(self):
>>>>>>>      # Create a process_wide metric.
>>>>>>>      urn = 'my.custom.urn'
>>>>>>> -    labels = {'key' : 'value'}
>>>>>>> +    labels = {'key': 'value'}
>>>>>>>      request_counter = InternalMetrics.counter(
>>>>>>>          urn=urn, labels=labels, process_wide=True).inc(10)
>>>>>>>
>>>>>>>      harness_monitoring_infos_request =
>>>>>>> beam_fn_api_pb2.InstructionRequest(
>>>>>>>          instruction_id="monitoring_infos",
>>>>>>> -
>>>>>>>  
>>>>>>> harness_monitoring_infos=beam_fn_api_pb2.HarnessMonitoringInfosRequest()
>>>>>>> -    )
>>>>>>> +
>>>>>>>  harness_monitoring_infos=beam_fn_api_pb2.HarnessMonitoringInfosRequest(
>>>>>>> +        ))
>>>>>>>
>>>>>>>      monitoring_infos_metadata_request =
>>>>>>> beam_fn_api_pb2.InstructionRequest(
>>>>>>>          instruction_id="monitoring_infos_metadata",
>>>>>>>
>>>>>>>  monitoring_infos=beam_fn_api_pb2.MonitoringInfosMetadataRequest(
>>>>>>> -            monitoring_info_id=['1']
>>>>>>> -        )
>>>>>>> -    )
>>>>>>> -
>>>>>>> -    responses = self.get_responses([
>>>>>>> -        harness_monitoring_infos_request,
>>>>>>> -        monitoring_infos_metadata_request
>>>>>>> -    ])
>>>>>>> +            monitoring_info_id=['1']))
>>>>>>> +
>>>>>>> +    responses = self.get_responses(
>>>>>>> +        [harness_monitoring_infos_request,
>>>>>>> monitoring_infos_metadata_request])
>>>>>>>
>>>>>>>      expected_monitoring_info = monitoring_infos.int64_counter(
>>>>>>>          urn, 10, labels=labels)
>>>>>>> -    expected_monitoring_data = {
>>>>>>> -        '1' : expected_monitoring_info.payload
>>>>>>> -    }
>>>>>>> -    self.assertEqual(responses['monitoring_infos'],
>>>>>>> +    expected_monitoring_data = {'1':
>>>>>>> expected_monitoring_info.payload}
>>>>>>> +    self.assertEqual(
>>>>>>> +        responses['monitoring_infos'],
>>>>>>>          beam_fn_api_pb2.InstructionResponse(
>>>>>>>              instruction_id='monitoring_infos',
>>>>>>>              harness_monitoring_infos=(
>>>>>>>                  beam_fn_api_pb2.HarnessMonitoringInfosResponse(
>>>>>>> -                    monitoring_data=expected_monitoring_data)
>>>>>>> -            )
>>>>>>> -        )
>>>>>>> -    )
>>>>>>> +                    monitoring_data=expected_monitoring_data))))
>>>>>>>
>>>>>>>      expected_monitoring_info.ClearField("payload")
>>>>>>> -    expected_monitoring_infos = {
>>>>>>> -        '1' : expected_monitoring_info
>>>>>>> -    }
>>>>>>> -    self.assertEqual(responses['monitoring_infos_metadata'],
>>>>>>> +    expected_monitoring_infos = {'1': expected_monitoring_info}
>>>>>>> +    self.assertEqual(
>>>>>>> +        responses['monitoring_infos_metadata'],
>>>>>>>          beam_fn_api_pb2.InstructionResponse(
>>>>>>>              instruction_id='monitoring_infos_metadata',
>>>>>>>
>>>>>>>  monitoring_infos=beam_fn_api_pb2.MonitoringInfosMetadataResponse(
>>>>>>> -                monitoring_info=expected_monitoring_infos
>>>>>>> -            )
>>>>>>> -        )
>>>>>>> -    )
>>>>>>> +                monitoring_info=expected_monitoring_infos)))
>>>>>>>
>>>>>>>    def
>>>>>>> test_failed_bundle_processor_returns_failed_split_response(self):
>>>>>>>      bundle_processor = mock.MagicMock()
>>>>>>> --- apache_beam/metrics/metricbase.py   (original)
>>>>>>> +++ apache_beam/metrics/metricbase.py   (reformatted)
>>>>>>> @@ -76,8 +76,9 @@
>>>>>>>      self.labels = labels if labels else {}
>>>>>>>
>>>>>>>    def __eq__(self, other):
>>>>>>> -    return (self.namespace == other.namespace and self.name ==
>>>>>>> other.name and
>>>>>>> -            self.urn == other.urn and self.labels == other.labels)
>>>>>>> +    return (
>>>>>>> +        self.namespace == other.namespace and self.name ==
>>>>>>> other.name and
>>>>>>> +        self.urn == other.urn and self.labels == other.labels)
>>>>>>>
>>>>>>>    def __ne__(self, other):
>>>>>>>      # TODO(BEAM-5949): Needed for Python 2 compatibility.
>>>>>>> --- apache_beam/metrics/monitoring_infos.py     (original)
>>>>>>> +++ apache_beam/metrics/monitoring_infos.py     (reformatted)
>>>>>>> @@ -98,6 +98,7 @@
>>>>>>>  BIGQUERY_QUERY_NAME_LABEL = (
>>>>>>>
>>>>>>> common_urns.monitoring_info_labels.BIGQUERY_QUERY_NAME.label_props.name
>>>>>>> )
>>>>>>>
>>>>>>> +
>>>>>>>  def extract_counter_value(monitoring_info_proto):
>>>>>>>    """Returns the counter value of the monitoring info."""
>>>>>>>    if not is_counter(monitoring_info_proto):
>>>>>>> @@ -178,8 +179,7 @@
>>>>>>>      pcollection: The pcollection id used as a label.
>>>>>>>    """
>>>>>>>    labels = labels or dict()
>>>>>>> -  labels.update(create_labels(
>>>>>>> -      ptransform=ptransform, pcollection=pcollection))
>>>>>>> +  labels.update(create_labels(ptransform=ptransform,
>>>>>>> pcollection=pcollection))
>>>>>>>    if isinstance(metric, int):
>>>>>>>      metric = coders.VarIntCoder().encode(metric)
>>>>>>>    return create_monitoring_info(urn, SUM_INT64_TYPE, metric, labels)
>>>>>>> --- apache_beam/metrics/monitoring_infos_test.py        (original)
>>>>>>> +++ apache_beam/metrics/monitoring_infos_test.py        (reformatted)
>>>>>>> @@ -93,18 +93,20 @@
>>>>>>>      expected_labels[monitoring_infos.SERVICE_LABEL] = "BigQuery"
>>>>>>>
>>>>>>>      labels = {
>>>>>>> -        monitoring_infos.SERVICE_LABEL:
>>>>>>> -            "BigQuery",
>>>>>>> +        monitoring_infos.SERVICE_LABEL: "BigQuery",
>>>>>>>      }
>>>>>>>      metric = CounterCell().get_cumulative()
>>>>>>>      result = monitoring_infos.int64_counter(
>>>>>>> -        monitoring_infos.API_REQUEST_COUNT_URN, metric,
>>>>>>> -        ptransform="ptransformname", pcollection="collectionname",
>>>>>>> +        monitoring_infos.API_REQUEST_COUNT_URN,
>>>>>>> +        metric,
>>>>>>> +        ptransform="ptransformname",
>>>>>>> +        pcollection="collectionname",
>>>>>>>          labels=labels)
>>>>>>>      counter_value = monitoring_infos.extract_counter_value(result)
>>>>>>>
>>>>>>>      self.assertEqual(0, counter_value)
>>>>>>>      self.assertEqual(result.labels, expected_labels)
>>>>>>>
>>>>>>> +
>>>>>>>  if __name__ == '__main__':
>>>>>>>    unittest.main()
>>>>>>> --- apache_beam/metrics/metric.py       (original)
>>>>>>> +++ apache_beam/metrics/metric.py       (reformatted)
>>>>>>> @@ -123,7 +123,6 @@
>>>>>>>
>>>>>>>    class DelegatingCounter(Counter):
>>>>>>>      """Metrics Counter that Delegates functionality to
>>>>>>> MetricsEnvironment."""
>>>>>>> -
>>>>>>>      def __init__(self, metric_name, process_wide=False):
>>>>>>>        # type: (MetricName, bool) -> None
>>>>>>>        super(Metrics.DelegatingCounter, self).__init__(metric_name)
>>>>>>> Command exited with non-zero status 1
>>>>>>> 240.82user 1.67system 0:25.33elapsed 957%CPU (0avgtext+0avgdata
>>>>>>> 63140maxresident)k
>>>>>>> 0inputs+0outputs (0major+217235minor)pagefaults 0swaps
>>>>>>> ERROR: InvocationError for command /usr/bin/time yapf --diff
>>>>>>> --parallel --recursive apache_beam (exited with code 1)
>>>>>>> py3-yapf-check run-test-post: commands[0] | bash
>>>>>>> /usr/local/google/home/ajamato/beam/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh
>>>>>>> ___________________________________ summary
>>>>>>> ____________________________________
>>>>>>> ERROR:   py3-yapf-check: commands failed
>>>>>>>
>>>>>>> > Task :sdks:python:test-suites:tox:py38:formatter FAILED
>>>>>>>
>>>>>>> FAILURE: Build failed with an exception.
>>>>>>>
>>>>>>> * What went wrong:
>>>>>>> Execution failed for task
>>>>>>> ':sdks:python:test-suites:tox:py38:formatter'.
>>>>>>> > Process 'command 'sh'' finished with non-zero exit value 1
>>>>>>>
>>>>>>> * Try:
>>>>>>> Run with --stacktrace option to get the stack trace. Run with --info
>>>>>>> or --debug option to get more log output. Run with --scan to get full
>>>>>>> insights.
>>>>>>>
>>>>>>> * Get more help at https://help.gradle.org
>>>>>>>
>>>>>>> Deprecated Gradle features were used in this build, making it
>>>>>>> incompatible with Gradle 7.0.
>>>>>>> Use '--warning-mode all' to show the individual deprecation warnings.
>>>>>>> See
>>>>>>> https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings
>>>>>>>
>>>>>>> BUILD FAILED in 1m 10s
>>>>>>> 4 actionable tasks: 1 executed, 3 up-to-date
>>>>>>>
>>>>>>>
