[ https://issues.apache.org/jira/browse/BEAM-2264?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16884150#comment-16884150 ]

Udi Meiri commented on BEAM-2264:
---------------------------------

Any solution here would need careful testing to verify that existing ways of 
passing credentials to Beam are not broken.
Typically Beam uses GCE service account credentials, but other methods are 
used when running locally.

Solutions:
1. Use OAuth2Credentials objects, which are compatible with 
sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py 
(they have the authorize() method).
The code in internal/gcp/auth.py currently has a custom subclass, 
_GCEMetadataCredentials(OAuth2Credentials), with its own _refresh() method 
(which refreshes the credential token). That implementation is not thread-safe.
- It should be possible to use OAuth2Credentials' own refresh implementation, 
which is thread-safe (the self.store object is synchronized; see also this 
note: 
https://github.com/googleapis/google-api-python-client/blob/5c11b0a1b2658b26fe41b13ebd2e9e7b53c1ab01/docs/thread_safety.md#credential-storage-objects-are-thread-safe).
 This would allow using just one credential object for the lifetime of the 
process instead of the current approach of one per API call.
- The only possible regression is the loss of the custom retry decorator in 
Beam's _refresh implementation. More investigation is needed here, but I 
suspect that there is a default retry and it should be sufficient.
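The thread-safe refresh behavior described above can be sketched with plain-Python stand-ins (no oauth2client dependency; Store and Credentials here are hypothetical classes mirroring the locked get/put shape of oauth2client's Storage, not Beam code):

{code}
import threading


class Store(object):
  """Stand-in for oauth2client's Storage: access is serialized by a lock."""

  def __init__(self):
    self._lock = threading.Lock()
    self._credentials = None

  def locked_get(self):
    with self._lock:
      return self._credentials

  def locked_put(self, credentials):
    with self._lock:
      self._credentials = credentials


class Credentials(object):
  """Stand-in credential whose refresh() runs at most once under contention."""

  def __init__(self, store):
    self._store = store
    self._refresh_lock = threading.Lock()
    self.token = None
    self.refresh_count = 0

  def refresh(self):
    with self._refresh_lock:
      if self.token is None:  # another thread may have refreshed already
        self.refresh_count += 1
        self.token = 'token-%d' % self.refresh_count
        self._store.locked_put(self)


creds = Credentials(Store())
threads = [threading.Thread(target=creds.refresh) for _ in range(8)]
for t in threads:
  t.start()
for t in threads:
  t.join()
# All eight threads raced to refresh, but only one refresh actually happened.
{code}

With a shared, synchronized store like this, one credential object can serve the whole process instead of one per API call.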

2. Use google.auth.credentials.Credentials objects, which are not compatible 
with the above storage_v1_client.py (and probably other similarly generated 
clients). This would require migrating to newer-style clients such as 
google-cloud-storage.

The advantage of this approach is simpler code.
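A hedged sketch of what solution 2 would look like, assuming the google-cloud-storage package is available (the project name is a placeholder, and AnonymousCredentials stands in for real credentials so the snippet constructs a client without a GCP environment):

{code}
from google.auth.credentials import AnonymousCredentials
from google.cloud import storage

# Newer-style clients accept google.auth.credentials.Credentials directly,
# so one credential object can be created once and reused by every client
# in the process.
credentials = AnonymousCredentials()
client = storage.Client(project='example-project', credentials=credentials)
{code}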

Example code that uses google.auth.credentials.Credentials and caches the 
credential object (caching should also work for solution 1; see the notes there):
{code}
import logging
import threading

import google.auth
import google.auth.exceptions


def get_service_credentials():
  """For internal use only; no backwards-compatibility guarantees.

  Get credentials to access Google services.

  Returns:
    A ``google.auth.credentials.Credentials`` object or None if credentials not
    found. Returned object is thread-safe.
  """
  return _Credentials.get_service_credentials()


class _Credentials(object):
  _credentials_lock = threading.Lock()
  _credentials_init = False
  _credentials = None

  @classmethod
  def get_service_credentials(cls):
    if cls._credentials_init:
      return cls._credentials

    client_scopes = [
      'https://www.googleapis.com/auth/bigquery',
      'https://www.googleapis.com/auth/cloud-platform',
      'https://www.googleapis.com/auth/devstorage.full_control',
      'https://www.googleapis.com/auth/userinfo.email',
      'https://www.googleapis.com/auth/datastore'
    ]
    try:
      with cls._credentials_lock:
        if cls._credentials_init:
          return cls._credentials
        cls._credentials, project_id = google.auth.default(client_scopes)
        # is_running_in_gce and executing_project are module-level globals
        # in internal/gcp/auth.py.
        if is_running_in_gce:
          assert project_id == executing_project
        cls._credentials_init = True
        # TODO: remove?
        logging.info('Got credentials %r for project: %s',
                     cls._credentials, project_id)
    except google.auth.exceptions.DefaultCredentialsError as e:
      logging.warning(
          'Unable to find default credentials to use: %s\n'
          'Connecting anonymously.', e)
      return None
    return cls._credentials
{code}

> Re-use credential instead of generating a new one on each GCS call
> ------------------------------------------------------------------
>
>                 Key: BEAM-2264
>                 URL: https://issues.apache.org/jira/browse/BEAM-2264
>             Project: Beam
>          Issue Type: Improvement
>          Components: sdk-py-core
>            Reporter: Luke Cwik
>            Priority: Minor
>          Time Spent: 50m
>  Remaining Estimate: 0h
>
> We should cache the credential used within a Pipeline and re-use it instead 
> of generating a new one on each GCS call. When executing (against 2.0.0 RC2):
> {code}
> python -m apache_beam.examples.wordcount --input 
> "gs://dataflow-samples/shakespeare/*" --output local_counts
> {code}
> Note that we seemingly generate a new access token on each call instead of 
> only when a refresh is required.
> {code}
>   super(GcsIO, cls).__new__(cls, storage_client))
> INFO:root:Starting the size estimation of the input
> INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
> INFO:oauth2client.client:Refreshing access_token
> INFO:root:Finished the size estimation of the input at 1 files. Estimation 
> took 0.286200046539 seconds
> INFO:root:Running pipeline with DirectRunner.
> INFO:root:Starting the size estimation of the input
> INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
> INFO:oauth2client.client:Refreshing access_token
> INFO:root:Finished the size estimation of the input at 43 files. Estimation 
> took 0.205624818802 seconds
> INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
> INFO:oauth2client.client:Refreshing access_token
> INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
> INFO:oauth2client.client:Refreshing access_token
> INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
> INFO:oauth2client.client:Refreshing access_token
> INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
> INFO:oauth2client.client:Refreshing access_token
> INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
> ... many more times ...
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
