Hello community,

here is the log from the commit of package python-apache-libcloud for openSUSE:Factory checked in at 2020-03-08 22:22:34
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-apache-libcloud (Old)
 and      /work/SRC/openSUSE:Factory/.python-apache-libcloud.new.26092 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-apache-libcloud" Sun Mar 8 22:22:34 2020 rev:30 rq:782015 version:2.8.1 Changes: -------- --- /work/SRC/openSUSE:Factory/python-apache-libcloud/python-apache-libcloud.changes 2020-02-25 16:01:10.723913857 +0100 +++ /work/SRC/openSUSE:Factory/.python-apache-libcloud.new.26092/python-apache-libcloud.changes 2020-03-08 22:22:40.556023355 +0100 @@ -1,0 +2,7 @@ +Thu Mar 5 18:45:41 UTC 2020 - Niels Abspoel <abo...@gmail.com> + +- update to 2.8.1 + for the changelog see: + https://libcloud.readthedocs.io/en/stable/changelog.html#changes-in-apache-libcloud-v2-8-1 + +------------------------------------------------------------------- @@ -5,0 +13,5 @@ + +------------------------------------------------------------------- +Thu Feb 20 18:19:46 UTC 2020 - James Fehlig <jfeh...@suse.com> + +- Stop building for python2 Old: ---- apache-libcloud-2.8.0.tar.gz New: ---- apache-libcloud-2.8.1.tar.gz ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Other differences: ------------------ ++++++ python-apache-libcloud.spec ++++++ --- /var/tmp/diff_new_pack.og3aAF/_old 2020-03-08 22:22:41.192023747 +0100 +++ /var/tmp/diff_new_pack.og3aAF/_new 2020-03-08 22:22:41.196023750 +0100 @@ -16,9 +16,12 @@ # +# No longer build for python2 +%define skip_python2 1 + %{?!python_module:%define python_module() python-%{**} python3-%{**}} Name: python-apache-libcloud -Version: 2.8.0 +Version: 2.8.1 Release: 0 Summary: Abstraction over multiple cloud provider APIs License: Apache-2.0 @@ -41,7 +44,9 @@ BuildRequires: fdupes BuildRequires: python-backports.ssl_match_hostname BuildRequires: python-rpm-macros +%if ! 0%{?skip_python2} BuildRequires: python2 +%endif Requires: python-lxml Requires: python-requests Requires: python-typing @@ -78,8 +83,10 @@ %check # Skip ShellOutSSHClientTests tests which attempt to ssh to localhost +%if ! 0%{?skip_python2} python2 -m pytest -k 'not ShellOutSSHClientTests and \ not ElasticContainerDriverTestCase' +%endif # Note these two extra py3 failures are undesirable and should be fixed python3 -m pytest -k \ 'not test_consume_stderr_chunk_contains_part_of_multi_byte_utf8_character and \ ++++++ apache-libcloud-2.8.0.tar.gz -> apache-libcloud-2.8.1.tar.gz ++++++ diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/CHANGES.rst new/apache-libcloud-2.8.1/CHANGES.rst --- old/apache-libcloud-2.8.0/CHANGES.rst 2019-12-23 14:59:47.000000000 +0100 +++ new/apache-libcloud-2.8.1/CHANGES.rst 2020-02-29 22:24:53.000000000 +0100 @@ -1,6 +1,105 @@ Changelog ========= +Changes in Apache Libcloud v2.8.1 +--------------------------------- + +Common +~~~~~~~ + +- Fix ``LIBCLOUD_DEBUG_PRETTY_PRINT_RESPONSE`` functionality and make sure it + works correctly under Python 3 when ``response.read()`` function returns + unicode and not bytes. + + (GITHUB-1430) + [Tomaz Muraus] + +Compute +~~~~~~~ + +- [GCE] Fix ``list_nodes()`` method so it correctly handles pagination + and returns all the nodes if there are more than 500 nodes available + in total. + + Previously, only first 500 nodes were returned. + + Reported by @TheSushiChef. + (GITHUB-1409, GITHUB-1360) + [Tomaz Muraus] + +- Fix some incorrect type annotations in the base compute API. + + Reported by @dpeschman. + (GITHUB-1413) + [Tomaz Muraus] + +- [OpenStack] Fix error with getting node id in ``_to_floating_ip`` method + when region is not called ``nova``. 
+ (GITHUB-1411, GITHUB-1412) + [Miguel Caballer - @micafer] + +- [EC2] Fix ``ex_userdata`` keyword argument in the ``create_node()`` method + being ignored / not working correctly. + + NOTE: This regression has been inadvertently introduced in v2.8.0. + (GITHUB-1426) + [Dan Chaffelson - @Chaffelson] + +- [EC2] Update ``create_volume`` method to automatically select first available + availability zone if one is not explicitly provided via ``location`` argument. + [Tomaz Muraus] + +Storage +~~~~~~~ + +- [AWS S3] Fix upload object code so uploaded data MD5 checksum check is not + performed at the end of the upload when AWS KMS server side encryption is + used. + + If AWS KMS server side object encryption is used, ETag header value in the + response doesn't contain data MD5 digest so we can't perform a checksum + check. + + Reported by Jonathan Harden - @jfharden. + (GITHUB-1401, GITHUB-1406) + [Tomaz Muraus - @Kami] + +- [Google Storage] Fix a bug when uploading an object would fail and result + in 401 "invalid signature" error when object mime type contained mixed + casing and when S3 Interoperability authentication method was used. + + Reported by Will Abson - wabson. + (GITHUB-1417, GITHUB-1418) + [Tomaz Muraus] + +- Fix ``upload_object_via_stream`` method so "Illegal seek" errors which + can arise when calculating iterator content hash are ignored. Those errors + likely indicate that the underlying file handle / iterator is a pipe which + doesn't support seek and that the error is not fatal and we should still + proceed. + + Reported by Per Buer - @perbu. + + (GITHUB-1424, GITHUB-1427) + [Tomaz Muraus] + +DNS +~~~ + +- [Gandi Live] Update the driver and make sure it matches the latest service / + API updates. + (GITHUB-1416) + [Ryan Lee - @zepheiryan] + +- [CloudFlare] Fix ``export_zone_to_bind_format`` method. + + Previously it threw an exception, because ``record.extra`` dictionary + didn't contain ``priority`` key. + + Reported by James Montgomery - @gh-jamesmontgomery. + (GITHUB-1428, GITHUB-1429) + [Tomaz Muraus] + Changes in Apache Libcloud v2.8.0 --------------------------------- diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/PKG-INFO new/apache-libcloud-2.8.1/PKG-INFO --- old/apache-libcloud-2.8.0/PKG-INFO 2019-12-23 21:46:12.000000000 +0100 +++ new/apache-libcloud-2.8.1/PKG-INFO 2020-02-29 23:10:06.000000000 +0100 @@ -1,6 +1,6 @@ Metadata-Version: 1.2 Name: apache-libcloud -Version: 2.8.0 +Version: 2.8.1 Summary: A standard Python library that abstracts away differences among multiple cloud provider APIs. For more information and documentation, please see http://libcloud.apache.org Home-page: http://libcloud.apache.org/ Author: Apache Software Foundation diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/apache_libcloud.egg-info/PKG-INFO new/apache-libcloud-2.8.1/apache_libcloud.egg-info/PKG-INFO --- old/apache-libcloud-2.8.0/apache_libcloud.egg-info/PKG-INFO 2019-12-23 21:46:12.000000000 +0100 +++ new/apache-libcloud-2.8.1/apache_libcloud.egg-info/PKG-INFO 2020-02-29 23:10:05.000000000 +0100 @@ -1,6 +1,6 @@ Metadata-Version: 1.2 Name: apache-libcloud -Version: 2.8.0 +Version: 2.8.1 Summary: A standard Python library that abstracts away differences among multiple cloud provider APIs. 
For more information and documentation, please see http://libcloud.apache.org Home-page: http://libcloud.apache.org/ Author: Apache Software Foundation diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/apache_libcloud.egg-info/SOURCES.txt new/apache-libcloud-2.8.1/apache_libcloud.egg-info/SOURCES.txt --- old/apache-libcloud-2.8.0/apache_libcloud.egg-info/SOURCES.txt 2019-12-23 21:46:12.000000000 +0100 +++ new/apache-libcloud-2.8.1/apache_libcloud.egg-info/SOURCES.txt 2020-02-29 23:10:05.000000000 +0100 @@ -5,6 +5,7 @@ NOTICE README.rst example_compute.py +example_container.py example_dns.py example_loadbalancer.py example_storage.py @@ -2163,8 +2164,6 @@ libcloud/test/dns/fixtures/gandi_live/create_existing_record.json libcloud/test/dns/fixtures/gandi_live/create_record.json libcloud/test/dns/fixtures/gandi_live/create_zone.json -libcloud/test/dns/fixtures/gandi_live/delete_gandi_zone.json -libcloud/test/dns/fixtures/gandi_live/delete_record.json libcloud/test/dns/fixtures/gandi_live/get_bad_zone.json libcloud/test/dns/fixtures/gandi_live/get_mx_record.json libcloud/test/dns/fixtures/gandi_live/get_nonexistent_record.json diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/example_compute.py new/apache-libcloud-2.8.1/example_compute.py --- old/apache-libcloud-2.8.0/example_compute.py 2019-12-18 22:26:03.000000000 +0100 +++ new/apache-libcloud-2.8.1/example_compute.py 2020-02-28 23:54:12.000000000 +0100 @@ -21,28 +21,13 @@ from typing import Type, cast -ec2_cls = get_driver(Provider.EC2) -rackspace_cls = get_driver(Provider.RACKSPACE) +cls = get_driver(Provider.KUBEVIRT) -# NOTE: If you are using driver methods which are not part of the standard API, -# you need to explicitly cast the driver class reference to the correct class -# for type checking to work correctly -EC2 = cast(Type[EC2NodeDriver], ec2_cls) -Rackspace = cast(Type[RackspaceNodeDriver], rackspace_cls) +conn = cls(host='192.168.99.103', + port=8443, + secure=True, + key_file='/home/kami/.minikube/client.key', + cert_file='/home/kami/.minikube/client.crt', + ca_cert='/home/kami/.minikube/ca.crt') +print(conn.list_nodes()) -drivers = [EC2('access key id', 'secret key', region='us-east-1'), - Rackspace('username', 'api key', region='iad')] - -nodes = [] -for driver in drivers: - nodes.extend(driver.list_nodes()) - -print(nodes) -# [ <Node: provider=Amazon, status=RUNNING, name=bob, ip=1.2.3.4.5>, -# <Node: provider=Rackspace, status=REBOOT, name=korine, ip=6.7.8.9.10>, ... ] - -# grab the node named "test" -node = [n for n in nodes if n.name == 'test'][0] - -# reboot "test" -node.reboot() diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/example_container.py new/apache-libcloud-2.8.1/example_container.py --- old/apache-libcloud-2.8.0/example_container.py 1970-01-01 01:00:00.000000000 +0100 +++ new/apache-libcloud-2.8.1/example_container.py 2020-02-29 19:13:54.000000000 +0100 @@ -0,0 +1,30 @@ +# Licensed to the Apache Software Foundation (ASF) under one or more +# contributor license agreements. See the NOTICE file distributed with +# this work for additional information regarding copyright ownership. +# The ASF licenses this file to You under the Apache License, Version 2.0 +# (the "License"); you may not use this file except in compliance with +# the License. 
You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from libcloud.container.types import Provider +from libcloud.container.providers import get_driver + +cls = get_driver(Provider.GKE) + +# You can retrieve cluster ip by running "minikube ip" command +conn = cls('libcloud-t...@api-project-767966281678.iam.gserviceaccount.com', + '/home/kami/Downloads/api-project-767966281678-b44d02952d31.json', + project='767966281678') + +for cluster in conn.list_clusters(): + print(cluster.name) + +for container in conn.list_containers(): + print(container.name) diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/example_dns.py new/apache-libcloud-2.8.1/example_dns.py --- old/apache-libcloud-2.8.0/example_dns.py 2019-09-09 18:10:00.000000000 +0200 +++ new/apache-libcloud-2.8.1/example_dns.py 2020-02-28 23:54:12.000000000 +0100 @@ -18,12 +18,14 @@ from libcloud.dns.types import Provider from libcloud.dns.providers import get_driver -Zerigo = get_driver(Provider.ZERIGO) +Zerigo = get_driver(Provider.CLOUDFLARE) -driver = Zerigo('email', 'key') +driver = Zerigo('to...@tomaz.me', 'bae540b356fbf88ddb364875c9bb3ef4ab303') zones = driver.list_zones() pprint(zones) records = zones[0].list_records() pprint(records) + +print(zones[0].export_to_bind_format()) diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/example_storage.py new/apache-libcloud-2.8.1/example_storage.py --- old/apache-libcloud-2.8.0/example_storage.py 2019-11-20 13:52:50.000000000 +0100 +++ new/apache-libcloud-2.8.1/example_storage.py 2020-02-29 18:01:17.000000000 +0100 @@ -13,17 +13,61 @@ # See the License for the specific language governing permissions and # limitations under the License. 
+from io import BytesIO from pprint import pprint +import sys from libcloud.storage.types import Provider from libcloud.storage.providers import get_driver -CloudFiles = get_driver(Provider.CLOUDFILES) -driver = CloudFiles('access key id', 'secret key', region='ord') +Driver = get_driver(Provider.GOOGLE_STORAGE) +driver = Driver('libcloud-t...@api-project-767966281678.iam.gserviceaccount.com', + '/home/kami/Downloads/api-project-767966281678-b44d02952d31.json', + project='api-project-767966281678') + +Driver = get_driver(Provider.AZURE_BLOBS) +driver = Driver('libclouddevblobs', + 'CWNVu69mq/9HUX7+hLNEWPulX4/45KLYN306CpW0BBccV4Ot6JyPxXsHRxK+wGENCYMf97NqPYUEA0nUtnDnqg==') + + +Driver = get_driver(Provider.CLOUDFILES) +driver = Driver('kamislo', + 'ad514c7eb8a55dfefecc6a1a1770aa47', + region='ord') + + +#driver = Driver('GOOGC7RCLUYGL3IUBRNW', 'kjZ0t1VCFIz2zOCJXEv532mG4mlTZIg2NWd4Mrat') containers = driver.list_containers() +container = containers[0] container_objects = driver.list_container_objects(containers[0]) -pprint(containers) -pprint(container_objects) +iterator = BytesIO(b'0123456789') + + +obj = driver.upload_object_via_stream(iterator=BytesIO(b'0123456789'), container=container, object_name='test1.xlsm') +# extra={'content_type': 'application/vnd.ms-excel.sheet.macroenabled.12'}) + +print(driver.download_object_range(obj=obj, destination_path='1.obj', start_bytes=5, end_bytes=None, overwrite_existing=True)) +print(next(driver.download_object_range_as_stream(obj=obj, start_bytes=0, end_bytes=1))) +print(next(driver.download_object_range_as_stream(obj=obj, start_bytes=0, end_bytes=2))) +print(next(driver.download_object_range_as_stream(obj=obj, start_bytes=0, end_bytes=3))) +print(next(driver.download_object_range_as_stream(obj=obj, start_bytes=5, end_bytes=8))) +print(next(driver.download_object_range_as_stream(obj=obj, start_bytes=5))) + +print('====') +sys.exit(1) + +driver = get_driver(Provider.LOCAL)('.') + +containers = driver.list_containers() +container = containers[0] +obj = driver.upload_object_via_stream(iterator=iterator, container=container, object_name='test1.xlsm') +print(driver.download_object_range(obj=obj, destination_path='3.obj', start_bytes=0, end_bytes=6, overwrite_existing=True)) +sys.exit(1) +print(next(driver.download_object_range_as_stream(obj=obj, start_bytes=0, end_bytes=1))) +print(next(driver.download_object_range_as_stream(obj=obj, start_bytes=0, end_bytes=2))) +print(next(driver.download_object_range_as_stream(obj=obj, start_bytes=5, end_bytes=8))) +print(next(driver.download_object_range_as_stream(obj=obj, start_bytes=5))) + diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/__init__.py new/apache-libcloud-2.8.1/libcloud/__init__.py --- old/apache-libcloud-2.8.0/libcloud/__init__.py 2019-12-23 18:51:49.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/__init__.py 2020-02-29 22:38:45.000000000 +0100 @@ -46,7 +46,7 @@ 'enable_debug' ] -__version__ = '2.8.0' +__version__ = '2.8.1' def enable_debug(fo): diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/common/gandi_live.py new/apache-libcloud-2.8.1/libcloud/common/gandi_live.py --- old/apache-libcloud-2.8.0/libcloud/common/gandi_live.py 2019-11-29 22:46:05.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/common/gandi_live.py 2020-02-29 22:24:24.000000000 +0100 @@ -139,13 +139,20 @@ valid_http_codes = [ httplib.OK, httplib.CREATED, - 
httplib.NO_CONTENT ] if self.status in valid_http_codes: if json_error: raise JsonParseError(body, self.status) else: return body + elif self.status == httplib.NO_CONTENT: + # Parse error for empty body is acceptable, but a non-empty body + # is not. + if len(body) > 0: + msg = '"No Content" response contained content' + raise GandiLiveBaseError(msg, self.status) + else: + return {} elif self.status == httplib.NOT_FOUND: message = self._get_error(body, json_error) raise ResourceNotFoundError(message, self.status) diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/compute/base.py new/apache-libcloud-2.8.1/libcloud/compute/base.py --- old/apache-libcloud-2.8.0/libcloud/compute/base.py 2019-12-21 22:34:46.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/compute/base.py 2020-02-29 21:56:39.000000000 +0100 @@ -350,7 +350,7 @@ disk, # type: int bandwidth, # type: Optional[int] price, # type: float - driver, # type: Type[NodeDriver] + driver, # type: NodeDriver extra=None # type: Optional[dict] ): """ @@ -421,7 +421,7 @@ def __init__(self, id, # type: str name, # type: str - driver, # type: Type[NodeDriver] + driver, # type: NodeDriver extra=None # type: Optional[dict] ): """ @@ -467,7 +467,7 @@ id, # type: str image_id, # type: str state, # type: NodeImageMemberState - driver, # type: Type[NodeDriver] + driver, # type: NodeDriver created=None, # type: datetime.datetime extra=None # type: Optional[dict] ): @@ -522,7 +522,7 @@ id, # type: str name, # type: str country, # type: str - driver, # type: Type[NodeDriver] + driver, # type: NodeDriver extra=None # type: Optional[dict] ): """ @@ -822,7 +822,7 @@ name = None # type: str api_name = None # type: str website = None # type: str - type = None # type: Provider + type = None # type: Union[Provider,str] port = None # type: int features = {'create_node': []} # type: Dict[str, List[str]] diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/compute/drivers/ec2.py new/apache-libcloud-2.8.1/libcloud/compute/drivers/ec2.py --- old/apache-libcloud-2.8.0/libcloud/compute/drivers/ec2.py 2019-12-21 22:34:46.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/compute/drivers/ec2.py 2020-02-29 23:01:07.000000000 +0100 @@ -1954,7 +1954,7 @@ params['KeyName'] = ex_keyname if ex_userdata: - params['UserData'] = base64.b64encode(b('ex_userdata'))\ + params['UserData'] = base64.b64encode(b(ex_userdata))\ .decode('utf-8') if ex_clienttoken: diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/compute/drivers/gce.py new/apache-libcloud-2.8.1/libcloud/compute/drivers/gce.py --- old/apache-libcloud-2.8.0/libcloud/compute/drivers/gce.py 2019-12-21 22:34:46.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/compute/drivers/gce.py 2020-02-29 21:57:57.000000000 +0100 @@ -21,6 +21,7 @@ import datetime import time +import itertools import sys from libcloud.common.base import LazyObject @@ -133,7 +134,7 @@ return response - def request_aggregated_items(self, api_name): + def request_aggregated_items(self, api_name, zone=None): """ Perform request(s) to obtain all results from 'api_name'. @@ -145,12 +146,19 @@ for valid names. :type api_name: ``str`` + :param zone: Optional zone to use. + :type zone: :class:`GCEZone` + :return: dict in the format of the API response. 
format: { 'items': {'key': {api_name: []}} } ex: { 'items': {'zones/us-central1-a': {disks: []}} } :rtype: ``dict`` """ - request_path = "/aggregated/%s" % api_name + if zone: + request_path = "/zones/%s/%s" % (zone.name, api_name) + else: + request_path = "/aggregated/%s" % (api_name) + api_responses = [] params = {'maxResults': 500} @@ -159,6 +167,15 @@ self.gce_params = params response = self.request(request_path, method='GET').object if 'items' in response: + if zone: + # Special case when we are handling pagination for a + # specific zone + items = response['items'] + response['items'] = { + 'zones/%s' % (zone): { + api_name: items + } + } api_responses.append(response) more_results = 'pageToken' in params return self._merge_response_items(api_name, api_responses) @@ -2569,50 +2586,39 @@ :return: List of Node objects :rtype: ``list`` of :class:`Node` """ - list_nodes = [] zone = self._set_zone(ex_zone) - if zone is None: - request = '/aggregated/instances' - else: - request = '/zones/%s/instances' % (zone.name) - response = self.connection.request(request, method='GET').object + response = self.connection.request_aggregated_items('instances', + zone=zone) + + if not response.get('items', []): + return [] + + list_nodes = [] + + # The aggregated response returns a dict for each zone + # Create volume cache now for fast lookups of disk info. + self._ex_populate_volume_dict() + + items = response['items'].values() + instances = [item.get('instances', []) for item in items] + instances = itertools.chain(*instances) + + for instance in instances: + try: + node = self._to_node(instance, + use_disk_cache=ex_use_disk_cache) + except ResourceNotFoundError: + # If a GCE node has been deleted between + # - is was listed by `request('.../instances', 'GET') + # - it is converted by `self._to_node(i)` + # `_to_node()` will raise a ResourceNotFoundError. + # + # Just ignore that node and return the list of the + # other nodes. + continue + + list_nodes.append(node) - if 'items' in response: - # The aggregated response returns a dict for each zone - if zone is None: - # Create volume cache now for fast lookups of disk info. - self._ex_populate_volume_dict() - for v in response['items'].values(): - for i in v.get('instances', []): - try: - list_nodes.append( - self._to_node(i, - use_disk_cache=ex_use_disk_cache) - ) - # If a GCE node has been deleted between - # - is was listed by `request('.../instances', 'GET') - # - it is converted by `self._to_node(i)` - # `_to_node()` will raise a ResourceNotFoundError. - # - # Just ignore that node and return the list of the - # other nodes. - except ResourceNotFoundError: - pass - else: - for i in response['items']: - try: - list_nodes.append( - self._to_node(i, use_disk_cache=ex_use_disk_cache) - ) - # If a GCE node has been deleted between - # - is was listed by `request('.../instances', 'GET') - # - it is converted by `self._to_node(i)` - # `_to_node()` will raise a ResourceNotFoundError. - # - # Just ignore that node and return the list of the - # other nodes. - except ResourceNotFoundError: - pass # Clear the volume cache as lookups are complete. 
self._ex_volume_dict = {} return list_nodes diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/compute/drivers/openstack.py new/apache-libcloud-2.8.1/libcloud/compute/drivers/openstack.py --- old/apache-libcloud-2.8.0/libcloud/compute/drivers/openstack.py 2019-12-21 22:34:46.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/compute/drivers/openstack.py 2020-02-29 21:56:42.000000000 +0100 @@ -3927,8 +3927,8 @@ port.extra["mac_address"]} if 'port_details' in obj and obj['port_details']: - if obj['port_details']['device_owner'] in ['compute:nova', - 'compute:None']: + dev_owner = obj['port_details']['device_owner'] + if dev_owner and dev_owner.startswith("compute:"): instance_id = obj['port_details']['device_id'] ip_address = obj['floating_ip_address'] diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/dns/drivers/cloudflare.py new/apache-libcloud-2.8.1/libcloud/dns/drivers/cloudflare.py --- old/apache-libcloud-2.8.0/libcloud/dns/drivers/cloudflare.py 2019-11-29 22:46:05.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/dns/drivers/cloudflare.py 2020-02-29 21:53:25.000000000 +0100 @@ -74,6 +74,7 @@ 'created_on', 'modified_on', 'data', + 'priority' } RECORD_CREATE_ATTRIBUTES = { diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/storage/base.py new/apache-libcloud-2.8.1/libcloud/storage/base.py --- old/apache-libcloud-2.8.0/libcloud/storage/base.py 2019-12-18 22:26:03.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/storage/base.py 2020-02-29 21:52:41.000000000 +0100 @@ -24,6 +24,7 @@ import os.path # pylint: disable-msg=W0404 import hashlib +import errno from os.path import join as pjoin from libcloud.utils.py3 import httplib @@ -653,7 +654,20 @@ # Ensure we start from the begining of a stream in case stream is # not at the beginning if hasattr(stream, 'seek'): - stream.seek(0) + try: + stream.seek(0) + except OSError as e: + if e.errno != errno.ESPIPE: + # This represents "OSError: [Errno 29] Illegal seek" + # error. This could either mean that the underlying + # handle doesn't support seek operation (e.g. pipe) or + # that the invalid seek position is provided. Sadly + # there is no good robust way to distinghuish that so + # we simply ignore all the "Illeal seek" errors so + # this function works correctly with pipes. 
+ # See https://github.com/apache/libcloud/pull/1427 for + # details + raise e for chunk in libcloud.utils.files.read_in_chunks(iterator=stream): hasher.update(b(chunk)) diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/storage/drivers/google_storage.py new/apache-libcloud-2.8.1/libcloud/storage/drivers/google_storage.py --- old/apache-libcloud-2.8.0/libcloud/storage/drivers/google_storage.py 2019-09-09 18:10:00.000000000 +0200 +++ new/apache-libcloud-2.8.1/libcloud/storage/drivers/google_storage.py 2020-02-29 21:50:42.000000000 +0100 @@ -119,7 +119,9 @@ # Lowercase all headers except 'date' and Google header values for k, v in headers.items(): k_lower = k.lower() - if (k_lower == 'date' or k_lower.startswith( + # NOTE: It's important that the value of Content-Type header is + # left as is and not lowercased + if (k_lower in ['date', 'content-type'] or k_lower.startswith( GoogleStorageDriver.http_vendor_prefix) or not isinstance(v, str)): headers_copy[k_lower] = v diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/storage/drivers/s3.py new/apache-libcloud-2.8.1/libcloud/storage/drivers/s3.py --- old/apache-libcloud-2.8.0/libcloud/storage/drivers/s3.py 2019-12-21 22:34:46.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/storage/drivers/s3.py 2020-02-29 21:58:56.000000000 +0100 @@ -846,8 +846,18 @@ headers = response.headers response = response server_hash = headers.get('etag', '').replace('"', '') + server_side_encryption = headers.get('x-amz-server-side-encryption', + None) + aws_kms_encryption = (server_side_encryption == 'aws:kms') + hash_matches = (result_dict['data_hash'] == server_hash) - if (verify_hash and result_dict['data_hash'] != server_hash): + # NOTE: If AWS KMS server side encryption is enabled, ETag won't + # contain object MD5 digest so we skip the checksum check + # See https://docs.aws.amazon.com/AmazonS3/latest/API + # /RESTCommonResponseHeaders.html + # and https://github.com/apache/libcloud/issues/1401 + # for details + if verify_hash and not aws_kms_encryption and not hash_matches: raise ObjectHashMismatchError( value='MD5 hash {0} checksum does not match {1}'.format( server_hash, result_dict['data_hash']), diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/compute/fixtures/openstack_v1.1/_v2_0__floatingips.json new/apache-libcloud-2.8.1/libcloud/test/compute/fixtures/openstack_v1.1/_v2_0__floatingips.json --- old/apache-libcloud-2.8.0/libcloud/test/compute/fixtures/openstack_v1.1/_v2_0__floatingips.json 2019-11-29 22:46:05.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/test/compute/fixtures/openstack_v1.1/_v2_0__floatingips.json 2020-02-29 21:56:42.000000000 +0100 @@ -75,6 +75,34 @@ }, "tags": ["tag3"], "port_forwardings": [] + }, + { + "router_id": "d23abc8d-2991-4a55-ba98-2aaea84cc72f", + "description": "for test", + "dns_domain": "my-domain.org.", + "dns_name": "myfip", + "created_at": "2016-12-21T10:55:50Z", + "updated_at": "2016-12-21T10:55:53Z", + "revision_number": 1, + "project_id": "4969c491a3c74ee4af974e6d800c62de", + "tenant_id": "4969c491a3c74ee4af974e6d800c62de", + "floating_network_id": "376da547-b977-4cfe-9cba-275c80debf57", + "fixed_ip_address": "10.0.0.4", + "floating_ip_address": "10.3.1.3", + "port_id": "ce705c24-c1ef-408a-bda3-7bbd946164ab", + "id": "123c5336a-0629-4694-ba30-04b0bdfa88a4", + "status": "ACTIVE", + 
"port_details": { + "status": "ACTIVE", + "name": "", + "admin_state_up": true, + "network_id": "02dd8479-ef26-4398-a102-d19d0a7b3a1f", + "device_owner": "compute:region", + "mac_address": "fa:16:3e:b1:3b:30", + "device_id": "cb4fba64-19e2-40fd-8497-f29da1b21143" + }, + "tags": ["tag3"], + "port_forwardings": [] } ] } \ No newline at end of file diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/compute/test_ec2.py new/apache-libcloud-2.8.1/libcloud/test/compute/test_ec2.py --- old/apache-libcloud-2.8.0/libcloud/test/compute/test_ec2.py 2019-12-21 22:34:46.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/test/compute/test_ec2.py 2020-02-29 23:01:36.000000000 +0100 @@ -19,10 +19,13 @@ import os import sys +import base64 from datetime import datetime from libcloud.utils.iso8601 import UTC from libcloud.utils.py3 import httplib +from libcloud.utils.py3 import parse_qs +from libcloud.utils.py3 import b from libcloud.compute.drivers.ec2 import EC2NodeDriver from libcloud.compute.drivers.ec2 import EC2PlacementGroup @@ -1359,7 +1362,7 @@ region = 'sa-east-1' -class EC2MockHttp(MockHttp): +class EC2MockHttp(MockHttp, unittest.TestCase): fixtures = ComputeFileFixtures('ec2') def _DescribeInstances(self, method, url, body, headers): @@ -1426,6 +1429,17 @@ body = self.fixtures.load('run_instances.xml') return (httplib.OK, body, {}, httplib.responses[httplib.OK]) + def _ex_user_data_RunInstances(self, method, url, body, headers): + # test_create_node_with_ex_userdata + params = parse_qs(url.replace('/?', '')) + + self.assertTrue('UserData' in params) + user_data = base64.b64decode(b(params['UserData'][0])).decode('utf-8') + self.assertEqual(user_data, 'foo\nbar\foo') + + body = self.fixtures.load('run_instances.xml') + return (httplib.OK, body, {}, httplib.responses[httplib.OK]) + def _create_ex_assign_public_ip_RunInstances(self, method, url, body, headers): self.assertUrlContainsQueryParams(url, { 'NetworkInterface.1.AssociatePublicIpAddress': "true", @@ -2004,6 +2018,19 @@ image=image, size=size, ex_iamprofile='foo') + def test_create_node_with_ex_userdata(self): + EC2MockHttp.type = 'ex_user_data' + + image = NodeImage(id='ami-be3adfd7', + name=self.image_name, + driver=self.driver) + size = NodeSize('m1.small', 'Small Instance', None, None, None, None, + driver=self.driver) + + result = self.driver.create_node(name='foo', image=image, size=size, + ex_userdata='foo\nbar\foo') + self.assertTrue(result) + class FCUMockHttp(EC2MockHttp): fixtures = ComputeFileFixtures('fcu') diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/compute/test_gce.py new/apache-libcloud-2.8.1/libcloud/test/compute/test_gce.py --- old/apache-libcloud-2.8.0/libcloud/test/compute/test_gce.py 2019-12-21 22:34:46.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/test/compute/test_gce.py 2020-02-29 21:57:57.000000000 +0100 @@ -635,9 +635,11 @@ nodes = self.driver.list_nodes() nodes_all = self.driver.list_nodes(ex_zone='all') nodes_uc1a = self.driver.list_nodes(ex_zone='us-central1-a') + nodes_uc1b = self.driver.list_nodes(ex_zone='us-central1-b') self.assertEqual(len(nodes), 1) self.assertEqual(len(nodes_all), 8) self.assertEqual(len(nodes_uc1a), 1) + self.assertEqual(len(nodes_uc1b), 0) self.assertEqual(nodes[0].name, 'node-name') self.assertEqual(nodes_uc1a[0].name, 'node-name') self.assertEqual(nodes_uc1a[0].extra['cpuPlatform'], 'Intel Skylake') @@ -3502,6 
+3504,11 @@ body = self.fixtures.load('zones_europe-west1-a_instances.json') return (httplib.OK, body, self.json_hdr, httplib.responses[httplib.OK]) + def _zones_us_central1_b_instances(self, method, url, body, headers): + if method == 'GET': + body = '{}' + return (httplib.OK, body, self.json_hdr, httplib.responses[httplib.OK]) + def _zones_europe_west1_a_diskTypes_pd_standard(self, method, url, body, headers): body = self.fixtures.load( diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/compute/test_openstack.py new/apache-libcloud-2.8.1/libcloud/test/compute/test_openstack.py --- old/apache-libcloud-2.8.0/libcloud/test/compute/test_openstack.py 2019-12-21 22:34:46.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/test/compute/test_openstack.py 2020-02-29 21:56:42.000000000 +0100 @@ -1442,6 +1442,11 @@ self.assertEqual(ret[2].ip_address, '10.3.1.2') self.assertEqual( ret[2].node_id, 'cb4fba64-19e2-40fd-8497-f29da1b21143') + self.assertEqual(ret[3].id, '123c5336a-0629-4694-ba30-04b0bdfa88a4') + self.assertEqual(ret[3].pool, pool) + self.assertEqual(ret[3].ip_address, '10.3.1.3') + self.assertEqual( + ret[3].node_id, 'cb4fba64-19e2-40fd-8497-f29da1b21143') def test_OpenStack_2_FloatingIpPool_get_floating_ip(self): pool = OpenStack_2_FloatingIpPool(1, 'foo', self.driver.connection) diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/dns/fixtures/cloudflare/records_GET_1.json new/apache-libcloud-2.8.1/libcloud/test/dns/fixtures/cloudflare/records_GET_1.json --- old/apache-libcloud-2.8.0/libcloud/test/dns/fixtures/cloudflare/records_GET_1.json 2019-11-29 22:46:05.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/test/dns/fixtures/cloudflare/records_GET_1.json 2020-02-29 21:53:25.000000000 +0100 @@ -94,6 +94,26 @@ "managed_by_apps": false, "managed_by_argo_tunnel": false } + }, + { + "content": "aspmx3.googlemail.com", + "created_on": "2015-09-04T23:06:50.625895Z", + "id": "78526", + "locked": false, + "meta": { + "auto_added": true, + "managed_by_apps": false, + "managed_by_argo_tunnel": false + }, + "modified_on": "2015-09-04T23:06:50.625895Z", + "name": "foo.bar", + "priority": 30, + "proxiable": false, + "proxied": false, + "ttl": 1, + "type": "MX", + "zone_id": "1234", + "zone_name": "foo.bar" } ], "result_info": { diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/dns/fixtures/gandi_live/delete_gandi_zone.json new/apache-libcloud-2.8.1/libcloud/test/dns/fixtures/gandi_live/delete_gandi_zone.json --- old/apache-libcloud-2.8.0/libcloud/test/dns/fixtures/gandi_live/delete_gandi_zone.json 2019-11-29 22:46:05.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/test/dns/fixtures/gandi_live/delete_gandi_zone.json 1970-01-01 01:00:00.000000000 +0100 @@ -1,3 +0,0 @@ -{ - "message": "Zone deleted" -} diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/dns/fixtures/gandi_live/delete_record.json new/apache-libcloud-2.8.1/libcloud/test/dns/fixtures/gandi_live/delete_record.json --- old/apache-libcloud-2.8.0/libcloud/test/dns/fixtures/gandi_live/delete_record.json 2019-11-29 22:46:05.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/test/dns/fixtures/gandi_live/delete_record.json 1970-01-01 01:00:00.000000000 +0100 @@ -1,3 +0,0 @@ -{ - "message": "Zone Record Deleted" -} diff 
-urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/dns/test_cloudflare.py new/apache-libcloud-2.8.1/libcloud/test/dns/test_cloudflare.py --- old/apache-libcloud-2.8.0/libcloud/test/dns/test_cloudflare.py 2019-11-29 22:46:05.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/test/dns/test_cloudflare.py 2020-02-29 21:53:25.000000000 +0100 @@ -76,13 +76,14 @@ def test_list_records(self): zone = self.driver.list_zones()[0] records = self.driver.list_records(zone=zone) - self.assertEqual(len(records), 9) + self.assertEqual(len(records), 10) record = records[0] self.assertEqual(record.id, '364797364') self.assertIsNone(record.name) self.assertEqual(record.type, 'A') self.assertEqual(record.data, '192.30.252.153') + self.assertEqual(record.extra['priority'], None) for attribute_name in RECORD_EXTRA_ATTRIBUTES: self.assertTrue(attribute_name in record.extra) @@ -96,6 +97,13 @@ for attribute_name in RECORD_EXTRA_ATTRIBUTES: self.assertTrue(attribute_name in record.extra) + record = [r for r in records if r.type == 'MX'][0] + self.assertEqual(record.id, '78526') + self.assertIsNone(record.name) + self.assertEqual(record.type, 'MX') + self.assertEqual(record.data, 'aspmx3.googlemail.com') + self.assertEqual(record.extra['priority'], 30) + def test_get_zone(self): zone = self.driver.get_zone(zone_id='1234') self.assertEqual(zone.id, '1234') diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/dns/test_gandi_live.py new/apache-libcloud-2.8.1/libcloud/test/dns/test_gandi_live.py --- old/apache-libcloud-2.8.0/libcloud/test/dns/test_gandi_live.py 2019-11-29 22:46:05.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/test/dns/test_gandi_live.py 2020-02-29 22:24:24.000000000 +0100 @@ -269,8 +269,7 @@ return (httplib.OK, body, {}, httplib.responses[httplib.OK]) def _json_api_v5_zones_111111_delete(self, method, url, body, headers): - body = self.fixtures.load('delete_gandi_zone.json') - return (httplib.OK, body, {}, httplib.responses[httplib.OK]) + return (httplib.NO_CONTENT, '', {}, httplib.responses[httplib.OK]) def _json_api_v5_domains_example_org_patch(self, method, url, body, headers): @@ -347,8 +346,7 @@ def _json_api_v5_domains_example_com_records_bob_A_delete(self, method, url, body, headers): - body = self.fixtures.load('delete_record.json') - return (httplib.OK, body, {}, httplib.responses[httplib.OK]) + return (httplib.NO_CONTENT, '', {}, httplib.responses[httplib.OK]) if __name__ == '__main__': sys.exit(unittest.main()) diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/storage/test_base.py new/apache-libcloud-2.8.1/libcloud/test/storage/test_base.py --- old/apache-libcloud-2.8.0/libcloud/test/storage/test_base.py 2019-12-18 22:26:03.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/test/storage/test_base.py 2020-02-29 22:48:43.000000000 +0100 @@ -14,6 +14,7 @@ # limitations under the License. 
import sys +import errno import hashlib from libcloud.utils.py3 import httplib @@ -96,7 +97,6 @@ response_streamed = mock_response.request.stream assert response_streamed is False - def test__get_hash_function(self): self.driver1.hash_type = 'md5' func = self.driver1._get_hash_function() @@ -152,7 +152,8 @@ def test_upload_object_hash_calculation_is_efficient(self, mock_read_in_chunks, mock_exhaust_iterator): # Verify that we don't buffer whole file in memory when calculating - # object has when iterator has __next__ method, but instead read and calculate hash in chunks + # object has when iterator has __next__ method, but instead read and + # calculate hash in chunks size = 100 self.driver1.connection = Mock() @@ -228,6 +229,47 @@ self.assertEqual(mock_read_in_chunks.call_count, 2) self.assertEqual(mock_exhaust_iterator.call_count, 0) + def test_upload_object_via_stream_illegal_seek_errors_are_ignored(self): + # Illegal seek errors should be ignored + size = 100 + + self.driver1.connection = Mock() + + seek_error = OSError('Illegal seek') + seek_error.errno = 29 + assert errno.ESPIPE == 29 + + iterator = BodyStream('a' * size) + iterator.seek = mock.Mock(side_effect=seek_error) + + result = self.driver1._upload_object(object_name='test1', + content_type=None, + request_path='/', + stream=iterator) + + hasher = hashlib.md5() + hasher.update(b('a') * size) + expected_hash = hasher.hexdigest() + + self.assertEqual(result['data_hash'], expected_hash) + self.assertEqual(result['bytes_transferred'], size) + + # But others shouldn't + self.driver1.connection = Mock() + + seek_error = OSError('Other error') + seek_error.errno = 21 + + iterator = BodyStream('b' * size) + iterator.seek = mock.Mock(side_effect=seek_error) + + assertRaisesRegex(self, OSError, 'Other error', + self.driver1._upload_object, + object_name='test1', + content_type=None, + request_path='/', + stream=iterator) + if __name__ == '__main__': sys.exit(unittest.main()) diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/storage/test_google_storage.py new/apache-libcloud-2.8.1/libcloud/test/storage/test_google_storage.py --- old/apache-libcloud-2.8.0/libcloud/test/storage/test_google_storage.py 2019-09-09 18:10:00.000000000 +0200 +++ new/apache-libcloud-2.8.1/libcloud/test/storage/test_google_storage.py 2020-02-29 21:50:42.000000000 +0100 @@ -235,10 +235,12 @@ 'Date': TODAY, 'x-goog-foo': 'X-GOOG: MAINTAIN UPPERCASE!', 'x-Goog-bar': 'Header key should be lowered', + 'Content-Type': 'application/mIXED casING MAINTAINED', 'Other': 'LOWER THIS!' } modified_headers = { 'date': TODAY, + 'content-type': 'application/mIXED casING MAINTAINED', 'x-goog-foo': 'X-GOOG: MAINTAIN UPPERCASE!', 'x-goog-bar': 'Header key should be lowered', 'other': 'lower this!' 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/storage/test_s3.py new/apache-libcloud-2.8.1/libcloud/test/storage/test_s3.py --- old/apache-libcloud-2.8.0/libcloud/test/storage/test_s3.py 2019-12-21 22:34:46.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/test/storage/test_s3.py 2020-02-29 21:58:56.000000000 +0100 @@ -335,27 +335,6 @@ headers, httplib.responses[httplib.OK]) - def _foo_bar_container_foo_test_upload_INVALID_HASH1(self, method, url, - body, headers): - body = '' - headers = {} - headers['etag'] = '"foobar"' - # test_upload_object_invalid_hash1 - return (httplib.OK, - body, - headers, - httplib.responses[httplib.OK]) - - def _foo_bar_container_foo_test_upload_INVALID_HASH2(self, method, url, - body, headers): - # test_upload_object_invalid_hash2 - body = '' - headers = {'etag': '"hash343hhash89h932439jsaa89"'} - return (httplib.OK, - body, - headers, - httplib.responses[httplib.OK]) - def _foo_bar_container_foo_test_upload(self, method, url, body, headers): # test_upload_object_success body = '' @@ -772,12 +751,11 @@ def upload_file(self, object_name=None, content_type=None, request_path=None, request_method=None, headers=None, file_path=None, stream=None): - return {'response': make_response(200), + headers = {'etag': '"foobar"'} + return {'response': make_response(200, headers=headers), 'bytes_transferred': 1000, 'data_hash': 'hash343hhash89h932439jsaa89'} - self.mock_response_klass.type = 'INVALID_HASH1' - old_func = self.driver_type._upload_object self.driver_type._upload_object = upload_file file_path = os.path.abspath(__file__) @@ -802,12 +780,11 @@ def upload_file(self, object_name=None, content_type=None, request_path=None, request_method=None, headers=None, file_path=None, stream=None): - return {'response': make_response(200, headers={'etag': 'woopwoopwoop'}), + headers = {'etag': '"hash343hhash89h932439jsaa89"'} + return {'response': make_response(200, headers=headers), 'bytes_transferred': 1000, 'data_hash': '0cc175b9c0f1b6a831c399e269772661'} - self.mock_response_klass.type = 'INVALID_HASH2' - old_func = self.driver_type._upload_object self.driver_type._upload_object = upload_file @@ -827,6 +804,31 @@ finally: self.driver_type._upload_object = old_func + def test_upload_object_invalid_hash_kms_encryption(self): + # Hash check should be skipped when AWS KMS server side encryption is + # used + def upload_file(self, object_name=None, content_type=None, + request_path=None, request_method=None, + headers=None, file_path=None, stream=None): + headers = {'etag': 'blahblah', 'x-amz-server-side-encryption': 'aws:kms'} + return {'response': make_response(200, headers=headers), + 'bytes_transferred': 1000, + 'data_hash': 'hash343hhash89h932439jsaa81'} + + old_func = self.driver_type._upload_object + self.driver_type._upload_object = upload_file + file_path = os.path.abspath(__file__) + container = Container(name='foo_bar_container', extra={}, + driver=self.driver) + object_name = 'foo_test_upload' + try: + self.driver.upload_object(file_path=file_path, container=container, + object_name=object_name, + verify_hash=True) + finally: + self.driver_type._upload_object = old_func + + def test_upload_object_success(self): def upload_file(self, object_name=None, content_type=None, request_path=None, request_method=None, diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/test/test_logging_connection.py 
new/apache-libcloud-2.8.1/libcloud/test/test_logging_connection.py --- old/apache-libcloud-2.8.0/libcloud/test/test_logging_connection.py 2019-09-09 18:10:00.000000000 +0200 +++ new/apache-libcloud-2.8.1/libcloud/test/test_logging_connection.py 2020-02-29 22:33:43.000000000 +0100 @@ -13,20 +13,69 @@ # See the License for the specific language governing permissions and # limitations under the License. +import os import sys from io import StringIO import zlib import requests_mock +import mock + import libcloud +from libcloud.utils.py3 import PY3 from libcloud.test import unittest from libcloud.common.base import Connection from libcloud.http import LibcloudConnection from libcloud.utils.loggingconnection import LoggingConnection +EXPECTED_DATA_JSON = """ +HTTP/1.1 200 OK +Content-Type: application/json + +{"foo": "bar!"} +""".strip() + +EXPECTED_DATA_JSON_PRETTY = """ +HTTP/1.1 200 OK +Content-Type: application/json + +{ + "foo": "bar!" +} +""".strip() + +EXPECTED_DATA_XML = """ +HTTP/1.1 200 OK +Content-Type: text/xml + +<foo><bar /></foo> +""".strip() + +EXPECTED_DATA_XML_PRETTY_1 = """ +HTTP/1.1 200 OK +Content-Type: application/xml + +<foo><bar /></foo> +""".strip() + +EXPECTED_DATA_XML_PRETTY_2 = """ +HTTP/1.1 200 OK +Content-Type: application/xml + +<?xml version="1.0" ?> +<foo> + <bar/> +</foo> +""".strip() + class TestLoggingConnection(unittest.TestCase): + def setUp(self): + super(TestLoggingConnection, self).setUp() + self._reset_environ() + def tearDown(self): + super(TestLoggingConnection, self).tearDown() Connection.conn_class = LibcloudConnection def test_debug_method_uses_log_class(self): @@ -65,5 +114,67 @@ self.assertTrue(isinstance(conn.connection, LoggingConnection)) self.assertIn('-i -X GET', log) + def test_log_response_json_content_type(self): + conn = LoggingConnection(host='example.com', port=80) + + r = self._get_mock_response('application/json', '{"foo": "bar!"}') + result = conn._log_response(r).replace('\r', '') + self.assertTrue(EXPECTED_DATA_JSON in result) + + def test_log_response_xml_content_type(self): + conn = LoggingConnection(host='example.com', port=80) + + r = self._get_mock_response('text/xml', '<foo><bar /></foo>') + result = conn._log_response(r).replace('\r', '') + self.assertTrue(EXPECTED_DATA_XML in result) + + def test_log_response_with_pretty_print_json_content_type(self): + os.environ['LIBCLOUD_DEBUG_PRETTY_PRINT_RESPONSE'] = '1' + + conn = LoggingConnection(host='example.com', port=80) + + # body type is unicode + r = self._get_mock_response('application/json', u'{"foo": "bar!"}') + result = conn._log_response(r).replace('\r', '') + self.assertTrue(EXPECTED_DATA_JSON_PRETTY in result) + + # body type is bytes + if PY3: + data = bytes('{"foo": "bar!"}', 'utf-8') + else: + data = bytes('{"foo": "bar!"}') + r = self._get_mock_response('application/json', data) + result = conn._log_response(r).replace('\r', '') + self.assertTrue(EXPECTED_DATA_JSON_PRETTY in result) + + def test_log_response_with_pretty_print_xml_content_type(self): + os.environ['LIBCLOUD_DEBUG_PRETTY_PRINT_RESPONSE'] = '1' + + conn = LoggingConnection(host='example.com', port=80) + + r = self._get_mock_response('application/xml', '<foo><bar /></foo>') + result = conn._log_response(r).replace('\r', '') + self.assertTrue(EXPECTED_DATA_XML_PRETTY_1 in result or + EXPECTED_DATA_XML_PRETTY_2 in result) + + def _reset_environ(self): + if 'LIBCLOUD_DEBUG_PRETTY_PRINT_RESPONSE' in os.environ: + del os.environ['LIBCLOUD_DEBUG_PRETTY_PRINT_RESPONSE'] + + def _get_mock_response(self, 
content_type, body): + header = mock.Mock() + header.title.return_value = 'Content-Type' + header.lower.return_value = 'content-type' + + r = mock.Mock() + r.version = 11 + r.status = '200' + r.reason = 'OK' + r.getheaders.return_value = [(header, content_type)] + r.read.return_value = body + + return r + + if __name__ == '__main__': sys.exit(unittest.main()) diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/utils/loggingconnection.py new/apache-libcloud-2.8.1/libcloud/utils/loggingconnection.py --- old/apache-libcloud-2.8.0/libcloud/utils/loggingconnection.py 2019-12-18 22:26:03.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/utils/loggingconnection.py 2020-02-29 22:35:26.000000000 +0100 @@ -28,6 +28,7 @@ from libcloud.common.base import (LibcloudConnection, HttpLibResponseProxy) from libcloud.utils.py3 import _real_unicode as u +from libcloud.utils.py3 import ensure_text from libcloud.utils.misc import lowercase_keys @@ -68,12 +69,12 @@ if pretty_print and content_type == 'application/json': try: - body = json.loads(body.decode('utf-8')) + body = json.loads(ensure_text(body)) body = json.dumps(body, sort_keys=True, indent=4) except Exception: # Invalid JSON or server is lying about content-type pass - elif pretty_print and content_type == 'text/xml': + elif pretty_print and content_type in ['text/xml', 'application/xml']: try: elem = parseString(body.decode('utf-8')) body = elem.toprettyxml() @@ -81,7 +82,7 @@ # Invalid XML pass - ht += u(body) + ht += ensure_text(body) rv += ht rv += ("\n# -------- end %d:%d response ----------\n" diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/apache-libcloud-2.8.0/libcloud/utils/py3.py new/apache-libcloud-2.8.1/libcloud/utils/py3.py --- old/apache-libcloud-2.8.0/libcloud/utils/py3.py 2019-12-23 14:59:18.000000000 +0100 +++ new/apache-libcloud-2.8.1/libcloud/utils/py3.py 2020-02-29 22:35:52.000000000 +0100 @@ -112,6 +112,8 @@ else: raise TypeError("Invalid argument %r for ensure_string()" % (s,)) + ensure_text = ensure_string + def byte(n): # assume n is a Latin-1 string of length 1 return ord(n) @@ -187,6 +189,14 @@ b = bytes = ensure_string = str + def ensure_text(s): + if isinstance(s, _real_unicode): + return s + elif isinstance(s, (str, bytes)): + return s.decode('utf-8') + else: + raise TypeError("Invalid argument %r for ensure_text()" % (s,)) + def byte(n): return n ++++++ ec2_create_node.patch ++++++ --- /var/tmp/diff_new_pack.og3aAF/_old 2020-03-08 22:22:41.924024199 +0100 +++ /var/tmp/diff_new_pack.og3aAF/_new 2020-03-08 22:22:41.924024199 +0100 @@ -1,7 +1,7 @@ -Index: apache-libcloud-2.8.0/libcloud/compute/drivers/ec2.py +Index: apache-libcloud-2.8.1/libcloud/compute/drivers/ec2.py =================================================================== ---- apache-libcloud-2.8.0.orig/libcloud/compute/drivers/ec2.py -+++ apache-libcloud-2.8.0/libcloud/compute/drivers/ec2.py +--- apache-libcloud-2.8.1.orig/libcloud/compute/drivers/ec2.py ++++ apache-libcloud-2.8.1/libcloud/compute/drivers/ec2.py @@ -1892,12 +1892,18 @@ class BaseEC2NodeDriver(NodeDriver): for system shutdown. 
:type ex_terminate_on_shutdown: ``bool`` ++++++ gce_image_projects.patch ++++++ --- /var/tmp/diff_new_pack.og3aAF/_old 2020-03-08 22:22:41.932024204 +0100 +++ /var/tmp/diff_new_pack.og3aAF/_new 2020-03-08 22:22:41.932024204 +0100 @@ -1,7 +1,7 @@ -Index: apache-libcloud-2.8.0/libcloud/compute/drivers/gce.py +Index: apache-libcloud-2.8.1/libcloud/compute/drivers/gce.py =================================================================== ---- apache-libcloud-2.8.0.orig/libcloud/compute/drivers/gce.py -+++ apache-libcloud-2.8.0/libcloud/compute/drivers/gce.py +--- apache-libcloud-2.8.1.orig/libcloud/compute/drivers/gce.py ++++ apache-libcloud-2.8.1/libcloud/compute/drivers/gce.py @@ -1803,20 +1803,35 @@ class GCENodeDriver(NodeDriver): "rhel-8", ],