This is an automated email from the ASF dual-hosted git repository.

tomaz pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/libcloud.git
commit 5691fd081f224395b74289937e46ef5afdb13b41
Author: Tomaz Muraus <[email protected]>
AuthorDate: Thu Sep 3 12:59:00 2020 +0200

    Add changelog entry. Closes #1485.
---
 CHANGES.rst | 494 ++----------------------------------------------------------
 1 file changed, 15 insertions(+), 479 deletions(-)

diff --git a/CHANGES.rst b/CHANGES.rst
index 8d2b638..be02df6 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,146 +1,22 @@
 Changelog
 =========
 
-Changes in Apache Libcloud 3.2.0
---------------------------------
-
-Common
-~~~~~~
-
-- ``libcloud.pricing.download_pricing_file`` function has been updated so it
-  tries to download latest ``pricing.json`` file from our public read-only S3
-  bucket.
-
-  We now run a daily job as part of our CI/CD which scrapes provider prices and
-  publishes the latest version of the ``pricing.json`` file to that bucket.
-
-  For more information, please see
-  https://libcloud.readthedocs.io/en/latest/compute/pricing.html.
+Changes in Apache Libcloud in development
+-----------------------------------------
 
 Compute
 ~~~~~~~
 
-- [OpenStack] Add `ex_get_network()` to the OpenStack driver to make it
-  possible to retrieve a single network by using the ID.
-
-  (GITHUB-1474)
-  [Sander Roosingh - @SanderRoosingh]
-
-- [OpenStack] Fix pagination in the ``list_images()`` method and make sure
-  method returns all the images, even if the result is spread across multiple
-  pages.
-
-  (GITHUB-1467)
-  [Thomas Bechtold - @toabctl]
-
-- [GCE] Add script for scraping GCE pricing data and improve price addition in
-  ``_to_node_size`` method.
-  (GITHUB-1468)
-  [Eis D. Zaster - @Eis-D-Z]
-
-- [AWS EC2] Update script for scraping AWS EC2 pricing and update EC2 pricing
-  data.
-  (GITHUB-1469)
-  [Eis D. Zaster - @Eis-D-Z]
-
-- [Deployment] Add new ``wait_period`` argument to the ``deploy_node`` method
-  and default it to 5 seconds.
+- [GCE] Fix ``ex_set_image_labels`` method using incorrect API path.
+  (GITHUB-1485)
+  [Poul Petersen - @petersen-poul]
 
-  This argument tells Libcloud how long to wait between each poll interval when
-  waiting for a node to come online and have IP address assigned to it.
-
-  Previously this argument was not exposed to the end user and defaulted to 3
-  seconds which means it would be quite easy to reach rate limits with some
-  providers when spinning up many instances concurrently using the same
-  credentials.
-  [Tomaz Muraus - @Kami]
-
-- [Azure ARM] Add script for scraping Azure ARM instance pricing data.
-  (GITHUB-1470)
-  [Eis D. Zaster - @Eis-D-Z]
-
-- Update ``deploy_node()`` method to try to re-connect to the server if we
-  receive "SSH connection not active" error when trying to run a deployment
-  step.
-
-  In some scenarios, connection may get closed by the server for whatever
-  reason before finishing all the deployment steps and in this case only
-  re-connecting would help and result in a successful outcome.
-  [Tomaz Muraus - @Kami]
-
-- [Deployment] Make ``FileDeployment`` class much faster and more efficient
-  when working with large files or when running multiple ``FileDeployment``
-  steps on a single node.
-
-  This was achieved by implementing two changes on the ``ParamikoSSHClient``
-  class:
-
-  1. ``put()`` method now tries to re-use the existing open SFTP connection
-     if one already exists instead of re-creating a new one for each
-     ``put()`` call.
-  2. New ``putfo()`` method has been added to the ``ParamikoSSHClient`` class
-     which utilizes the underlying ``sftp.putfo()`` method.
-
-     This method doesn't need to buffer the whole file content in memory and
-     also supports pipelining which makes uploads much faster and more
-     efficient for larger files.
-
-  [Tomaz Muraus - @Kami]
-
-- [Deployment] Add ``__repr__()`` and ``__str__()`` methods to all the
-  Deployment classes.
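The ``wait_period`` entry above describes a polling knob. A minimal sketch of the idea, not Libcloud's actual implementation (``check_node``, the defaults, and the exception here are illustrative):

```python
import time


def wait_until_running(check_node, timeout=600, wait_period=5):
    # Poll check_node() every wait_period seconds until it reports the node
    # as running, or give up once timeout seconds have elapsed. A longer
    # wait_period means fewer API calls, which helps avoid provider rate
    # limits when many nodes are deployed concurrently with one credential.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if check_node():
            return True
        time.sleep(wait_period)
    raise TimeoutError("node did not come online in time")
```

In this sketch, raising the interval from 3 to 5 seconds simply spaces out the ``check_node()`` calls, which is the trade-off the changelog entry describes.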
-  [Tomaz Muraus - @Kami]
-
-- [Deployment] New ``keep_alive`` and ``use_compression`` arguments have been
-  added to the ``ParamikoSSHClient`` class constructor.
-
-  Right now those are not exposed yet to the ``deploy_node()`` method.
-  [Tomaz Muraus - @Kami]
-
-- [Deployment] Update ``ParamikoSSHClient.put()`` method so it returns a
-  correct path when commands are being executed on a Windows machine.
-
-  Also update related deployment classes so they correctly handle situation
-  when we are executing commands on a Windows server.
-  [Arthur Kamalov, Tomaz Muraus]
-
-- [Outscale] Add a new driver for the Outscale provider. Existing Outscale
-  driver utilizes the EC2 compatible API and this one utilizes native Outscale
-  API.
-  (GITHUB-1476)
-  [Tio Gobin - @tgn-outscale]
-
-- [KubeVirt] Add new methods for managing services which allows users to expose
-  ports for the VMs (``ex_list_services``, ``ex_create_service``,
-  ``ex_delete_service``).
-  (GITHUB-1478)
-  [Eis D. Zaster - @Eis-D-Z]
-
-Container
-~~~~~~~~~
-
-- [LXD] Add new methods for managing network and storage pool capabilities and
-  include other improvements in some of the existing methods.
-  (GITHUB-1477)
-  [Eis D. Zaster - @Eis-D-Z]
-
-Changes in Apache Libcloud 3.1.0
---------------------------------
+Changes in Apache Libcloud v2.8.3
+---------------------------------
 
 Compute
 ~~~~~~~
 
-- [GCE] Add latest Ubuntu image families (Ubuntu 20.04) to the driver.
-
-  (GITHUB-1449)
-  [Christopher Lambert - @XN137]
-
-- [DigitalOcean] Add ``location`` argument to the ``list_sizes()`` method.
-
-  NOTE: Location filtering is performed on the client.
-  (GITHUB-1455, GITHUB-1456)
-  [RobertH1993]
-
 - Fix ``deploy_node()`` so an exception is not thrown if any of the output
   (stdout / stderr) produced by the deployment script contains a non-valid
   utf-8 character.
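The ``deploy_node()`` fix above boils down to decoding script output leniently instead of letting a stray byte raise ``UnicodeDecodeError``. A sketch of the general technique (the helper name is ours, not Libcloud's internal function):

```python
def decode_output(raw: bytes) -> str:
    # Decode deployment script output without raising on invalid byte
    # sequences; each bad byte becomes the U+FFFD replacement character,
    # so the deployment step can still report the rest of the output.
    return raw.decode("utf-8", errors="replace")
```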
@@ -154,26 +30,6 @@ Compute
   (GITHUB-1459)
   [Tomaz Muraus - @Kami]
 
-- Add new ``timeout`` argument to ``ScriptDeployment`` and
-  ``ScriptFileDeployment`` class constructor.
-
-  With this argument, user can specify an optional run timeout for that
-  deployment step run.
-  (GITHUB-1445)
-  [Tomaz Muraus - @Kami]
-
-- [GiG G8] Fix retry functionality when creating port forwards and add support
-  for automatically refresing the JWT auth token inside the connection class if
-  it's about to expire in 60 seconds or less.
-  (GITHUB-1465)
-  [Jo De Boeck - @grimpy]
-
-- [Azure ARM] Update ``create_node`` so an exception is thrown if user passes
-  ``ex_use_managed_disks=False``, but doesn't provide a value for the
-  ``ex_storage_account`` argument.
-  (GITHUB-1448)
-  [@antoinebourayne]
-
 Storage
 ~~~~~~~
 
@@ -185,157 +41,6 @@ Storage
   (GITHUB-1452, GITHUB-1457)
   [Tomaz Muraus]
 
-DNS
-~~~
-
-- [CloudFlare] Update driver to include the whole error chain the thrown
-  exception message field.
-
-  This makes various issues easier to debug since the whole error context is
-  included.
-  [Tomaz Muraus]
-
-- [Gandi Live, CloudFlare, GCE] Add support for managing ``CAA`` record types.
-
-  When creating a ``CAA`` record, data field needs to be in the following
-  format:
-
-  ``<flags> <tag> <domain name>``
-
-  For example:
-
-  - ``0 issue caa.example.com``
-  - ``0 issuewild caa.example.com``
-  - ``0 iodef https://example.com/reports``
-
-  (GITHUB-1463, GITHUB-1464)
-  [Tomaz Muraus]
-
-- [Gandi Live] Don't throw if ``extra['rrset_ttl']`` argument is not passed
-  to the ``create_record`` method.
-  (GITHUB-1463)
-  [Tomaz Muraus]
-
-Other
-~~~~~
-
-- Update ``contrib/Dockerfile`` which can be used for running tests so
-  it only run tests with Python versions we support. This means dropping
-  support for Python < 3.5 and adding support for Python 3.7 and 3.8.
-
-  Also update it to use a more recent Ubuntu version (18.04) and Python 3
-  for running tox target.
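The ``CAA`` record entries above describe a ``<flags> <tag> <domain name>`` data format. A small helper can build that string; this is an illustrative sketch (the function name is ours, not part of Libcloud's DNS API):

```python
def caa_record_data(flags: int, tag: str, value: str) -> str:
    # Build the "<flags> <tag> <domain name>" data field for a CAA record,
    # e.g. "0 issue caa.example.com". Only the three standard CAA tags
    # are accepted here.
    if tag not in ("issue", "issuewild", "iodef"):
        raise ValueError("unsupported CAA tag: %s" % tag)
    return "%d %s %s" % (flags, tag, value)
```

The resulting string is what the changelog says should be passed as the record's data field when creating a ``CAA`` record.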
-  (GITHUB-1451)
-  [Tomaz Muraus - @Kami, HuiFeng Tang - @99Kies]
-
-Changes in Apache Libcloud 3.0.0
---------------------------------
-
-Common
-~~~~~~
-
-- Make sure ``auth_user_info`` variable on the OpenStack identify connection
-  class is populated when using auth version ``3.x_password`` and
-  ``3.x_oidc_access_token``.
-
-  (GITHUB-1436)
-  [@lln-ijinus, Tomaz Muraus)
-
-- [OpenStack] Update OpenStack identity driver so a custom project can be
-  selected using ``domain_name`` keyword argument containing a project id.
-
-  Previously this argument value could only contain a project name, now the
-  value will be checked against project name and id.
-
-  (GITHUB-1439)
-  [Miguel Caballer - @micafer]
-
-Compute
-~~~~~~~
-
-- [GCE] Update ``create_node()`` method so it throws an exception if node
-  location can't be inferred and location is not specified by the user (
-  either by passing ``datacenter`` constructor argument or by passing
-  ``location`` argument to the method).
-
-  Reported by Kevin K. - @kbknapp.
-  (GITHUB-1443)
-  [Tomaz Muraus]
-
-- [GCE] Update ``ex_get_disktype`` method so it works if ``zone`` argument is
-  not set.
-  (GITHUB-1443)
-  [Tomaz Muraus]
-
-- [GiG G8] Add new driver for GiG G8 provider (https://gig.tech/).
-  (GITHUB-1437)
-  [Jo De Boeck - @grimpy]
-
-- Add new ``at_exit_func`` argument to ``deploy_node()`` method. With this
-  argument user can specify which function will be called before exiting
-  with the created node in question if the deploy process has been canceled
-  after the node has been created, but before the method has fully finished.
-
-  This comes handy since it simplifies various cleanup scenarios.
-  (GITHUB-1445)
-  [Tomaz Muraus - @Kami]
-
-- [OpenStack] Fix auto assignment of volume device when using device name
-  ``auto`` in the ``attach_volume`` method.
-  (GITHUB-1444)
-  [Joshua Hesketh - @jhesketh]
-
-- [Kamatera] Add new driver for Kamatera provider (https://www.kamatera.com).
-  (GITHUB-1442)
-  [Ori Hoch - @OriHoch]
-
-Storage
-~~~~~~~
-
-- Add new ``download_object_range`` and ``download_object_range_as_stream``
-  methods for downloading part of the object content (aka range downloads) to
-  the base storage API.
-
-  Currently those methods are implemented for the local storage Azure Blobs,
-  CloudFiles, S3 and any other provider driver which is based on the S3 one
-  (such as Google Storage and DigitalOcean Spaces).
-  (GITHUB-1431)
-  [Tomaz Muraus]
-
-- Add type annotations for the base storage API.
-  (GITHUB-1410)
-  [Clemens Wolff - @c-w]
-
-- [Google Storage] Update the driver so it supports service account HMAC
-  credentials.
-
-  There was a bug in the code where we used the user id length check to
-  determine the account type and that code check didn't take service
-  account HMAC credentials (which contain a longer string) into account.
-
-  Reported by Patrick Mézard - pmezard.
-  (GITHUB-1437, GITHUB-1440)
-  [Yoan Tournade - @MonsieurV]
-
-DNS
-~~~
-
-- Add type annotations for the base DNS API.
-  (GITHUB-1434)
-  [Tomaz Muraus]
-
-Container
-~~~~~~~~~
-
-- [Kubernetes] Add support for the client certificate and static token based
-  authentication to the driver.
-  (GITHUB-1421)
-  [Tomaz Muraus]
-
-- Add type annotations for the base container API.
-  (GITHUB-1435)
-  [Tomaz Muraus]
-
 Changes in Apache Libcloud v2.8.2
 ---------------------------------
 
@@ -497,179 +202,11 @@ DNS
   (GITHUB-1428, GITHUB-1429)
   [Tomaz Muraus]
 
-Changes in Apache Libcloud 3.0.0-rc1
-------------------------------------
-
-General
-~~~~~~~
-
-- This release drops support for Python versions older than 3.5.0.
-
-  If you still need to use Libcloud with Python 2.7 or Python 3.4 you can do
-  that by using the latest release which still supported those Python versions
-  (Libcloud v2.8.0).
-  (GITHUB-1377)
-  [Tomaz Muraus]
-
-- Make sure unit tests now also pass on Windows.
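Range downloads like the ``download_object_range`` methods mentioned above ultimately map to an HTTP ``Range`` request header. The sketch below is illustrative only (it assumes a non-inclusive end offset, which is why it subtracts one; the helper is not Libcloud's internal code):

```python
def range_header(start_bytes: int, end_bytes=None) -> str:
    # Build an HTTP Range header value for a partial download. In this
    # sketch end_bytes is non-inclusive, while HTTP byte ranges are
    # inclusive on both ends, hence the "- 1" adjustment. With no end
    # offset, the range extends to the end of the object.
    if end_bytes is None:
        return "bytes=%d-" % start_bytes
    return "bytes=%d-%d" % (start_bytes, end_bytes - 1)
```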
-  (GITHUB-1396)
-  [Tomaz Muraus]
-
-Compute
-~~~~~~~
-
-- [VMware vSphere] vSphere driver relies on ``pysphere`` Python library which
-  doesn't support Python 3 so it has been removed.
-
-  There is an unofficial ``pysphere`` fork which adds Python 3 support, but
-  it's out of date and not maintained (https://github.com/machalekj/pysphere/tree/2to3).
-  (GITHUB-1377)
-  [Tomaz Muraus]
-
-- [GCE] Fix ``ex_list_instancegroups`` method so it doesn't throw if ``zone``
-  attribute is not present in the response.
-
-  Reported by Kartik Subbarao (@kartiksubbarao)
-  (GITHUB-1346)
-  [Tomaz Muraus]
-
-- [AWS EC2] Add support for creating spot instances by utilizing new ``ex_spot``
-  and optionally also ``ex_spot_max_price`` keyword argument in the
-  ``create_node`` method.
-  (GITHUB-1398)
-  [Peter Yu - @yukw777]
-
-- Fix some incorrect type annotations in the base compute API.
-
-  Reported by @dpeschman.
-  (GITHUB-1413, GITHUB-1414)
-  [Tomaz Muraus]
-
-- [OpenStack] Fix error with getting node id in ``_to_floating_ip`` method
-  when region is not called ``nova``.
-  (GITHUB-1411, GITHUB-1412)
-  [Miguel Caballer - @micafer]
-
-- [KubeVirt] New KubeVirt driver with initial support for the k8s/KubeVirt
-  add-on.
-  (GITHUB-1394)
-  [Eis D. Zaster - @Eis-D-Z]
-
-Storage
-~~~~~~~
-
-- [AWS S3] Fix upload object code so uploaded data MD5 checksum check is not
-  performed at the end of the upload when AWS KMS server side encryption is
-  used.
-
-  If AWS KMS server side object encryption is used, ETag header value in the
-  response doesn't contain data MD5 digest so we can't perform a checksum
-  check.
-
-  Reported by Jonathan Harden - @jfharden.
-  (GITHUB-1401, GITHUB-1406)
-  [Tomaz Muraus - @Kami]
-
-- [Azure Blobs] Implement chunked upload in the Azure Storage driver.
-
-  Previously, the maximum object size that could be uploaded with the
-  Azure Storage driver was capped at 100 MB: the maximum size that could
-  be uploaded in a single request to Azure.
-  Chunked upload removes this
-  limitation and now enables uploading objects up to Azure's maximum block
-  blob size (~5 TB). The size of the chunks uploaded by the driver can be
-  configured via the ``LIBCLOUD_AZURE_UPLOAD_CHUNK_SIZE_MB`` environment
-  variable and defaults to 4 MB per chunk. Increasing this number trades-off
-  higher memory usage for a lower number of http requests executed by the
-  driver.
-
-  Reported by @rvolykh.
-  (GITHUB-1399, GITHUB-1400)
-  [Clemens Wolff - @c-w]
-
-- [Azure Blobs] Drop support for uploading PageBlob objects via the Azure
-  Storage driver.
-
-  Previously, both PageBlob and BlockBlob objects could be uploaded via the
-  ``upload_object`` and ``upload_object_via_stream`` methods by specifying the
-  ``ex_blob_type`` and ``ex_page_blob_size`` arguments. To simplify the API,
-  these options were removed and all uploaded objects are now of BlockBlob
-  type. Passing ``ex_blob_type`` or ``ex_page_blob_size`` will now raise a
-  ``ValueError``.
-
-  (GITHUB-1400)
-  [Clemens Wolff - @c-w]
-
-- [Common] Add ``prefix`` argument to ``iterate_container_objects`` and
-  ``list_container_objects`` to support object-list filtering in all
-  StorageDriver implementations.
-
-  A lot of the existing storage drivers already implemented the filtering
-  functionality via the ``ex_prefix`` extension argument so it was decided
-  to promote the argument to be part of the standard Libcloud storage API.
-  For any storage driver that doesn't natively implement filtering the results
-  list, a fall-back was implemented which filters the full object stream on
-  the client side.
-
-  For backward compatibility reasons, the ``ex_prefix`` argument will still
-  be respected until a next major release.
-  (GITHUB-1397)
-  [Clemens Wolff - @c-w]
-
-- [Azure Blobs] Implement ``get_object_cdn_url`` for the Azure Storage driver.
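The client-side fall-back for ``prefix`` filtering described above amounts to filtering the object stream by name. An illustrative sketch operating on plain object names (Libcloud's real fall-back works on driver ``Object`` instances, not strings):

```python
def filter_names_by_prefix(names, prefix=None):
    # Client-side fall-back: keep only names that start with the given
    # prefix, mirroring what server-side prefix filtering would return.
    # With no prefix, the full listing is passed through unchanged.
    if prefix is None:
        return list(names)
    return [name for name in names if name.startswith(prefix)]
```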
-
-  Leveraging Azure storage service shared access signatures, the Azure Storage
-  driver can now be used to generate temporary URLs that grant clients read
-  access to objects. The URLs expire after a certain period of time, either
-  configured via the ``ex_expiry`` argument or the
-  ``LIBCLOUD_AZURE_STORAGE_CDN_URL_EXPIRY_HOURS`` environment variable
-  (default: 24 hours).
-
-  Reported by @rvolykh.
-  (GITHUB-1403, GITHUB-1408)
-  [Clemens Wolff - @c-w]
-
-- [Azure Blobs, Aliyun, Local, Ninefold, S3] Ensure upload headers are
-  respected.
-
-  All storage drivers now pass the optional ``headers`` argument of
-  ``upload_object`` and ``upload_object_via_stream`` to the backend object
-  storage systems (previously the argument was silently ignored).
-
-  (GITHUB-1410)
-  [Clemens Wolff - @c-w]
-
-- [AWS S3] Implement ``get_object_cdn_url`` for the AWS storage driver.
-
-  The AWS storage driver can now be used to generate temporary URLs that
-  grant clients read access to objects. The URLs expire after a certain
-  period of time, either configured via the ``ex_expiry`` argument or the
-  ``LIBCLOUD_S3_STORAGE_CDN_URL_EXPIRY_HOURS`` environment variable
-  (default: 24 hours).
-
-  Reported by @rvolykh.
-  (GITHUB-1403)
-  [Aaron Virshup - @avirshup]
-
-DNS
-~~~
-
-- [Gandi Live] Update the driver and make sure it matches the latest service /
-  API updates.
-  (GITHUB-1416)
-  [Ryan Lee - @zepheiryan]
-
-Container
-~~~~~~~~~
-
-- [LXD] Add new LXD driver.
-  (GITHUB-1395)
-  [Alexandros Giavaras - @pockerman]
-
 Changes in Apache Libcloud v2.8.0
 ---------------------------------
 
 Common
-~~~~~~
+------
 
 - Fix a regression with ``get_driver()`` method not working if ``provider``
   argument value was a string (e.g. using ``get_driver('openstack')``
@@ -699,7 +236,7 @@ Common
   [Tomaz Muraus]
 
 Compute
-~~~~~~~
+-------
 
 - [DigitalOcean] Fix ``attach_volume`` and ``detach_volume`` methods.
 
   Previously those two methods incorrectly passed volume id instead of
@@ -756,7 +293,7 @@ Compute
   [Tomaz Muraus]
 
 Storage
-~~~~~~~
+-------
 
 - [AWS S3] Make sure ``host`` driver constructor argument has priority over
   ``region`` argument.
@@ -771,14 +308,14 @@ Changes in Apache Libcloud v2.7.0
 ---------------------------------
 
 General
-~~~~~~~
+-------
 
 - Test code with Python 3.8 and advertise that we also support Python 3.8.
   (GITHUB-1371, GITHUB-1374)
   [Tomaz Muraus]
 
 Common
-~~~~~~
+------
 
 - [OpenStack] Fix OpenStack project scoped token authentication. The driver
   constructors now accept ``ex_tenant_domain_id`` argument which tells
@@ -787,7 +324,7 @@ Common
   [kshtsk]
 
 Compute
-~~~~~~~
+-------
 
 - Introduce type annotations for the base compute API methods. This means you
   can now leverage mypy to type check (with some limitations) your code which
@@ -1646,12 +1183,11 @@ Compute
   (LIBCLOUD-952, GITHUB-1124)
   [Mika Lackman]
 
-- [UpCloud] Allow to define hostname and username.
+- [UpCloud] Allow to define hostname and username
   (LIBCLOUD-951, LIBCLOUD-953, GITHUB-1123, GITHUB-1125)
   [Mika Lackman]
 
-- [UpCloud] Add pricing information to list_sizes.
-  (LIBCLOUD-969, GITHUB-1152)
+- [UpCloud] Add pricing information to list_sizes (LIBCLOUD-969, GITHUB-1152)
   [Mika Lackman]
 
 Storage
