Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-azure-storage-file-datalake
for openSUSE:Factory checked in at 2025-01-23 18:05:04
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-azure-storage-file-datalake (Old)
and
/work/SRC/openSUSE:Factory/.python-azure-storage-file-datalake.new.5589 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-azure-storage-file-datalake"
Thu Jan 23 18:05:04 2025 rev:27 rq:1239777 version:12.18.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-azure-storage-file-datalake/python-azure-storage-file-datalake.changes 2024-11-14 16:11:11.389008788 +0100
+++ /work/SRC/openSUSE:Factory/.python-azure-storage-file-datalake.new.5589/python-azure-storage-file-datalake.changes 2025-01-23 18:06:18.154339293 +0100
@@ -1,0 +2,9 @@
+Thu Jan 23 09:26:53 UTC 2025 - John Paul Adrian Glaubitz <[email protected]>
+
+- New upstream release
+ + Version 12.18.1
+ + For detailed information about changes see the
+ CHANGELOG.md file provided with this package
+- Update Requires from setup.py
+
+-------------------------------------------------------------------
Old:
----
azure_storage_file_datalake-12.18.0.tar.gz
New:
----
azure_storage_file_datalake-12.18.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-azure-storage-file-datalake.spec ++++++
--- /var/tmp/diff_new_pack.rV5Gjt/_old 2025-01-23 18:06:18.910370525 +0100
+++ /var/tmp/diff_new_pack.rV5Gjt/_new 2025-01-23 18:06:18.910370525 +0100
@@ -1,7 +1,7 @@
#
# spec file for package python-azure-storage-file-datalake
#
-# Copyright (c) 2024 SUSE LLC
+# Copyright (c) 2025 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -18,7 +18,7 @@
%{?sle15_python_module_pythons}
Name: python-azure-storage-file-datalake
-Version: 12.18.0
+Version: 12.18.1
Release: 0
Summary: Azure DataLake service client library for Python
License: MIT
@@ -36,7 +36,7 @@
Requires: python-azure-storage-nspkg >= 3.0.0
Requires: python-isodate >= 0.6.1
Requires: (python-azure-core >= 1.30.0 with python-azure-core < 2.0.0)
-Requires: (python-azure-storage-blob >= 12.24.0 with python-azure-storage-blob < 13.0.0)
+Requires: (python-azure-storage-blob >= 12.24.1 with python-azure-storage-blob < 13.0.0)
Requires: (python-typing_extensions >= 4.6.0)
Conflicts: python-azure-sdk <= 2.0.0
%if 0%{?sle_version} >= 150400
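The rich (boolean) dependency bumped above encodes upstream's pin on azure-storage-blob. A quick sanity sketch of what the bound means (plain dotted-integer comparison with illustrative helper names, not the full PEP 440 or RPM version semantics):

```python
# Illustrative check of the spec's rich dependency:
# (python-azure-storage-blob >= 12.24.1 with python-azure-storage-blob < 13.0.0).

def parse(version: str) -> tuple:
    """Split a dotted version like '12.24.1' into a comparable int tuple."""
    return tuple(int(part) for part in version.split("."))

def satisfies(version: str, lower: str = "12.24.1", upper: str = "13.0.0") -> bool:
    """True when lower <= version < upper (upper bound exclusive)."""
    return parse(lower) <= parse(version) < parse(upper)

print(satisfies("12.24.1"))  # True  - the new lower bound itself is accepted
print(satisfies("12.24.0"))  # False - the version this bump excludes
print(satisfies("13.0.0"))   # False - next major is out of range
```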
++++++ azure_storage_file_datalake-12.18.0.tar.gz ->
azure_storage_file_datalake-12.18.1.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/azure_storage_file_datalake-12.18.0/CHANGELOG.md
new/azure_storage_file_datalake-12.18.1/CHANGELOG.md
--- old/azure_storage_file_datalake-12.18.0/CHANGELOG.md 2024-11-13
18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/CHANGELOG.md 2025-01-22
20:21:24.000000000 +0100
@@ -1,5 +1,11 @@
# Release History
+## 12.18.1 (2025-01-22)
+
+### Bugs Fixed
+- Fixed an issue where custom transports may encounter `AttributeError` on certain requests.
+- Fixed request handler to handle `None` value for `expires_on` keyword to `set_file_expiry` API.
+
## 12.18.0 (2024-11-13)
### Features Added
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/GEN1_GEN2_MAPPING.md
new/azure_storage_file_datalake-12.18.1/GEN1_GEN2_MAPPING.md
--- old/azure_storage_file_datalake-12.18.0/GEN1_GEN2_MAPPING.md
2024-11-13 18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/GEN1_GEN2_MAPPING.md
2025-01-22 20:21:24.000000000 +0100
@@ -83,7 +83,7 @@
<td>Return last bytes of file</td>
</tr>
<tr>
-<td><a
href="https://docs.microsoft.com/python/API/azure-datalake-store/azure.datalake.store.core.azuredlfilesystem?view=azure-python#read-block-fn--offset--length--delimiter-none-"><strong>read_block</strong></a></td>
+<td><a
href="https://learn.microsoft.com/python/API/azure-datalake-store/azure.datalake.store.core.azuredlfilesystem?view=azure-python#read-block-fn--offset--length--delimiter-none-"><strong>read_block</strong></a></td>
<td>Read a block of bytes from an ADL file</td>
</tr>
<tr>
@@ -116,7 +116,7 @@
<td>Set the Access Control List (ACL) for a file or folder.</td>
</tr>
<tr>
-<td><a
href="https://docs.microsoft.com/python/API/azure-datalake-store/azure.datalake.store.core.azuredlfilesystem?view=azure-python#modify-acl-entries-path--acl-spec--recursive-false--number-of-sub-process-none-"><strong>modify_acl_entries</strong></a></td>
+<td><a
href="https://learn.microsoft.com/python/API/azure-datalake-store/azure.datalake.store.core.azuredlfilesystem?view=azure-python#modify-acl-entries-path--acl-spec--recursive-false--number-of-sub-process-none-"><strong>modify_acl_entries</strong></a></td>
<td>Modify existing Access Control List (ACL) entries on a file or folder. If
the entry does not exist it is added, otherwise it is updated based on the spec
passed in. No entries are removed by this process (unlike set_acl).</td>
</tr>
<tr>
@@ -132,7 +132,7 @@
<td rowspan="3">Probably users can achieve the same purpose by calling
set_access_control with related parameters.</td>
</tr>
<tr>
-<td><a
href="https://docs.microsoft.com/python/API/azure-datalake-store/azure.datalake.store.core.azuredlfilesystem?view=azure-python#remove-acl-path-"><strong>remove_acl</strong></a></td>
+<td><a
href="https://learn.microsoft.com/python/API/azure-datalake-store/azure.datalake.store.core.azuredlfilesystem?view=azure-python#remove-acl-path-"><strong>remove_acl</strong></a></td>
<td>Remove the entire, non default, ACL from the file or folder, including
unnamed entries. Default entries cannot be removed this way, please use
remove_default_acl for that. Note: this is not recursive, and applies only to
the file or folder specified.</td>
</tr>
<tr>
@@ -140,7 +140,7 @@
<td>Remove the entire default ACL from the folder. Default entries do not
exist on files, if a file is specified, this operation does nothing. Note: this
is not recursive, and applies only to the folder specified.</td>
</tr>
<tr>
-<td><a
href="https://docs.microsoft.com/python/API/azure-datalake-store/azure.datalake.store.core.azuredlfilesystem?view=azure-python#open-path--mode--rb---blocksize-33554432--delimiter-none-"><strong>open</strong></a></td>
+<td><a
href="https://learn.microsoft.com/python/API/azure-datalake-store/azure.datalake.store.core.azuredlfilesystem?view=azure-python#open-path--mode--rb---blocksize-33554432--delimiter-none-"><strong>open</strong></a></td>
<td>Open a file for reading or writing to.</td>
<td>N/A</td>
<td>There is no open file operation In ADLS Gen2. However users can do
operations to the file directly, eg. <strong>append_data, flush_data,
download_file</strong></td>
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/azure_storage_file_datalake-12.18.0/PKG-INFO
new/azure_storage_file_datalake-12.18.1/PKG-INFO
--- old/azure_storage_file_datalake-12.18.0/PKG-INFO 2024-11-13
19:15:29.487288700 +0100
+++ new/azure_storage_file_datalake-12.18.1/PKG-INFO 2025-01-22
20:40:07.976529100 +0100
@@ -1,13 +1,13 @@
Metadata-Version: 2.1
Name: azure-storage-file-datalake
-Version: 12.18.0
+Version: 12.18.1
Summary: Microsoft Azure File DataLake Storage Client Library for Python
Home-page: https://github.com/Azure/azure-sdk-for-python
Author: Microsoft Corporation
Author-email: [email protected]
License: MIT License
Keywords: azure,azure sdk
-Classifier: Development Status :: 5 - Production/Stable
+Classifier: Development Status :: 4 - Beta
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3
@@ -21,7 +21,7 @@
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: azure-core>=1.30.0
-Requires-Dist: azure-storage-blob>=12.24.0
+Requires-Dist: azure-storage-blob>=12.24.1
Requires-Dist: typing-extensions>=4.6.0
Requires-Dist: isodate>=0.6.1
Provides-Extra: aio
@@ -39,7 +39,7 @@
| [Package (PyPi)](https://pypi.org/project/azure-storage-file-datalake/)
| [Package (Conda)](https://anaconda.org/microsoft/azure-storage/)
| [API reference
documentation](https://aka.ms/azsdk-python-storage-filedatalake-ref)
-| [Product documentation](https://docs.microsoft.com/azure/storage/)
+| [Product documentation](https://learn.microsoft.com/azure/storage/)
|
[Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/storage/azure-storage-file-datalake/samples)
@@ -48,7 +48,7 @@
### Prerequisites
* Python 3.8 or later is required to use this package. For more details,
please read our page on [Azure SDK for Python version support
policy](https://github.com/Azure/azure-sdk-for-python/wiki/Azure-SDKs-Python-version-support-policy).
* You must have an [Azure subscription](https://azure.microsoft.com/free/) and
an
-[Azure storage
account](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account)
to use this package.
+[Azure storage
account](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account)
to use this package.
### Install the package
Install the Azure DataLake Storage client library for Python with
[pip](https://pypi.org/project/pip/):
@@ -59,9 +59,9 @@
### Create a storage account
If you wish to create a new storage account, you can use the
-[Azure
Portal](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-the-azure-portal),
-[Azure
PowerShell](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-powershell),
-or [Azure
CLI](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-azure-cli):
+[Azure
Portal](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-the-azure-portal),
+[Azure
PowerShell](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-powershell),
+or [Azure
CLI](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-azure-cli):
```bash
# Create a new resource group to hold the storage account -
@@ -315,7 +315,7 @@
### Additional documentation
Table for [ADLS Gen1 to ADLS Gen2 API
Mapping](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/storage/azure-storage-file-datalake/GEN1_GEN2_MAPPING.md)
-For more extensive REST documentation on Data Lake Storage Gen2, see the [Data
Lake Storage Gen2
documentation](https://docs.microsoft.com/rest/api/storageservices/datalakestoragegen2/filesystem)
on docs.microsoft.com.
+For more extensive REST documentation on Data Lake Storage Gen2, see the [Data
Lake Storage Gen2
documentation](https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/filesystem)
on learn.microsoft.com.
## Contributing
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/azure_storage_file_datalake-12.18.0/README.md
new/azure_storage_file_datalake-12.18.1/README.md
--- old/azure_storage_file_datalake-12.18.0/README.md 2024-11-13
18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/README.md 2025-01-22
20:21:24.000000000 +0100
@@ -10,7 +10,7 @@
| [Package (PyPi)](https://pypi.org/project/azure-storage-file-datalake/)
| [Package (Conda)](https://anaconda.org/microsoft/azure-storage/)
| [API reference
documentation](https://aka.ms/azsdk-python-storage-filedatalake-ref)
-| [Product documentation](https://docs.microsoft.com/azure/storage/)
+| [Product documentation](https://learn.microsoft.com/azure/storage/)
|
[Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/storage/azure-storage-file-datalake/samples)
@@ -19,7 +19,7 @@
### Prerequisites
* Python 3.8 or later is required to use this package. For more details,
please read our page on [Azure SDK for Python version support
policy](https://github.com/Azure/azure-sdk-for-python/wiki/Azure-SDKs-Python-version-support-policy).
* You must have an [Azure subscription](https://azure.microsoft.com/free/) and
an
-[Azure storage
account](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account)
to use this package.
+[Azure storage
account](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account)
to use this package.
### Install the package
Install the Azure DataLake Storage client library for Python with
[pip](https://pypi.org/project/pip/):
@@ -30,9 +30,9 @@
### Create a storage account
If you wish to create a new storage account, you can use the
-[Azure
Portal](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-the-azure-portal),
-[Azure
PowerShell](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-powershell),
-or [Azure
CLI](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-azure-cli):
+[Azure
Portal](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-the-azure-portal),
+[Azure
PowerShell](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-powershell),
+or [Azure
CLI](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-azure-cli):
```bash
# Create a new resource group to hold the storage account -
@@ -286,7 +286,7 @@
### Additional documentation
Table for [ADLS Gen1 to ADLS Gen2 API
Mapping](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/storage/azure-storage-file-datalake/GEN1_GEN2_MAPPING.md)
-For more extensive REST documentation on Data Lake Storage Gen2, see the [Data
Lake Storage Gen2
documentation](https://docs.microsoft.com/rest/api/storageservices/datalakestoragegen2/filesystem)
on docs.microsoft.com.
+For more extensive REST documentation on Data Lake Storage Gen2, see the [Data
Lake Storage Gen2
documentation](https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/filesystem)
on learn.microsoft.com.
## Contributing
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_data_lake_file_client.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_data_lake_file_client.py
--- old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_data_lake_file_client.py 2024-11-13 18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_data_lake_file_client.py 2025-01-22 20:21:24.000000000 +0100
@@ -5,6 +5,7 @@
# --------------------------------------------------------------------------
# pylint: disable=docstring-keyword-should-match-keyword-only
+from datetime import datetime
from io import BytesIO
from typing import (
Any, AnyStr, AsyncIterable, Dict, IO, Iterable, Optional, Union,
@@ -32,7 +33,6 @@
if TYPE_CHECKING:
from azure.core.credentials import AzureNamedKeyCredential,
AzureSasCredential, TokenCredential
- from datetime import datetime
from ._models import ContentSettings
@@ -368,9 +368,9 @@
#other-client--per-operation-configuration>`_.
:rtype: None
"""
- try:
+ if isinstance(expires_on, datetime):
expires_on = convert_datetime_to_rfc1123(expires_on)
- except AttributeError:
+ elif expires_on is not None:
expires_on = str(expires_on)
self._datalake_client_for_blob_operation.path \
.set_expiry(expiry_options, expires_on=expires_on, **kwargs)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_deserialize.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_deserialize.py
--- old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_deserialize.py 2024-11-13 18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_deserialize.py 2025-01-22 20:21:24.000000000 +0100
@@ -164,7 +164,7 @@
error_dict = error_body.get('error', {})
elif not error_code:
_LOGGER.warning(
-        'Unexpected return type % from ContentDecodePolicy.deserialize_from_http_generics.', type(error_body))
+        'Unexpected return type %s from ContentDecodePolicy.deserialize_from_http_generics.', type(error_body))
error_dict = {'message': str(error_body)}
# If we extracted from a Json or XML response
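The one-character fix above matters because the old bare `%` started an unintended conversion (the following space reads as a printf flag), so the `type(error_body)` argument was never interpolated and formatting failed at emit time. A minimal reproduction of the corrected, lazy printf-style logging call:

```python
import logging

logging.basicConfig(level=logging.WARNING)
_LOGGER = logging.getLogger("azure.storage.filedatalake")

error_body = b"<unparseable body>"

# Corrected form: "%s" is a real placeholder, and with printf-style logging
# the interpolation only happens if the record is actually emitted.
_LOGGER.warning(
    'Unexpected return type %s from ContentDecodePolicy.deserialize_from_http_generics.',
    type(error_body))

# The rendered message, as a handler would format it:
msg = ('Unexpected return type %s from '
       'ContentDecodePolicy.deserialize_from_http_generics.') % type(error_body)
print(msg)  # Unexpected return type <class 'bytes'> from ContentDecodePolicy...
```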
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/aio/operations/_file_system_operations.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/aio/operations/_file_system_operations.py
--- old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/aio/operations/_file_system_operations.py 2024-11-13 18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/aio/operations/_file_system_operations.py 2025-01-22 20:21:24.000000000 +0100
@@ -79,7 +79,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param properties: Optional. User-defined properties to be stored with
the filesystem, in the
@@ -159,7 +159,7 @@
Set properties for the FileSystem. This operation supports
conditional HTTP requests. For
more information, see `Specifying Conditional Headers for Blob Service
Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
limit that is recorded in the analytics logs when storage analytics
logging is enabled. Default
@@ -167,7 +167,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param properties: Optional. User-defined properties to be stored with
the filesystem, in the
@@ -255,7 +255,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:return: None or the result of cls(response)
@@ -331,7 +331,7 @@
directories within the filesystem, will fail with status code 404 (Not
Found) while the
filesystem is being deleted. This operation supports conditional HTTP
requests. For more
information, see `Specifying Conditional Headers for Blob Service
Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
limit that is recorded in the analytics logs when storage analytics
logging is enabled. Default
@@ -339,7 +339,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param modified_access_conditions: Parameter group. Default value is
None.
@@ -425,7 +425,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param continuation: Optional. When deleting a directory, the number
of paths that are deleted
@@ -551,7 +551,7 @@
:type showonly: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/aio/operations/_path_operations.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/aio/operations/_path_operations.py
--- old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/aio/operations/_path_operations.py 2024-11-13 18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/aio/operations/_path_operations.py 2025-01-22 20:21:24.000000000 +0100
@@ -102,7 +102,7 @@
destination already exists and has a lease the lease is broken. This
operation supports
conditional HTTP requests. For more information, see `Specifying
Conditional Headers for Blob
Service Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
To fail if the destination already exists, use a conditional request
with If-None-Match: "*".
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
@@ -111,7 +111,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param resource: Required only for Create File and Create Directory.
The value must be "file"
@@ -360,7 +360,7 @@
can only be appended to a file. Concurrent writes to the same file
using multiple clients are
not supported. This operation supports conditional HTTP requests. For
more information, see
`Specifying Conditional Headers for Blob Service Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param action: The action must be "append" to upload data to be
appended to a file, "flush" to
flush previously uploaded data to a file, "setProperties" to set the
properties of a file or
@@ -385,7 +385,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param max_records: Optional. Valid for "SetAccessControlRecursive"
operation. It specifies the
@@ -623,7 +623,7 @@
Create and manage a lease to restrict write and delete access to the
path. This operation
supports conditional HTTP requests. For more information, see
`Specifying Conditional Headers
for Blob Service Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param x_ms_lease_action: There are five lease actions: "acquire",
"break", "change", "renew",
and "release". Use "acquire" and specify the "x-ms-proposed-lease-id"
and "x-ms-lease-duration"
@@ -642,7 +642,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param x_ms_lease_break_period: The lease break period duration is
optional to break a lease,
@@ -763,7 +763,7 @@
Read the contents of a file. For read operations, range requests are
supported. This operation
supports conditional HTTP requests. For more information, see
`Specifying Conditional Headers
for Blob Service Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
limit that is recorded in the analytics logs when storage analytics
logging is enabled. Default
@@ -771,7 +771,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param range: The HTTP Range request header specifies one or more byte
ranges of the resource
@@ -957,7 +957,7 @@
all system defined properties for a path. Get Access Control List
returns the access control
list for a path. This operation supports conditional HTTP requests.
For more information, see
`Specifying Conditional Headers for Blob Service Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
limit that is recorded in the analytics logs when storage analytics
logging is enabled. Default
@@ -965,7 +965,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param action: Optional. If the value is "getStatus" only the system
defined properties for the
@@ -1089,7 +1089,7 @@
Delete the file or directory. This operation supports conditional HTTP
requests. For more
information, see `Specifying Conditional Headers for Blob Service
Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
limit that is recorded in the analytics logs when storage analytics
logging is enabled. Default
@@ -1097,7 +1097,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param recursive: Required. Default value is None.
@@ -1213,7 +1213,7 @@
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param owner: Optional. The owner of the blob or directory. Default
value is None.
@@ -1337,7 +1337,7 @@
:type mode: str or
~azure.storage.filedatalake.models.PathSetAccessControlRecursiveMode
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param continuation: Optional. When deleting a directory, the number
of paths that are deleted
@@ -1452,7 +1452,7 @@
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param position: This parameter allows the caller to upload data in
parallel and control the
@@ -1662,7 +1662,7 @@
:type position: int
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param content_length: Required for "Append Data" and "Flush Data".
Must be 0 for "Flush
@@ -1820,7 +1820,7 @@
:type expiry_options: str or
~azure.storage.filedatalake.models.PathExpiryOptions
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
@@ -1898,7 +1898,7 @@
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param undelete_source: Only for hierarchical namespace enabled
accounts. Optional. The path of
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/aio/operations/_service_operations.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/aio/operations/_service_operations.py
---
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/aio/operations/_service_operations.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/aio/operations/_service_operations.py
2025-01-22 20:21:24.000000000 +0100
@@ -87,7 +87,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:return: An iterator like instance of either FileSystem or the result
of cls(response)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/operations/_file_system_operations.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/operations/_file_system_operations.py
---
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/operations/_file_system_operations.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/operations/_file_system_operations.py
2025-01-22 20:21:24.000000000 +0100
@@ -340,7 +340,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param properties: Optional. User-defined properties to be stored with
the filesystem, in the
@@ -420,7 +420,7 @@
Set properties for the FileSystem. This operation supports
conditional HTTP requests. For
more information, see `Specifying Conditional Headers for Blob Service
Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
limit that is recorded in the analytics logs when storage analytics
logging is enabled. Default
@@ -428,7 +428,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param properties: Optional. User-defined properties to be stored with
the filesystem, in the
@@ -516,7 +516,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:return: None or the result of cls(response)
@@ -592,7 +592,7 @@
directories within the filesystem, will fail with status code 404 (Not
Found) while the
filesystem is being deleted. This operation supports conditional HTTP
requests. For more
information, see `Specifying Conditional Headers for Blob Service
Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
limit that is recorded in the analytics logs when storage analytics
logging is enabled. Default
@@ -600,7 +600,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param modified_access_conditions: Parameter group. Default value is
None.
@@ -686,7 +686,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param continuation: Optional. When deleting a directory, the number
of paths that are deleted
@@ -812,7 +812,7 @@
:type showonly: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/operations/_path_operations.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/operations/_path_operations.py
---
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/operations/_path_operations.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/operations/_path_operations.py
2025-01-22 20:21:24.000000000 +0100
@@ -980,7 +980,7 @@
destination already exists and has a lease the lease is broken. This
operation supports
conditional HTTP requests. For more information, see `Specifying
Conditional Headers for Blob
Service Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
To fail if the destination already exists, use a conditional request
with If-None-Match: "*".
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
@@ -989,7 +989,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param resource: Required only for Create File and Create Directory.
The value must be "file"
@@ -1238,7 +1238,7 @@
can only be appended to a file. Concurrent writes to the same file
using multiple clients are
not supported. This operation supports conditional HTTP requests. For
more information, see
`Specifying Conditional Headers for Blob Service Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param action: The action must be "append" to upload data to be
appended to a file, "flush" to
flush previously uploaded data to a file, "setProperties" to set the
properties of a file or
@@ -1263,7 +1263,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param max_records: Optional. Valid for "SetAccessControlRecursive"
operation. It specifies the
@@ -1501,7 +1501,7 @@
Create and manage a lease to restrict write and delete access to the
path. This operation
supports conditional HTTP requests. For more information, see
`Specifying Conditional Headers
for Blob Service Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param x_ms_lease_action: There are five lease actions: "acquire",
"break", "change", "renew",
and "release". Use "acquire" and specify the "x-ms-proposed-lease-id"
and "x-ms-lease-duration"
@@ -1520,7 +1520,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param x_ms_lease_break_period: The lease break period duration is
optional to break a lease,
@@ -1641,7 +1641,7 @@
Read the contents of a file. For read operations, range requests are
supported. This operation
supports conditional HTTP requests. For more information, see
`Specifying Conditional Headers
for Blob Service Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
limit that is recorded in the analytics logs when storage analytics
logging is enabled. Default
@@ -1649,7 +1649,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param range: The HTTP Range request header specifies one or more byte
ranges of the resource
@@ -1835,7 +1835,7 @@
all system defined properties for a path. Get Access Control List
returns the access control
list for a path. This operation supports conditional HTTP requests.
For more information, see
`Specifying Conditional Headers for Blob Service Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
limit that is recorded in the analytics logs when storage analytics
logging is enabled. Default
@@ -1843,7 +1843,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param action: Optional. If the value is "getStatus" only the system
defined properties for the
@@ -1967,7 +1967,7 @@
Delete the file or directory. This operation supports conditional HTTP
requests. For more
information, see `Specifying Conditional Headers for Blob Service
Operations
-
<https://docs.microsoft.com/en-us/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
+
<https://learn.microsoft.com/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations>`_.
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
limit that is recorded in the analytics logs when storage analytics
logging is enabled. Default
@@ -1975,7 +1975,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param recursive: Required. Default value is None.
@@ -2091,7 +2091,7 @@
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param owner: Optional. The owner of the blob or directory. Default
value is None.
@@ -2215,7 +2215,7 @@
:type mode: str or
~azure.storage.filedatalake.models.PathSetAccessControlRecursiveMode
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param continuation: Optional. When deleting a directory, the number
of paths that are deleted
@@ -2330,7 +2330,7 @@
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param position: This parameter allows the caller to upload data in
parallel and control the
@@ -2540,7 +2540,7 @@
:type position: int
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param content_length: Required for "Append Data" and "Flush Data".
Must be 0 for "Flush
@@ -2698,7 +2698,7 @@
:type expiry_options: str or
~azure.storage.filedatalake.models.PathExpiryOptions
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param request_id_parameter: Provides a client-generated, opaque value
with a 1 KB character
@@ -2776,7 +2776,7 @@
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:param undelete_source: Only for hierarchical namespace enabled
accounts. Optional. The path of
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/operations/_service_operations.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/operations/_service_operations.py
---
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_generated/operations/_service_operations.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_generated/operations/_service_operations.py
2025-01-22 20:21:24.000000000 +0100
@@ -135,7 +135,7 @@
:type request_id_parameter: str
:param timeout: The timeout parameter is expressed in seconds. For
more information, see
:code:`<a
-
href="https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
+
href="https://learn.microsoft.com/rest/api/storageservices/fileservices/setting-timeouts-for-blob-service-operations">Setting
Timeouts for Blob Service Operations.</a>`. Default value is None.
:type timeout: int
:return: An iterator like instance of either FileSystem or the result
of cls(response)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_shared/authentication.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_shared/authentication.py
---
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_shared/authentication.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_shared/authentication.py
2025-01-22 20:21:24.000000000 +0100
@@ -121,7 +121,7 @@
"""
Represents a fatal error when attempting to sign a request.
In general, the cause of this exception is user error. For example, the
given account key is not valid.
- Please visit
https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account
for more info.
+ Please visit
https://learn.microsoft.com/azure/storage/common/storage-create-storage-account
for more info.
"""
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_shared/parser.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_shared/parser.py
---
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_shared/parser.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_shared/parser.py
2025-01-22 20:21:24.000000000 +0100
@@ -4,26 +4,17 @@
# license information.
# --------------------------------------------------------------------------
-import sys
from datetime import datetime, timezone
from typing import Optional
EPOCH_AS_FILETIME = 116444736000000000 # January 1, 1970 as MS filetime
HUNDREDS_OF_NANOSECONDS = 10000000
-if sys.version_info < (3,):
- def _str(value):
- if isinstance(value, unicode): # pylint: disable=undefined-variable
- return value.encode('utf-8')
-
- return str(value)
-else:
- _str = str
-
def _to_utc_datetime(value: datetime) -> str:
return value.strftime('%Y-%m-%dT%H:%M:%SZ')
+
def _rfc_1123_to_datetime(rfc_1123: str) -> Optional[datetime]:
"""Converts an RFC 1123 date string to a UTC datetime.
@@ -36,6 +27,7 @@
return datetime.strptime(rfc_1123, "%a, %d %b %Y %H:%M:%S %Z")
+
def _filetime_to_datetime(filetime: str) -> Optional[datetime]:
"""Converts an MS filetime string to a UTC datetime. "0" indicates None.
If parsing MS Filetime fails, tries RFC 1123 as backup.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_shared/policies_async.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_shared/policies_async.py
---
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_shared/policies_async.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_shared/policies_async.py
2025-01-22 20:21:24.000000000 +0100
@@ -46,11 +46,11 @@
# retry if invalid content md5
if response.context.get('validate_content', False) and
response.http_response.headers.get('content-md5'):
try:
- await response.http_response.read() # Load the body in memory and
close the socket
+ await response.http_response.load_body() # Load the body in
memory and close the socket
except (StreamClosedError, StreamConsumedError):
pass
computed_md5 = response.http_request.headers.get('content-md5', None)
or \
-
encode_base64(StorageContentValidation.get_content_md5(response.http_response.content))
+
encode_base64(StorageContentValidation.get_content_md5(response.http_response.body()))
if response.http_response.headers['content-md5'] != computed_md5:
return True
return False
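The policies_async.py change switches how the response body is loaded before MD5 validation. The retry predicate itself is unchanged: recompute the MD5 of the body and retry when it disagrees with the `content-md5` header. A hedged standalone sketch of that predicate, where `get_content_md5` stands in for `StorageContentValidation.get_content_md5`:

```python
import base64
import hashlib


def get_content_md5(data: bytes) -> str:
    # Stand-in for StorageContentValidation.get_content_md5:
    # base64-encoded MD5 digest, as used in the Content-MD5 header.
    return base64.b64encode(hashlib.md5(data).digest()).decode('utf-8')


def should_retry(body: bytes, header_md5: str) -> bool:
    # Retry the request when the recomputed MD5 does not match the header.
    return get_content_md5(body) != header_md5


body = b"hello world"
assert should_retry(body, get_content_md5(body)) is False   # intact body
assert should_retry(body, "bogus==") is True                # corrupted transfer
```

The actual policy wraps this in a try/except for closed or consumed streams, which the sketch omits.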
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_shared/request_handlers.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_shared/request_handlers.py
---
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_shared/request_handlers.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_shared/request_handlers.py
2025-01-22 20:21:24.000000000 +0100
@@ -185,7 +185,7 @@
# final line of body MUST have \r\n at the end, or it will not be properly
read by the service
batch_body.append(newline_bytes)
- return bytes().join(batch_body)
+ return b"".join(batch_body)
def _get_batch_request_delimiter(batch_id, is_prepend_dashes=False,
is_append_dashes=False):
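The request_handlers.py change is purely idiomatic: `b"".join(...)` replaces `bytes().join(...)`. Both produce the same bytes; the new form simply uses a bytes literal instead of constructing an empty object through the type call. A quick illustration with a made-up batch body:

```python
# Illustrative fragment list; the real batch body is built from request parts.
batch_body = [b"--batch_boundary\r\n", b"Content-Type: application/http\r\n", b"\r\n"]

old_style = bytes().join(batch_body)  # pre-12.18.1 spelling
new_style = b"".join(batch_body)      # 12.18.1 spelling

assert old_style == new_style
```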
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_shared/shared_access_signature.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_shared/shared_access_signature.py
---
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_shared/shared_access_signature.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_shared/shared_access_signature.py
2025-01-22 20:21:24.000000000 +0100
@@ -7,7 +7,7 @@
from datetime import date
-from .parser import _str, _to_utc_datetime
+from .parser import _to_utc_datetime
from .constants import X_MS_VERSION
from . import sign_string, url_quote
@@ -187,7 +187,7 @@
def _add_query(self, name, val):
if val:
- self.query_dict[name] = _str(val) if val is not None else None
+ self.query_dict[name] = str(val) if val is not None else None
def add_encryption_scope(self, **kwargs):
self._add_query(QueryStringConstants.SIGNED_ENCRYPTION_SCOPE,
kwargs.pop('encryption_scope', None))
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_version.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_version.py
---
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/_version.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/_version.py
2025-01-22 20:21:24.000000000 +0100
@@ -4,4 +4,4 @@
# license information.
# --------------------------------------------------------------------------
-VERSION = "12.18.0"
+VERSION = "12.18.1"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/aio/_data_lake_file_client_async.py
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/aio/_data_lake_file_client_async.py
---
old/azure_storage_file_datalake-12.18.0/azure/storage/filedatalake/aio/_data_lake_file_client_async.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure/storage/filedatalake/aio/_data_lake_file_client_async.py
2025-01-22 20:21:24.000000000 +0100
@@ -5,6 +5,7 @@
# --------------------------------------------------------------------------
# pylint: disable=invalid-overridden-method,
docstring-keyword-should-match-keyword-only
+from datetime import datetime
from typing import (
Any, AnyStr, AsyncIterable, Dict, IO, Iterable, Optional, Union,
TYPE_CHECKING)
@@ -23,7 +24,6 @@
if TYPE_CHECKING:
from azure.core.credentials import AzureNamedKeyCredential,
AzureSasCredential
from azure.core.credentials_async import AsyncTokenCredential
- from datetime import datetime
from .._models import ContentSettings
@@ -328,9 +328,9 @@
#other-client--per-operation-configuration>`_.
:rtype: None
"""
- try:
+ if isinstance(expires_on, datetime):
expires_on = convert_datetime_to_rfc1123(expires_on)
- except AttributeError:
+ elif expires_on is not None:
expires_on = str(expires_on)
await
self._datalake_client_for_blob_operation.path.set_expiry(expiry_options,
expires_on=expires_on,
**kwargs)
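The async file client hunk above replaces duck-typed `try`/`except AttributeError` with an explicit `isinstance(expires_on, datetime)` check: datetimes are converted to RFC 1123, any other non-None value is stringified. A hedged sketch of the new control flow, where `convert_to_rfc1123` is a local stand-in for the SDK's `convert_datetime_to_rfc1123` helper:

```python
from datetime import datetime


def convert_to_rfc1123(value: datetime) -> str:
    # Stand-in for convert_datetime_to_rfc1123 (assumes a naive UTC datetime).
    return value.strftime('%a, %d %b %Y %H:%M:%S GMT')


def normalize_expires_on(expires_on):
    # Mirrors the 12.18.1 logic: explicit type check instead of
    # catching AttributeError from the conversion helper.
    if isinstance(expires_on, datetime):
        return convert_to_rfc1123(expires_on)
    if expires_on is not None:
        return str(expires_on)
    return None


assert normalize_expires_on(datetime(2025, 1, 22)) == 'Wed, 22 Jan 2025 00:00:00 GMT'
assert normalize_expires_on(3600) == '3600'   # relative expiry in ms/seconds stays a string
assert normalize_expires_on(None) is None
```

The isinstance form no longer masks unrelated `AttributeError`s raised inside the conversion helper, which is the practical benefit of the change.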
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure_storage_file_datalake.egg-info/PKG-INFO
new/azure_storage_file_datalake-12.18.1/azure_storage_file_datalake.egg-info/PKG-INFO
---
old/azure_storage_file_datalake-12.18.0/azure_storage_file_datalake.egg-info/PKG-INFO
2024-11-13 19:15:29.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure_storage_file_datalake.egg-info/PKG-INFO
2025-01-22 20:40:07.000000000 +0100
@@ -1,13 +1,13 @@
Metadata-Version: 2.1
Name: azure-storage-file-datalake
-Version: 12.18.0
+Version: 12.18.1
Summary: Microsoft Azure File DataLake Storage Client Library for Python
Home-page: https://github.com/Azure/azure-sdk-for-python
Author: Microsoft Corporation
Author-email: [email protected]
License: MIT License
Keywords: azure,azure sdk
-Classifier: Development Status :: 5 - Production/Stable
+Classifier: Development Status :: 4 - Beta
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3
@@ -21,7 +21,7 @@
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: azure-core>=1.30.0
-Requires-Dist: azure-storage-blob>=12.24.0
+Requires-Dist: azure-storage-blob>=12.24.1
Requires-Dist: typing-extensions>=4.6.0
Requires-Dist: isodate>=0.6.1
Provides-Extra: aio
@@ -39,7 +39,7 @@
| [Package (PyPi)](https://pypi.org/project/azure-storage-file-datalake/)
| [Package (Conda)](https://anaconda.org/microsoft/azure-storage/)
| [API reference
documentation](https://aka.ms/azsdk-python-storage-filedatalake-ref)
-| [Product documentation](https://docs.microsoft.com/azure/storage/)
+| [Product documentation](https://learn.microsoft.com/azure/storage/)
|
[Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/storage/azure-storage-file-datalake/samples)
@@ -48,7 +48,7 @@
### Prerequisites
* Python 3.8 or later is required to use this package. For more details,
please read our page on [Azure SDK for Python version support
policy](https://github.com/Azure/azure-sdk-for-python/wiki/Azure-SDKs-Python-version-support-policy).
* You must have an [Azure subscription](https://azure.microsoft.com/free/) and
an
-[Azure storage
account](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account)
to use this package.
+[Azure storage
account](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account)
to use this package.
### Install the package
Install the Azure DataLake Storage client library for Python with
[pip](https://pypi.org/project/pip/):
@@ -59,9 +59,9 @@
### Create a storage account
If you wish to create a new storage account, you can use the
-[Azure
Portal](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-the-azure-portal),
-[Azure
PowerShell](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-powershell),
-or [Azure
CLI](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-azure-cli):
+[Azure
Portal](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-the-azure-portal),
+[Azure
PowerShell](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-powershell),
+or [Azure
CLI](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-azure-cli):
```bash
# Create a new resource group to hold the storage account -
@@ -315,7 +315,7 @@
### Additional documentation
Table for [ADLS Gen1 to ADLS Gen2 API
Mapping](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/storage/azure-storage-file-datalake/GEN1_GEN2_MAPPING.md)
-For more extensive REST documentation on Data Lake Storage Gen2, see the [Data
Lake Storage Gen2
documentation](https://docs.microsoft.com/rest/api/storageservices/datalakestoragegen2/filesystem)
on docs.microsoft.com.
+For more extensive REST documentation on Data Lake Storage Gen2, see the [Data
Lake Storage Gen2
documentation](https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/filesystem)
on learn.microsoft.com.
## Contributing
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure_storage_file_datalake.egg-info/SOURCES.txt
new/azure_storage_file_datalake-12.18.1/azure_storage_file_datalake.egg-info/SOURCES.txt
---
old/azure_storage_file_datalake-12.18.0/azure_storage_file_datalake.egg-info/SOURCES.txt
2024-11-13 19:15:29.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure_storage_file_datalake.egg-info/SOURCES.txt
2025-01-22 20:40:07.000000000 +0100
@@ -105,6 +105,7 @@
tests/test_file_async.py
tests/test_file_system.py
tests/test_file_system_async.py
+tests/test_helpers_async.py
tests/test_large_file.py
tests/test_large_file_async.py
tests/test_quick_query.py
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/azure_storage_file_datalake.egg-info/requires.txt
new/azure_storage_file_datalake-12.18.1/azure_storage_file_datalake.egg-info/requires.txt
---
old/azure_storage_file_datalake-12.18.0/azure_storage_file_datalake.egg-info/requires.txt
2024-11-13 19:15:29.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/azure_storage_file_datalake.egg-info/requires.txt
2025-01-22 20:40:07.000000000 +0100
@@ -1,5 +1,5 @@
azure-core>=1.30.0
-azure-storage-blob>=12.24.0
+azure-storage-blob>=12.24.1
typing-extensions>=4.6.0
isodate>=0.6.1
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/samples/README.md
new/azure_storage_file_datalake-12.18.1/samples/README.md
--- old/azure_storage_file_datalake-12.18.0/samples/README.md 2024-11-13
18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/samples/README.md 2025-01-22
20:21:24.000000000 +0100
@@ -40,7 +40,7 @@
## Prerequisites
* Python 3.6 or later is required to use this package
* You must have an [Azure subscription](https://azure.microsoft.com/free/) and
an
-[Azure storage
account](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account)
to run these samples.
+[Azure storage
account](https://learn.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account)
to run these samples.
## Setup
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/azure_storage_file_datalake-12.18.0/setup.py
new/azure_storage_file_datalake-12.18.1/setup.py
--- old/azure_storage_file_datalake-12.18.0/setup.py 2024-11-13
18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/setup.py 2025-01-22
20:21:24.000000000 +0100
@@ -57,7 +57,7 @@
url='https://github.com/Azure/azure-sdk-for-python',
keywords="azure, azure sdk",
classifiers=[
- 'Development Status :: 5 - Production/Stable',
+ 'Development Status :: 4 - Beta',
'Programming Language :: Python',
'Programming Language :: Python :: 3 :: Only',
'Programming Language :: Python :: 3',
@@ -78,7 +78,7 @@
python_requires=">=3.8",
install_requires=[
"azure-core>=1.30.0",
- "azure-storage-blob>=12.24.0",
+ "azure-storage-blob>=12.24.1",
"typing-extensions>=4.6.0",
"isodate>=0.6.1"
],
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/tests/settings/testcase.py
new/azure_storage_file_datalake-12.18.1/tests/settings/testcase.py
--- old/azure_storage_file_datalake-12.18.0/tests/settings/testcase.py
2024-11-13 18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/tests/settings/testcase.py
2025-01-22 20:21:24.000000000 +0100
@@ -4,8 +4,6 @@
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------
-from __future__ import division
-
import functools
import os.path
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/tests/test_datalake_service_client.py
new/azure_storage_file_datalake-12.18.1/tests/test_datalake_service_client.py
---
old/azure_storage_file_datalake-12.18.0/tests/test_datalake_service_client.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/tests/test_datalake_service_client.py
2025-01-22 20:21:24.000000000 +0100
@@ -41,7 +41,7 @@
self._assert_logging_equal(prop['analytics_logging'],
AnalyticsLogging())
self._assert_metrics_equal(prop['hour_metrics'], Metrics())
self._assert_metrics_equal(prop['minute_metrics'], Metrics())
- self._assert_cors_equal(prop['cors'], list())
+ self._assert_cors_equal(prop['cors'], [])
def _assert_logging_equal(self, log1, log2):
if log1 is None or log2 is None:
@@ -122,7 +122,7 @@
analytics_logging=AnalyticsLogging(),
hour_metrics=Metrics(),
minute_metrics=Metrics(),
- cors=list(),
+ cors=[],
target_version='2014-02-14'
)
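The hunks above (and the matching async ones below) replace `list()` with the literal `[]`. A minimal sketch, not part of the diff, showing the two forms build identical objects while the literal avoids a global name lookup and a call:

```python
import dis

# Both forms produce an empty list of the same type.
a = list()
b = []
assert a == b and type(a) is type(b) is list

# The literal compiles to a single BUILD_LIST instruction,
# while list() needs a name load plus a call:
dis.dis(compile("[]", "<s>", "eval"))
dis.dis(compile("list()", "<s>", "eval"))
```

The literal is also immune to `list` being shadowed in the enclosing scope, which is why it is the idiomatic choice.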
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/tests/test_datalake_service_client_async.py
new/azure_storage_file_datalake-12.18.1/tests/test_datalake_service_client_async.py
---
old/azure_storage_file_datalake-12.18.0/tests/test_datalake_service_client_async.py
2024-11-13 18:57:16.000000000 +0100
+++
new/azure_storage_file_datalake-12.18.1/tests/test_datalake_service_client_async.py
2025-01-22 20:21:24.000000000 +0100
@@ -46,7 +46,7 @@
self._assert_logging_equal(prop['analytics_logging'],
AnalyticsLogging())
self._assert_metrics_equal(prop['hour_metrics'], Metrics())
self._assert_metrics_equal(prop['minute_metrics'], Metrics())
- self._assert_cors_equal(prop['cors'], list())
+ self._assert_cors_equal(prop['cors'], [])
def _assert_logging_equal(self, log1, log2):
if log1 is None or log2 is None:
@@ -127,7 +127,7 @@
analytics_logging=AnalyticsLogging(),
hour_metrics=Metrics(),
minute_metrics=Metrics(),
- cors=list(),
+ cors=[],
target_version='2014-02-14'
)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/tests/test_directory.py
new/azure_storage_file_datalake-12.18.1/tests/test_directory.py
--- old/azure_storage_file_datalake-12.18.0/tests/test_directory.py
2024-11-13 18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/tests/test_directory.py
2025-01-22 20:21:24.000000000 +0100
@@ -521,7 +521,7 @@
num_file_per_sub_dir = 5
self._create_sub_directory_and_files(directory_client, num_sub_dirs,
num_file_per_sub_dir)
- response_list = list()
+ response_list = []
def callback(response):
response_list.append(response)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/tests/test_directory_async.py
new/azure_storage_file_datalake-12.18.1/tests/test_directory_async.py
--- old/azure_storage_file_datalake-12.18.0/tests/test_directory_async.py
2024-11-13 18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/tests/test_directory_async.py
2025-01-22 20:21:24.000000000 +0100
@@ -522,7 +522,7 @@
num_file_per_sub_dir = 5
await self._create_sub_directory_and_files(directory_client,
num_sub_dirs, num_file_per_sub_dir)
- response_list = list()
+ response_list = []
def callback(response):
response_list.append(response)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/tests/test_file.py
new/azure_storage_file_datalake-12.18.1/tests/test_file.py
--- old/azure_storage_file_datalake-12.18.0/tests/test_file.py 2024-11-13
18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/tests/test_file.py 2025-01-22
20:21:24.000000000 +0100
@@ -1188,13 +1188,18 @@
content_disposition='inline')
expiry_time = self.get_datetime_variable(variables, 'expiry_time',
datetime.utcnow() + timedelta(hours=1))
file_client = directory_client.create_file("newfile",
metadata=metadata, content_settings=content_settings)
+
+ # Act / Assert
file_client.set_file_expiry("Absolute", expires_on=expiry_time)
properties = file_client.get_file_properties()
-
- # Assert
assert properties
assert properties.expiry_time is not None
+ file_client.set_file_expiry("NeverExpire")
+ properties = file_client.get_file_properties()
+ assert properties
+ assert properties.expiry_time is None
+
return variables
@DataLakePreparer()
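The hunk above extends the expiry test to cover the "NeverExpire" mode in addition to "Absolute". A toy model (hypothetical classes, not the Azure client) sketching the semantics the test asserts: "Absolute" records the given expiry time, "NeverExpire" clears it:

```python
from datetime import datetime, timedelta, timezone


class FakeFileProperties:
    """Stand-in for the properties object returned by get_file_properties()."""
    def __init__(self):
        self.expiry_time = None


class FakeFileClient:
    """Toy model of the two expiry modes exercised by the test."""
    def __init__(self):
        self._props = FakeFileProperties()

    def set_file_expiry(self, expiry_options: str, expires_on: datetime = None):
        if expiry_options == "Absolute":
            self._props.expiry_time = expires_on
        elif expiry_options == "NeverExpire":
            self._props.expiry_time = None
        else:
            raise ValueError(f"unsupported mode: {expiry_options}")

    def get_file_properties(self) -> FakeFileProperties:
        return self._props


client = FakeFileClient()
client.set_file_expiry(
    "Absolute", expires_on=datetime.now(timezone.utc) + timedelta(hours=1))
assert client.get_file_properties().expiry_time is not None

client.set_file_expiry("NeverExpire")
assert client.get_file_properties().expiry_time is None
```

This mirrors the assertion order in the updated test: set an absolute expiry, confirm `expiry_time` is populated, then switch to "NeverExpire" and confirm it is cleared.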
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/tests/test_file_async.py
new/azure_storage_file_datalake-12.18.1/tests/test_file_async.py
--- old/azure_storage_file_datalake-12.18.0/tests/test_file_async.py
2024-11-13 18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/tests/test_file_async.py
2025-01-22 20:21:24.000000000 +0100
@@ -35,6 +35,7 @@
from devtools_testutils.aio import recorded_by_proxy_async
from devtools_testutils.storage.aio import AsyncStorageRecordedTestCase
from settings.testcase import DataLakePreparer
+from test_helpers_async import AsyncStream, MockStorageTransport
#
------------------------------------------------------------------------------
TEST_DIRECTORY_PREFIX = 'directory'
@@ -1137,13 +1138,18 @@
content_disposition='inline')
expiry_time = self.get_datetime_variable(variables, 'expiry_time',
datetime.utcnow() + timedelta(hours=1))
file_client = await directory_client.create_file("newfile",
metadata=metadata, content_settings=content_settings)
+
+ # Act / Assert
await file_client.set_file_expiry("Absolute", expires_on=expiry_time)
properties = await file_client.get_file_properties()
-
- # Assert
assert properties
assert properties.expiry_time is not None
+ await file_client.set_file_expiry("NeverExpire")
+ properties = await file_client.get_file_properties()
+ assert properties
+ assert properties.expiry_time is None
+
return variables
@DataLakePreparer()
@@ -1531,6 +1537,62 @@
await fc.get_file_properties()
await fc.upload_data(data, overwrite=True)
+ @DataLakePreparer()
+ async def test_mock_transport_no_content_validation(self, **kwargs):
+ datalake_storage_account_name =
kwargs.pop("datalake_storage_account_name")
+ datalake_storage_account_key =
kwargs.pop("datalake_storage_account_key")
+
+ transport = MockStorageTransport()
+ file_client = DataLakeFileClient(
+ self.account_url(datalake_storage_account_name, 'dfs'),
+ "filesystem/",
+ "dir/file.txt",
+ credential=datalake_storage_account_key,
+ transport=transport,
+ retry_total=0
+ )
+
+ data = await file_client.download_file()
+ assert data is not None
+
+ props = await file_client.get_file_properties()
+ assert props is not None
+
+ data = b"Hello Async World!"
+ stream = AsyncStream(data)
+ resp = await file_client.upload_data(stream, overwrite=True)
+ assert resp is not None
+
+ file_data = await (await file_client.download_file()).read()
+ assert file_data == b"Hello Async World!" # data is fixed by mock
transport
+
+ resp = await file_client.delete_file()
+ assert resp is not None
+
+ @DataLakePreparer()
+ async def test_mock_transport_with_content_validation(self, **kwargs):
+ datalake_storage_account_name =
kwargs.pop("datalake_storage_account_name")
+ datalake_storage_account_key =
kwargs.pop("datalake_storage_account_key")
+
+ await self._setUp(datalake_storage_account_name,
datalake_storage_account_key)
+
+ transport = MockStorageTransport()
+ file_client = DataLakeFileClient(
+ self.account_url(datalake_storage_account_name, 'dfs'),
+ "filesystem/",
+ "dir/file.txt",
+ credential=datalake_storage_account_key,
+ transport=transport,
+ retry_total=0
+ )
+
+ data = b"Hello Async World!"
+ stream = AsyncStream(data)
+ resp = await file_client.upload_data(stream, overwrite=True,
validate_content=True)
+ assert resp is not None
+
+ file_data = await (await
file_client.download_file(validate_content=True)).read()
+ assert file_data == b"Hello Async World!" # data is fixed by mock
transport
#
------------------------------------------------------------------------------
if __name__ == '__main__':
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/tests/test_file_system_async.py
new/azure_storage_file_datalake-12.18.1/tests/test_file_system_async.py
--- old/azure_storage_file_datalake-12.18.0/tests/test_file_system_async.py
2024-11-13 18:57:16.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/tests/test_file_system_async.py
2025-01-22 20:21:24.000000000 +0100
@@ -1144,7 +1144,7 @@
)
sas_directory_client = FileSystemClient(self.dsc.url, file_system_name,
credential=token)
- paths = list()
+ paths = []
async for path in sas_directory_client.get_paths():
paths.append(path)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/azure_storage_file_datalake-12.18.0/tests/test_helpers_async.py
new/azure_storage_file_datalake-12.18.1/tests/test_helpers_async.py
--- old/azure_storage_file_datalake-12.18.0/tests/test_helpers_async.py
1970-01-01 01:00:00.000000000 +0100
+++ new/azure_storage_file_datalake-12.18.1/tests/test_helpers_async.py
2025-01-22 20:21:24.000000000 +0100
@@ -0,0 +1,170 @@
+# -------------------------------------------------------------------------
+# Copyright (c) Microsoft Corporation. All rights reserved.
+# Licensed under the MIT License. See License.txt in the project root for
+# license information.
+# --------------------------------------------------------------------------
+from typing import Any, Dict
+from urllib.parse import urlparse
+
+from azure.core.pipeline.transport import AsyncHttpTransport
+from azure.core.rest import HttpRequest
+from azure.core.rest._aiohttp import RestAioHttpTransportResponse
+from aiohttp import ClientResponse
+
+
+class AsyncStream:
+ def __init__(self, data: bytes):
+ self._data = data
+ self._offset = 0
+
+ def __len__(self) -> int:
+ return len(self._data)
+
+ async def read(self, size: int = -1) -> bytes:
+ if size == -1:
+ return self._data
+
+ start = self._offset
+ end = self._offset + size
+ data = self._data[start:end]
+ self._offset += len(data)
+
+ return data
+
+
+class MockAioHttpClientResponse(ClientResponse):
+ def __init__(
+ self, url: str,
+ body_bytes: bytes,
+ headers: Dict[str, Any],
+ status: int = 200,
+ reason: str = "OK"
+ ) -> None:
+ super(MockAioHttpClientResponse).__init__()
+ self._url = url
+ self._body = body_bytes
+ self._headers = headers
+ self._cache = {}
+ self._loop = None
+ self.status = status
+ self.reason = reason
+
+
+class MockStorageTransport(AsyncHttpTransport):
+ """
+ This transport returns legacy http response objects from azure core and is
+ intended only to test our backwards compatibility support.
+ """
+ async def send(self, request: HttpRequest, **kwargs: Any) ->
RestAioHttpTransportResponse:
+ if request.method == 'GET':
+ # download_file
+ headers = {
+ "Content-Type": "application/octet-stream",
+ "Content-Range": "bytes 0-17/18",
+ "Content-Length": "18",
+ }
+
+ if "x-ms-range-get-content-md5" in request.headers:
+ headers["Content-MD5"] = "I3pVbaOCUTom+G9F9uKFoA=="
+
+ rest_response = RestAioHttpTransportResponse(
+ request=request,
+ internal_response=MockAioHttpClientResponse(
+ request.url,
+ b"Hello Async World!",
+ headers,
+ ),
+ decompress=False
+ )
+ elif request.method == 'HEAD':
+ # get_file_properties
+ rest_response = RestAioHttpTransportResponse(
+ request=request,
+ internal_response=MockAioHttpClientResponse(
+ request.url,
+ b"",
+ {
+ "Content-Type": "application/octet-stream",
+ "Content-Length": "1024",
+ },
+ ),
+ decompress=False
+ )
+ elif request.method == 'PUT':
+ # upload_data
+ rest_response = RestAioHttpTransportResponse(
+ request=request,
+ internal_response=MockAioHttpClientResponse(
+ request.url,
+ b"",
+ {
+ "Content-Length": "0",
+ },
+ 201,
+ "Created"
+ ),
+ decompress=False
+ )
+ elif request.method == 'PATCH':
+ # upload_data_chunks
+ parsed = urlparse(request.url)
+ if "action=flush" in parsed.query:
+ rest_response = RestAioHttpTransportResponse(
+ request=request,
+ internal_response=MockAioHttpClientResponse(
+ request.url,
+ b"",
+ {
+ "Content-Length": "0",
+ },
+ 200,
+ "OK"
+ ),
+ decompress=False
+ )
+ else:
+ rest_response = RestAioHttpTransportResponse(
+ request=request,
+ internal_response=MockAioHttpClientResponse(
+ request.url,
+ b"",
+ {
+ "Content-Length": "0",
+ },
+ 202,
+ "Accepted"
+ ),
+ decompress=False
+ )
+ elif request.method == 'DELETE':
+ # delete_file
+ rest_response = RestAioHttpTransportResponse(
+ request=request,
+ internal_response=MockAioHttpClientResponse(
+ request.url,
+ b"",
+ {
+ "Content-Length": "0",
+ },
+ 202,
+ "Accepted"
+ ),
+ decompress=False
+ )
+ else:
+ raise ValueError("The request is not accepted as part of
MockStorageTransport.")
+
+ await rest_response.read()
+ return rest_response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, *args):
+ pass
+
+ async def open(self):
+ pass
+
+ async def close(self):
+ pass
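The new `AsyncStream` helper above exposes only an async `read()`, advancing an internal offset on sized reads while `read(-1)` returns the full buffer. A standalone re-implementation (copied from the diff so its chunking behavior can be exercised without the Azure SDK):

```python
import asyncio


class AsyncStream:
    """Minimal async byte stream matching the helper in the diff."""
    def __init__(self, data: bytes):
        self._data = data
        self._offset = 0

    def __len__(self) -> int:
        return len(self._data)

    async def read(self, size: int = -1) -> bytes:
        # Matches the helper: read(-1) returns the whole buffer
        # without consulting or advancing the offset.
        if size == -1:
            return self._data
        data = self._data[self._offset:self._offset + size]
        self._offset += len(data)
        return data


async def drain(stream: AsyncStream, chunk_size: int) -> list:
    """Read the stream to exhaustion in fixed-size chunks."""
    chunks = []
    while True:
        chunk = await stream.read(chunk_size)
        if not chunk:
            break
        chunks.append(chunk)
    return chunks


chunks = asyncio.run(drain(AsyncStream(b"Hello Async World!"), 5))
print(chunks)  # four chunks: three of 5 bytes, one of 3
```

This is the shape the upload path in the new tests relies on: `upload_data` pulls sized chunks until an empty read signals end of stream.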