Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package python-h5py for openSUSE:Factory 
checked in at 2023-08-23 14:58:47
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-h5py (Old)
 and      /work/SRC/openSUSE:Factory/.python-h5py.new.1766 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-h5py"

Wed Aug 23 14:58:47 2023 rev:26 rq:1105464 version:3.9.0

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-h5py/python-h5py.changes  2023-02-16 16:57:22.376334183 +0100
+++ /work/SRC/openSUSE:Factory/.python-h5py.new.1766/python-h5py.changes        2023-08-23 14:59:55.366237773 +0200
@@ -1,0 +2,47 @@
+Tue Aug 22 18:22:43 UTC 2023 - Ben Greiner <c...@bnavigator.de>
+
+- Update to 3.9.0
+  * This version of h5py requires Python 3.8 or above.
+  ## New features
+  * New out argument to read_direct_chunk() to allow passing the
+    output buffer (PR 2232).
+  * The objects from Dataset.asstr() and Dataset.astype() now
+    implement the __array__() method (PR 2269). This speeds up
+    access for functions that support it, such as np.asarray().
+  * Validate key types when creating groups and attributes, giving
+    better error messages when invalid types are used (PR 2266).
+  ## Deprecations & removals
+  * Using Dataset.astype() as a context manager has been removed,
+    after being deprecated in h5py 3.6. Read data by slicing the
+    returned object instead: dset.astype('f4')[:].
+  ## Exposing HDF5 functions
+  * H5Pget_elink_acc_flags & H5Pset_elink_acc_flags as
+    h5py.h5p.PropLAID.get_elink_acc_flags() &
+    h5py.h5p.PropLAID.set_elink_acc_flags(): access the external
+    link file access traversal flags in a link access property list
+    (PR 2244).
+  * H5Zregister as h5py.h5z.register_filter(): register an HDF5
+    filter (PR 2229).
+  ## Bug fixes
+  * Group.__contains__ and Group.get now use the default link
+    access property list systematically (PR 2244).
+  * Removed various calls to the deprecated numpy.product function
+    (PR 2242 & PR 2273).
+  * Fix the IPython tab-completion integration in IPython 8.12 (PR
+    2256).
+  * Replacing attributes with AttributeManager.create() now deletes
+    the old attributes before creating the new one, rather than
+    using a temporary name and renaming the new attribute (PR
+    2274). This should avoid some confusing bugs affecting
+    attributes. However, failures creating an attribute are less
+    likely to leave an existing attribute of the same name in
+    place. To change an attribute value without changing its shape
+    or dtype, use modify() instead.
+  ## Building h5py
+  * When building with Parallel HDF5 support, the version of mpi4py
+    used on various Python versions is increased to 3.1.1, fixing
+    building with a newer setuptools (PR 2225).
+  * Some fixes towards compatibility with the upcoming Cython 3 (PR
+    2247).
+
+-------------------------------------------------------------------
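For context on the `__array__()` entries above: implementing `__array__` is what lets `np.asarray()` consume the wrapper objects directly. A minimal illustrative sketch (a hypothetical stand-in class, not h5py's actual wrapper, which reads from an HDF5 dataset):

```python
import numpy as np

class AstypeWrapperSketch:
    """Hypothetical stand-in mimicking h5py's astype() wrapper: because it
    implements __array__, np.asarray() can convert it in one step."""

    def __init__(self, data, dtype):
        self._data = data              # would be an h5py.Dataset in real code
        self._dtype = np.dtype(dtype)

    def __getitem__(self, args):
        # Read a slice, converting to the requested dtype on the way out
        return np.asarray(self._data[args], dtype=self._dtype)

    def __array__(self, dtype=None):
        # Same shape as the diff in this commit: read everything, then
        # optionally convert again if numpy asked for a specific dtype.
        data = self[:]
        if dtype is not None:
            data = data.astype(dtype)
        return data

wrapped = AstypeWrapperSketch(np.arange(4, dtype='i8'), 'f4')
arr = np.asarray(wrapped)   # works because of __array__
```

The same pattern (read, then decode or convert) is what makes `np.asarray(dset.asstr())` fast in 3.9.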

Old:
----
  h5py-3.8.0.tar.gz

New:
----
  h5py-3.9.0.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-h5py.spec ++++++
--- /var/tmp/diff_new_pack.D8EuFm/_old  2023-08-23 14:59:56.026238953 +0200
+++ /var/tmp/diff_new_pack.D8EuFm/_new  2023-08-23 14:59:56.034238967 +0200
@@ -60,16 +60,16 @@
 %endif
 # /SECTION MPI DEFINITIONS
 Name:           %{pname}%{?my_suffix}
-Version:        3.8.0
+Version:        3.9.0
 Release:        0
 Summary:        Python interface to the Hierarchical Data Format library
 License:        BSD-3-Clause
 Group:          Development/Libraries/Python
 URL:            https://github.com/h5py/h5py
Source:         https://files.pythonhosted.org/packages/source/h/h5py/h5py-%{version}.tar.gz
-BuildRequires:  %{python_module Cython >= 0.29}
-BuildRequires:  %{python_module devel >= 3.7}
-BuildRequires:  %{python_module numpy-devel >= 1.14.5}
+BuildRequires:  %{python_module Cython >= 0.29 with %python-Cython < 1}
+BuildRequires:  %{python_module devel >= 3.8}
+BuildRequires:  %{python_module numpy-devel >= 1.17.3}
 BuildRequires:  %{python_module pip}
 BuildRequires:  %{python_module pkgconfig}
 BuildRequires:  %{python_module pytest}
@@ -80,12 +80,13 @@
 BuildRequires:  python-rpm-macros
 %requires_eq    hdf5%{?my_suffix}
 %requires_eq    libhdf5%{?my_suffix}
-Requires:       python-numpy >= 1.14.5
+Requires:       python-numpy >= 1.17.3
 %if %{with mpi}
 BuildRequires:  %{mpi_flavor}%{mpi_vers}-devel
-BuildRequires:  %{python_module mpi4py >= 3.0.2}
+BuildRequires:  %{python_module mpi4py >= 3.1.1 if %python-base < 3.11}
+BuildRequires:  %{python_module mpi4py >= 3.1.4 if %python-base >= 3.11}
 BuildRequires:  %{python_module pytest-mpi}
-Requires:       python-mpi4py >= 3.0.2
+Requires:       python-mpi4py >= 3.1.1
 %endif
 %python_subpackages
 
@@ -119,7 +120,12 @@
 %python_expand %fdupes %{buildroot}%{my_sitearch_in_expand}/h5py/
 
 %check
-%{python_expand # Offset test fails on 32-bit
+donttest="dummytest"
+%ifarch %{ix86} %{arm}
+# overflow
+donttest="test_float_round_tripping or test_register_filter"
+%endif
+%{python_expand #
 %if %{with mpi}
 source %{my_bindir}/mpivars.sh
 %endif
@@ -127,11 +133,7 @@
 export PYTHONPATH=%{buildroot}%{my_sitearch_in_expand}
 export PYTHONDONTWRITEBYTECODE=1
 pytest-%{$python_bin_suffix} %{buildroot}%{my_sitearch_in_expand}/h5py/ \
-%ifarch %{ix86}
-        %{?with_mpi:-k 'not test_float_round_tripping' -m 'not mpi_skip'}%{!?with_mpi:-k 'not (TestMPI or test_float_round_tripping)'}
-%else
-        %{?with_mpi:-m 'not mpi_skip'}%{!?with_mpi:-k 'not TestMPI'}
-%endif
+%{?with_mpi:-k "not ($donttest)" -m 'not mpi_skip'}%{!?with_mpi:-k "not (TestMPI or $donttest)"}
 }
 
 %files %{python_files}

++++++ h5py-3.8.0.tar.gz -> h5py-3.9.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/.gitignore new/h5py-3.9.0/.gitignore
--- old/h5py-3.8.0/.gitignore   2022-12-21 17:13:47.000000000 +0100
+++ new/h5py-3.9.0/.gitignore   1970-01-01 01:00:00.000000000 +0100
@@ -1,40 +0,0 @@
-h5py/h5*.c
-h5py/utils.c
-h5py/_conv.c
-h5py/_proxy.c
-h5py/_objects.c
-h5py/_errors.c
-h5py/_selector.c
-h5py/config.pxi
-h5py/_hdf5.*
-h5py/*.dll
-h5py/*.so
-*.hdf5
-h5config.json
-h5py/defs.*
-build/
-*.pyc
-*.pyd
-dist/
-MANIFEST
-.DS_Store
-/docs/_build
-/docs_api/_build
-/.tox
-/h5py.egg-info
-/*.egg
-/.asv
-.eggs/
-*.so
-*~
-*.swp
-.pytest_cache
-.coverage
-.coverage_dir
-coverage.xml
-.vscode/
-.mypy_cache/
-
-# Rever
-rever/
-.idea/
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/PKG-INFO new/h5py-3.9.0/PKG-INFO
--- old/h5py-3.8.0/PKG-INFO     2023-01-23 11:21:36.018553500 +0100
+++ new/h5py-3.9.0/PKG-INFO     2023-06-20 11:06:12.781311300 +0200
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: h5py
-Version: 3.8.0
+Version: 3.9.0
 Summary: Read and write HDF5 files from Python
 Author-email: Andrew Collette <andrew.colle...@gmail.com>
Maintainer-email: Thomas Kluyver <tho...@kluyver.me.uk>, Thomas A Caswell <tcasw...@bnl.gov>
@@ -26,7 +26,7 @@
 Classifier: Topic :: Scientific/Engineering
 Classifier: Topic :: Database
 Classifier: Topic :: Software Development :: Libraries :: Python Modules
-Requires-Python: >=3.7
+Requires-Python: >=3.8
 Description-Content-Type: text/x-rst
 License-File: LICENSE
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/README.rst new/h5py-3.9.0/README.rst
--- old/h5py-3.8.0/README.rst   2023-01-20 17:24:30.000000000 +0100
+++ new/h5py-3.9.0/README.rst   2023-06-07 13:39:16.000000000 +0200
@@ -20,7 +20,7 @@
 Installation
 ------------
 
-Pre-build `h5py` can either be installed via your Python Distribution (e.g.
+Pre-built `h5py` can either be installed via your Python Distribution (e.g.
 `Continuum Anaconda`_, `Enthought Canopy`_) or from `PyPI`_ via `pip`_.
 `h5py` is also distributed in many Linux Distributions (e.g. Ubuntu, Fedora),
 and in the macOS package managers `Homebrew <https://brew.sh/>`_,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/docs/conf.py new/h5py-3.9.0/docs/conf.py
--- old/h5py-3.8.0/docs/conf.py 2023-01-23 11:18:46.000000000 +0100
+++ new/h5py-3.9.0/docs/conf.py 2023-06-20 10:56:46.000000000 +0200
@@ -62,7 +62,7 @@
 # built documents.
 #
 # The full version, including alpha/beta/rc tags.
-release = '3.8.0'
+release = '3.9.0'
 # The short X.Y version.
 version = '.'.join(release.split('.')[:2])
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/docs/high/dataset.rst 
new/h5py-3.9.0/docs/high/dataset.rst
--- old/h5py-3.8.0/docs/high/dataset.rst        2023-01-04 18:26:33.000000000 +0100
+++ new/h5py-3.9.0/docs/high/dataset.rst        2023-04-26 17:08:31.000000000 +0200
@@ -503,12 +503,8 @@
             >>> out.dtype
             dtype('int16')
 
-        .. versionchanged:: 3.0
-           Allowed reading through the wrapper object. In earlier versions,
-           :meth:`astype` had to be used as a context manager:
-
-               >>> with dset.astype('int16'):
-               ...     out = dset[:]
+        .. versionchanged:: 3.9
+           :meth:`astype` can no longer be used as a context manager.
 
     .. method:: asstr(encoding=None, errors='strict')
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/docs/high/file.rst 
new/h5py-3.9.0/docs/high/file.rst
--- old/h5py-3.8.0/docs/high/file.rst   2023-01-04 18:26:33.000000000 +0100
+++ new/h5py-3.9.0/docs/high/file.rst   2023-02-17 11:43:33.000000000 +0100
@@ -115,6 +115,13 @@
         The argument values must be ``bytes`` objects. All three arguments are
         required to activate AWS authentication.
 
+        .. note::
+           Pre-built h5py packages on PyPI do not include this S3 support. If
+           you want this feature, you could use packages from conda-forge, or
+           :ref:`build h5py from source <source_install>` against an HDF5 build
+           with S3 support. Alternatively, use the :ref:`file-like object
+           <file_fileobj>` support with a package like s3fs.
+
 
 .. _file_fileobj:
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/docs/whatsnew/3.9.rst 
new/h5py-3.9.0/docs/whatsnew/3.9.rst
--- old/h5py-3.8.0/docs/whatsnew/3.9.rst        1970-01-01 01:00:00.000000000 +0100
+++ new/h5py-3.9.0/docs/whatsnew/3.9.rst        2023-06-20 10:56:46.000000000 +0200
@@ -0,0 +1,56 @@
+What's new in h5py 3.9
+======================
+
+This version of h5py requires Python 3.8 or above.
+
+New features
+------------
+
+* New ``out`` argument to :meth:`~h5py.h5d.DatasetID.read_direct_chunk` to allow passing
+  the output buffer (:pr:`2232`).
+* The objects from :meth:`.Dataset.asstr` and :meth:`.Dataset.astype` now
+  implement the ``__array__()`` method (:pr:`2269`).
+  This speeds up access for functions that support it, such as ``np.asarray()``.
+* Validate key types when creating groups and attributes, giving better error
+  messages when invalid types are used (:pr:`2266`).
+
+Deprecations & removals
+-----------------------
+
+* Using :meth:`.Dataset.astype` as a context manager has been removed, after
+  being deprecated in h5py 3.6. Read data by slicing the returned object instead:
+  ``dset.astype('f4')[:]``.
+
+Exposing HDF5 functions
+-----------------------
+
+* ``H5Pget_elink_acc_flags`` & ``H5Pset_elink_acc_flags`` as
+  :meth:`h5py.h5p.PropLAID.get_elink_acc_flags` & :meth:`h5py.h5p.PropLAID.set_elink_acc_flags`:
+  access the external link file access traversal flags in a link access property
+  list (:pr:`2244`).
+* ``H5Zregister`` as :func:`h5py.h5z.register_filter`: register an HDF5 filter
+  (:pr:`2229`).
+
+Bug fixes
+---------
+
+* ``Group.__contains__`` and ``Group.get`` now use the default link access
+  property list systematically (:pr:`2244`).
+* Removed various calls to the deprecated ``numpy.product`` function (:pr:`2242`
+  & :pr:`2273`).
+* Fix the IPython tab-completion integration in IPython 8.12 (:pr:`2256`).
+* Replacing attributes with :meth:`.AttributeManager.create` now deletes the old
+  attributes before creating the new one, rather than using a temporary name
+  and renaming the new attribute (:pr:`2274`). This should avoid some confusing
+  bugs affecting attributes. However, failures creating an attribute are less
+  likely to leave an existing attribute of the same name in place. To change an
+  attribute value without changing its shape or dtype, use
+  :meth:`~.AttributeManager.modify` instead.
+
+Building h5py
+-------------
+
+* When building with :ref:`parallel` support, the version of mpi4py used on
+  various Python versions is increased to 3.1.1, fixing building with a newer
+  setuptools (:pr:`2225`).
+* Some fixes towards compatibility with the upcoming Cython 3 (:pr:`2247`).
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/docs/whatsnew/index.rst 
new/h5py-3.9.0/docs/whatsnew/index.rst
--- old/h5py-3.8.0/docs/whatsnew/index.rst      2023-01-23 11:18:46.000000000 +0100
+++ new/h5py-3.9.0/docs/whatsnew/index.rst      2023-06-20 10:56:46.000000000 +0200
@@ -8,6 +8,7 @@
 
 .. toctree::
 
+    3.9
     3.8
     3.7
     3.6
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/h5py/_errors.pyx 
new/h5py-3.9.0/h5py/_errors.pyx
--- old/h5py-3.8.0/h5py/_errors.pyx     2023-01-04 18:26:33.000000000 +0100
+++ new/h5py-3.9.0/h5py/_errors.pyx     2023-04-26 17:08:31.000000000 +0200
@@ -94,7 +94,7 @@
     H5E_error_t err
     int n
 
-cdef herr_t walk_cb(unsigned int n, const H5E_error_t *desc, void *e) nogil:
+cdef herr_t walk_cb(unsigned int n, const H5E_error_t *desc, void *e) nogil noexcept:
 
     cdef err_data_t *ee = <err_data_t*>e
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/h5py/_hl/attrs.py 
new/h5py-3.9.0/h5py/_hl/attrs.py
--- old/h5py-3.8.0/h5py/_hl/attrs.py    2023-01-04 18:26:33.000000000 +0100
+++ new/h5py-3.9.0/h5py/_hl/attrs.py    2023-06-16 12:09:33.000000000 +0200
@@ -122,6 +122,7 @@
             Data type of the attribute.  Overrides data.dtype if both
             are given.
         """
+        name = self._e(name)
 
         with phil:
             # First, make sure we have a NumPy array.  We leave the data type
@@ -166,7 +167,7 @@
             # is compatible, and reshape if needed.
             else:
 
-                if shape is not None and numpy.product(shape, dtype=numpy.ulonglong) != numpy.product(data.shape, dtype=numpy.ulonglong):
+                if shape is not None and product(shape) != product(data.shape):
+                    raise ValueError("Shape of new attribute conflicts with shape of data")
 
                 if shape != data.shape:
@@ -189,35 +190,24 @@
             else:
                 space = h5s.create_simple(shape)
 
-            # This mess exists because you can't overwrite attributes in HDF5.
-            # So we write to a temporary attribute first, and then rename.
-            # see issue 1385
-            # if track_order is enabled new attributes (which exceed the
-            # max_compact range, 8 is default) cannot be created as temporary
-            # attributes with subsequent rename, doing that would trigger
-            # the error discussed in the above issue
-            attr_exists = False
-            if h5a.exists(self._id, self._e(name)):
-                attr_exists = True
-                tempname = uuid.uuid4().hex
-            else:
-                tempname = name
+            # For a long time, h5py would create attributes with a random name
+            # and then rename them, imitating how you can atomically replace
+            # a file in a filesystem. But HDF5 does not offer atomic replacement
+            # (you have to delete the existing attribute first), and renaming
+            # exposes some bugs - see https://github.com/h5py/h5py/issues/1385
+            # So we've gone back to the simpler delete & recreate model.
+            if h5a.exists(self._id, name):
+                h5a.delete(self._id, name)
 
-            attr = h5a.create(self._id, self._e(tempname), htype, space)
+            attr = h5a.create(self._id, name, htype, space)
             try:
                 if not isinstance(data, Empty):
                     attr.write(data, mtype=htype2)
-                if attr_exists:
-                    # Rename temp attribute to proper name
-                    # No atomic rename in HDF5 :(
-                    h5a.delete(self._id, self._e(name))
-                    h5a.rename(self._id, self._e(tempname), self._e(name))
             except:
                 attr.close()
-                h5a.delete(self._id, self._e(tempname))
+                h5a.delete(self._id, name)
                 raise
-            finally:
-                attr.close()
+            attr.close()
 
     def modify(self, name, value):
         """ Change the value of an attribute while preserving its type.
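The attrs.py hunk above switches `AttributeManager.create()` from a temp-name-and-rename dance to plain delete-then-recreate. A dict-backed sketch of the new control flow (a hypothetical mock, not HDF5; the comments name the h5a calls it stands in for):

```python
# Hypothetical dict-backed sketch of the delete-and-recreate flow: HDF5 has
# no atomic attribute replace, so the existing attribute is deleted first
# and the new one is created under its real name.
def create_attr(attrs: dict, name: str, value) -> None:
    if name in attrs:          # h5a.exists(...)
        del attrs[name]        # h5a.delete(...)
    try:
        attrs[name] = value    # h5a.create(...) + attr.write(...)
    except Exception:
        attrs.pop(name, None)  # roll back a half-created attribute
        raise

store = {"a": 1}
create_attr(store, "a", [1, 2, 3])   # replaces the old value of "a"
```

As the changelog warns, a failure mid-create can now leave no attribute at all where one existed before; `modify()` remains the way to change a value in place.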
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/h5py/_hl/base.py 
new/h5py-3.9.0/h5py/_hl/base.py
--- old/h5py-3.8.0/h5py/_hl/base.py     2022-12-21 17:13:47.000000000 +0100
+++ new/h5py-3.9.0/h5py/_hl/base.py     2023-06-07 13:39:16.000000000 +0200
@@ -195,13 +195,15 @@
 
         if isinstance(name, bytes):
             coding = h5t.CSET_ASCII
-        else:
+        elif isinstance(name, str):
             try:
                 name = name.encode('ascii')
                 coding = h5t.CSET_ASCII
             except UnicodeEncodeError:
                 name = name.encode('utf8')
                 coding = h5t.CSET_UTF8
+        else:
+            raise TypeError(f"A name should be string or bytes, not {type(name)}")
 
         if lcpl:
             return name, get_lcpl(coding)
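The base.py hunk above is the key-type validation from PR 2266. A self-contained sketch of the same branch logic (a simplified hypothetical `encode_name`, without the `lcpl` plumbing of h5py's real `_e`):

```python
def encode_name(name):
    """Sketch of the name validation added in h5py 3.9: bytes pass through
    as ASCII, str is encoded (ASCII if possible, else UTF-8), and any other
    type now raises TypeError up front instead of failing deeper in HDF5."""
    if isinstance(name, bytes):
        return name, "ascii"
    elif isinstance(name, str):
        try:
            return name.encode("ascii"), "ascii"
        except UnicodeEncodeError:
            return name.encode("utf8"), "utf8"
    else:
        raise TypeError(f"A name should be string or bytes, not {type(name)}")
```

This is what produces the better error messages when, say, an integer is used as a group or attribute key.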
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/h5py/_hl/dataset.py 
new/h5py-3.9.0/h5py/_hl/dataset.py
--- old/h5py-3.8.0/h5py/_hl/dataset.py  2023-01-04 18:26:33.000000000 +0100
+++ new/h5py-3.9.0/h5py/_hl/dataset.py  2023-06-16 12:09:33.000000000 +0200
@@ -13,15 +13,14 @@
 
 import posixpath as pp
 import sys
-from warnings import warn
-
-from threading import local
 
 import numpy
 
 from .. import h5, h5s, h5t, h5r, h5d, h5p, h5fd, h5ds, _selector
-from ..h5py_warnings import H5pyDeprecationWarning
-from .base import HLObject, phil, with_phil, Empty, cached_property, find_item_type
+from .base import (
+    array_for_new_object, cached_property, Empty, find_item_type, HLObject,
+    phil, product, with_phil,
+)
 from . import filters
 from . import selections as sel
 from . import selections2 as sel2
@@ -44,8 +43,7 @@
 
     # Convert data to a C-contiguous ndarray
     if data is not None and not isinstance(data, Empty):
-        from . import base
-        data = base.array_for_new_object(data, specified_dtype=dtype)
+        data = array_for_new_object(data, specified_dtype=dtype)
 
     # Validate shape
     if shape is None:
@@ -56,7 +54,7 @@
         shape = data.shape
     else:
         shape = (shape,) if isinstance(shape, int) else tuple(shape)
-        if data is not None and (numpy.product(shape, dtype=numpy.ulonglong) != numpy.product(data.shape, dtype=numpy.ulonglong)):
+        if data is not None and (product(shape) != product(data.shape)):
             raise ValueError("Shape tuple is incompatible with data")
 
     if isinstance(maxshape, int):
@@ -208,20 +206,6 @@
     def __getitem__(self, args):
         return self._dset.__getitem__(args, new_dtype=self._dtype)
 
-    def __enter__(self):
-        # pylint: disable=protected-access
-        warn(
-            "Using astype() as a context manager is deprecated. "
-            "Slice the returned object instead, like: ds.astype(np.int32)[:10]",
-            category=H5pyDeprecationWarning, stacklevel=2,
-        )
-        self._dset._local.astype = self._dtype
-        return self
-
-    def __exit__(self, *args):
-        # pylint: disable=protected-access
-        self._dset._local.astype = None
-
     def __len__(self):
         """ Get the length of the underlying dataset
 
@@ -229,6 +213,12 @@
         """
         return len(self._dset)
 
+    def __array__(self, dtype=None):
+        data = self[:]
+        if dtype is not None:
+            data = data.astype(dtype)
+        return data
+
 
 class AsStrWrapper:
     """Wrapper to decode strings on reading the dataset"""
@@ -261,6 +251,11 @@
         """
         return len(self._dset)
 
+    def __array__(self):
+        return numpy.array([
+            b.decode(self.encoding, self.errors) for b in self._dset
+        ], dtype=object).reshape(self._dset.shape)
+
 
 class FieldsWrapper:
     """Wrapper to extract named fields from a dataset with a struct dtype"""
@@ -494,7 +489,7 @@
         if self._is_empty:
             size = None
         else:
-            size = numpy.prod(self.shape, dtype=numpy.intp)
+            size = product(self.shape)
 
         # If the file is read-only, cache the size to speed-up future uses.
         # This cache is invalidated by .refresh() when using SWMR.
@@ -651,8 +646,6 @@
         self._filters = filters.get_filters(self._dcpl)
         self._readonly = readonly
         self._cache_props = {}
-        self._local = local()
-        self._local.astype = None
 
     def resize(self, size, axis=None):
         """ Resize the dataset, or the specified axis.
@@ -760,9 +753,6 @@
         """
         args = args if isinstance(args, tuple) else (args,)
 
-        if new_dtype is None:
-            new_dtype = getattr(self._local, 'astype', None)
-
         if self._fast_read_ok and (new_dtype is None):
             try:
                 return self._fast_reader.read(args)
@@ -885,7 +875,8 @@
                 if val.ndim > 1:
                     tmp = numpy.empty(shape=val.shape[:-1], dtype=object)
                     tmp.ravel()[:] = [i for i in val.reshape(
-                        (numpy.product(val.shape[:-1], dtype=numpy.ulonglong), val.shape[-1]))]
+                        (product(val.shape[:-1]), val.shape[-1])
+                    )]
                 else:
                     tmp = numpy.array([None], dtype=object)
                     tmp[0] = val
@@ -994,8 +985,7 @@
         if mshape == () and selection.array_shape != ():
             if self.dtype.subdtype is not None:
                 raise TypeError("Scalar broadcasting is not supported for array dtypes")
-            if self.chunks and (numpy.prod(self.chunks, dtype=numpy.float64) >=
-                                numpy.prod(selection.array_shape, dtype=numpy.float64)):
+            if self.chunks and (product(self.chunks) >= product(selection.array_shape)):
                 val2 = numpy.empty(selection.array_shape, dtype=val.dtype)
             else:
                 val2 = numpy.empty(selection.array_shape[-1], dtype=val.dtype)
@@ -1067,7 +1057,7 @@
         arr = numpy.zeros(self.shape, dtype=self.dtype if dtype is None else dtype)
 
         # Special case for (0,)*-shape datasets
-        if numpy.product(self.shape, dtype=numpy.ulonglong) == 0:
+        if self.size == 0:
             return arr
 
         self.read_direct(arr)
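The `AsStrWrapper.__array__` addition in the dataset.py hunk above decodes every bytes element and rebuilds the dataset's shape. A standalone sketch of that decode step operating on a plain numpy array instead of an h5py dataset (`decode_bytes_dataset` is a hypothetical name):

```python
import numpy as np

def decode_bytes_dataset(raw, encoding="utf-8", errors="strict"):
    """Sketch of what AsStrWrapper.__array__ does: decode each bytes
    element and reshape the result back to the source shape, keeping
    dtype=object since decoded strings have variable length."""
    return np.array(
        [b.decode(encoding, errors) for b in raw.ravel()], dtype=object
    ).reshape(raw.shape)

raw = np.array([b"spam", b"eggs"], dtype=object)
decoded = decode_bytes_dataset(raw)
```

With this in place, `np.asarray(dset.asstr())` decodes the whole dataset in one call instead of per-element on demand.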
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/h5py/_hl/filters.py 
new/h5py-3.9.0/h5py/_hl/filters.py
--- old/h5py-3.8.0/h5py/_hl/filters.py  2022-12-21 17:13:47.000000000 +0100
+++ new/h5py-3.9.0/h5py/_hl/filters.py  2023-04-26 17:08:31.000000000 +0200
@@ -41,6 +41,7 @@
 import operator
 
 import numpy as np
+from .base import product
 from .compat import filename_encode
 from .. import h5z, h5p, h5d, h5f
 
@@ -358,7 +359,7 @@
 
     # Determine the optimal chunk size in bytes using a PyTables expression.
     # This is kept as a float.
-    dset_size = np.product(chunks)*typesize
+    dset_size = product(chunks)*typesize
     target_size = CHUNK_BASE * (2**np.log10(dset_size/(1024.*1024)))
 
     if target_size > CHUNK_MAX:
@@ -373,14 +374,14 @@
         # 1b. We're within 50% of the target chunk size, AND
         #  2. The chunk is smaller than the maximum chunk size
 
-        chunk_bytes = np.product(chunks)*typesize
+        chunk_bytes = product(chunks)*typesize
 
         if (chunk_bytes < target_size or \
          abs(chunk_bytes-target_size)/target_size < 0.5) and \
          chunk_bytes < CHUNK_MAX:
             break
 
-        if np.product(chunks) == 1:
+        if product(chunks) == 1:
             break  # Element size larger than CHUNK_MAX
 
         chunks[idx%ndims] = np.ceil(chunks[idx%ndims] / 2.0)
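The filters.py hunk above replaces the deprecated `numpy.product` with a small `product` helper (defined in `h5py._hl.base`). A plausible stdlib-only sketch of that helper and how the chunk-size heuristic uses it:

```python
from math import prod  # stdlib; exact integer product, no numpy needed

# Sketch of the product() helper h5py 3.9 uses instead of the deprecated
# numpy.product (the real one lives in h5py._hl.base).
def product(shape):
    """Multiply the entries of a shape tuple; 1 for the empty (scalar) shape."""
    return prod(shape)

# As in guess_chunk() above: bytes per chunk = element count * element size.
typesize = 8            # e.g. float64
chunks = (128, 64)
dset_size = product(chunks) * typesize
```

Unlike `numpy.product` with `dtype=numpy.ulonglong`, `math.prod` on Python ints cannot overflow, which is one reason the swap is safe here.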
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/h5py/_hl/group.py 
new/h5py-3.9.0/h5py/_hl/group.py
--- old/h5py-3.8.0/h5py/_hl/group.py    2023-01-04 18:26:33.000000000 +0100
+++ new/h5py-3.9.0/h5py/_hl/group.py    2023-04-26 17:08:31.000000000 +0200
@@ -405,7 +405,7 @@
                 return default
 
             elif getclass and not getlink:
-                typecode = h5o.get_info(self.id, self._e(name)).type
+                typecode = h5o.get_info(self.id, self._e(name), lapl=self._lapl).type
 
                 try:
                     return {h5o.TYPE_GROUP: Group,
@@ -415,18 +415,18 @@
                     raise TypeError("Unknown object type")
 
             elif getlink:
-                typecode = self.id.links.get_info(self._e(name)).type
+                typecode = self.id.links.get_info(self._e(name), lapl=self._lapl).type
 
                 if typecode == h5l.TYPE_SOFT:
                     if getclass:
                         return SoftLink
-                    linkbytes = self.id.links.get_val(self._e(name))
+                    linkbytes = self.id.links.get_val(self._e(name), lapl=self._lapl)
                     return SoftLink(self._d(linkbytes))
 
                 elif typecode == h5l.TYPE_EXTERNAL:
                     if getclass:
                         return ExternalLink
-                    filebytes, linkbytes = self.id.links.get_val(self._e(name))
+                    filebytes, linkbytes = self.id.links.get_val(self._e(name), lapl=self._lapl)
                     return ExternalLink(
                         filename_decode(filebytes), self._d(linkbytes)
                     )
@@ -508,6 +508,10 @@
     @with_phil
     def __contains__(self, name):
         """ Test if a member name exists """
+        if hasattr(h5g, "_path_valid"):
+            if not self.id:
+                return False
+            return h5g._path_valid(self.id, self._e(name), self._lapl)
         return self._e(name) in self.id
 
     def copy(self, source, dest, name=None,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/h5py/_hl/selections.py 
new/h5py-3.9.0/h5py/_hl/selections.py
--- old/h5py-3.8.0/h5py/_hl/selections.py       2022-12-21 17:13:47.000000000 +0100
+++ new/h5py-3.9.0/h5py/_hl/selections.py       2023-04-26 17:08:31.000000000 +0200
@@ -427,7 +427,7 @@
 
     shape = tuple(get_n_axis(sid, x) for x in range(rank))
 
-    if np.product(shape) != N:
+    if product(shape) != N:
         # This means multiple hyperslab selections are in effect,
         # so we fall back to a 1D shape
         return (N,)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/h5py/api_functions.txt 
new/h5py-3.9.0/h5py/api_functions.txt
--- old/h5py-3.8.0/h5py/api_functions.txt       2023-01-10 18:06:37.000000000 +0100
+++ new/h5py-3.9.0/h5py/api_functions.txt       2023-04-26 17:08:31.000000000 +0200
@@ -391,6 +391,8 @@
   ssize_t   H5Pget_elink_prefix(hid_t plist_id, char *prefix, size_t size)
   hid_t     H5Pget_elink_fapl(hid_t lapl_id)
   herr_t    H5Pset_elink_fapl(hid_t lapl_id, hid_t fapl_id)
+  herr_t    H5Pget_elink_acc_flags(hid_t lapl_id, unsigned int *flags)
+  herr_t    H5Pset_elink_acc_flags(hid_t lapl_id, unsigned int flags)
 
  herr_t    H5Pset_create_intermediate_group(hid_t plist_id, unsigned crt_intmd)
  herr_t    H5Pget_create_intermediate_group(hid_t plist_id, unsigned *crt_intmd)
@@ -586,6 +588,7 @@
 
   htri_t    H5Zfilter_avail(H5Z_filter_t id_)
  herr_t    H5Zget_filter_info(H5Z_filter_t filter_, unsigned int *filter_config_flags)
+  herr_t    H5Zregister(const void *cls)
   herr_t    H5Zunregister(H5Z_filter_t id_)
 
 hdf5_hl:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/h5py/api_types_hdf5.pxd 
new/h5py-3.9.0/h5py/api_types_hdf5.pxd
--- old/h5py-3.8.0/h5py/api_types_hdf5.pxd      2023-01-10 18:06:37.000000000 +0100
+++ new/h5py-3.9.0/h5py/api_types_hdf5.pxd      2023-04-26 17:08:31.000000000 +0200
@@ -776,6 +776,7 @@
 
   ctypedef int H5Z_filter_t
 
+  int H5Z_CLASS_T_VERS
   int H5Z_FILTER_ERROR
   int H5Z_FILTER_NONE
   int H5Z_FILTER_ALL
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/h5py-3.8.0/h5py/h5d.pyx new/h5py-3.9.0/h5py/h5d.pyx
--- old/h5py-3.8.0/h5py/h5d.pyx 2023-01-10 18:06:37.000000000 +0100
+++ new/h5py-3.9.0/h5py/h5d.pyx 2023-04-26 17:08:31.000000000 +0200
@@ -16,6 +16,7 @@
 # Compile-time imports
 
 from collections import namedtuple
+cimport cython
 from ._objects cimport pdefault
 from numpy cimport ndarray, import_array, PyArray_DATA
 from .utils cimport  check_numpy_read, check_numpy_write, \
@@ -26,9 +27,11 @@
 from ._proxy cimport dset_rw
 
 from ._objects import phil, with_phil
-from cpython cimport PyObject_GetBuffer, \
-                     PyBUF_ANY_CONTIGUOUS, \
-                     PyBuffer_Release
+from cpython cimport PyBUF_ANY_CONTIGUOUS, \
+                     PyBuffer_Release, \
+                     PyBytes_AsString, \
+                     PyBytes_FromStringAndSize, \
+                     PyObject_GetBuffer
 
 
 # Initialization
@@ -516,16 +519,18 @@
 
     IF HDF5_VERSION >= (1, 10, 2):
 
-        def read_direct_chunk(self, offsets, PropID dxpl=None):
-            """ (offsets, PropID dxpl=None)
+        @cython.boundscheck(False)
+        @cython.wraparound(False)
+        def read_direct_chunk(self, offsets, PropID dxpl=None, unsigned char[::1] out=None):
+            """ (offsets, PropID dxpl=None, out=None)
 
             Reads data to a bytes array directly from a chunk at position
             specified by the `offsets` argument and bypasses any filters HDF5
             would normally apply to the written data. However, the written data
             may be compressed or not.
 
-            Returns a tuple containing the `filter_mask` and the bytes data
-            which are the raw data storing this chuck.
+            Returns a tuple containing the `filter_mask` and the raw data
+            storing this chunk as bytes if `out` is None, else as a memoryview.
 
             `filter_mask` is a bit field of up to 32 values. It records which
             filters have been applied to this chunk, of the filter pipeline
@@ -534,47 +539,59 @@
             compute the raw data. So the default value of `0` means that all
             defined filters have been applied to the raw data.
 
+            If the `out` argument is not None, it must be a writeable
+            contiguous 1D array-like of bytes (e.g., `bytearray` or
+            `numpy.ndarray`) and large enough to contain the whole chunk.
+
             Feature requires: 1.10.2 HDF5
             """
-
             cdef hid_t dset_id
             cdef hid_t dxpl_id
-            cdef hid_t space_id = 0
+            cdef hid_t space_id
             cdef hsize_t *offset = NULL
-            cdef size_t data_size
             cdef int rank
             cdef uint32_t filters
-            cdef hsize_t read_chunk_nbytes
-            cdef char *data = NULL
-            cdef bytes ret
+            cdef hsize_t chunk_bytes, out_bytes
+            cdef int nb_offsets = len(offsets)
+            cdef void * chunk_buffer
 
             dset_id = self.id
             dxpl_id = pdefault(dxpl)
-            space_id = H5Dget_space(self.id)
+            space_id = H5Dget_space(dset_id)
             rank = H5Sget_simple_extent_ndims(space_id)
+            H5Sclose(space_id)
 
-            if len(offsets) != rank:
-                raise TypeError("offset length (%d) must match dataset rank (%d)" % (len(offsets), rank))
+            if nb_offsets != rank:
+                raise TypeError(
+                    f"offsets length ({nb_offsets}) must match dataset rank ({rank})"
+                )
 
+            offset = <hsize_t*>emalloc(sizeof(hsize_t)*rank)
             try:
-                offset = <hsize_t*>emalloc(sizeof(hsize_t)*rank)
                 convert_tuple(offsets, offset, rank)
-                H5Dget_chunk_storage_size(dset_id, offset, &read_chunk_nbytes)
-                data = <char *>emalloc(read_chunk_nbytes)
+                H5Dget_chunk_storage_size(dset_id, offset, &chunk_bytes)
+
+                if out is None:
+                    retval = PyBytes_FromStringAndSize(NULL, chunk_bytes)
+                    chunk_buffer = PyBytes_AsString(retval)
+                else:
+                    out_bytes = out.shape[0]  # Fast way to get out length
+                    if out_bytes < chunk_bytes:
+                        raise ValueError(
+                            f"out buffer is only {out_bytes} bytes, {chunk_bytes} bytes required"
+                        )
+                    retval = memoryview(out[:chunk_bytes])
+                    chunk_buffer = &out[0]
 
                 IF HDF5_VERSION >= (1, 10, 3):
-                    H5Dread_chunk(dset_id, dxpl_id, offset, &filters, data)
+                    H5Dread_chunk(dset_id, dxpl_id, offset, &filters, chunk_buffer)
                 ELSE:
-                    H5DOread_chunk(dset_id, dxpl_id, offset, &filters, data)
-                ret = data[:read_chunk_nbytes]
+                    H5DOread_chunk(dset_id, dxpl_id, offset, &filters, chunk_buffer)
             finally:
                 efree(offset)
-                if data:
-                    efree(data)
-                if space_id:
-                    H5Sclose(space_id)
 
-            return filters, ret
+            return filters, retval
+
 
     IF HDF5_VERSION >= (1, 10, 5):
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py/h5p.pyx new/h5py-3.9.0/h5py/h5p.pyx
--- old/h5py-3.8.0/h5py/h5p.pyx 2023-01-04 18:26:33.000000000 +0100
+++ new/h5py-3.9.0/h5py/h5p.pyx 2023-04-26 17:08:31.000000000 +0200
@@ -1644,6 +1644,27 @@
             H5Idec_ref(fid)
         return propwrap(fid)
 
+
+    @with_phil
+    def set_elink_acc_flags(self, unsigned int flags):
+        """ (UINT flags)
+
+        Sets the external link traversal file access flag in a link access property list.
+        """
+        H5Pset_elink_acc_flags(self.id, flags)
+
+
+    @with_phil
+    def get_elink_acc_flags(self):
+        """() => UINT
+
+        Retrieves the external link traversal file access flag from the specified link access property list.
+        """
+        cdef unsigned int flags
+        H5Pget_elink_acc_flags(self.id, &flags)
+        return flags
+
+
 # Datatype creation
 cdef class PropTCID(PropOCID):
     """ Datatype creation property list
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py/h5z.pyx new/h5py-3.9.0/h5py/h5z.pyx
--- old/h5py-3.8.0/h5py/h5z.pyx 2021-05-09 13:55:36.000000000 +0200
+++ new/h5py-3.9.0/h5py/h5z.pyx 2023-04-26 17:08:31.000000000 +0200
@@ -16,6 +16,8 @@
 
 # === Public constants and data structures ====================================
 
+CLASS_T_VERS = H5Z_CLASS_T_VERS
+
 FILTER_LZF = H5PY_FILTER_LZF
 
 FILTER_ERROR    = H5Z_FILTER_ERROR
@@ -97,6 +99,27 @@
 
 
 @with_phil
+def register_filter(Py_ssize_t cls_pointer_address):
+    '''(INT cls_pointer_address) => BOOL
+
+    Register a new filter from the memory address of a buffer containing a
+    ``H5Z_class1_t`` or ``H5Z_class2_t`` data structure describing the filter.
+
+    `cls_pointer_address` can be retrieved from an HDF5 filter plugin dynamic
+    library::
+
+        import ctypes
+
+        filter_clib = ctypes.CDLL("/path/to/my_hdf5_filter_plugin.so")
+        filter_clib.H5PLget_plugin_info.restype = ctypes.c_void_p
+
+        h5py.h5z.register_filter(filter_clib.H5PLget_plugin_info())
+
+    '''
+    return <int>H5Zregister(<const void *>cls_pointer_address) >= 0
+
+
+@with_phil
 def unregister_filter(int filter_code):
     '''(INT filter_code) => BOOL
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py/ipy_completer.py new/h5py-3.9.0/h5py/ipy_completer.py
--- old/h5py-3.8.0/h5py/ipy_completer.py        2022-12-21 17:13:47.000000000 +0100
+++ new/h5py-3.9.0/h5py/ipy_completer.py        2023-05-16 18:24:09.000000000 +0200
@@ -128,7 +128,12 @@
     """ Completer function to be loaded into IPython """
     base = re_object_match.split(event.line)[1]
 
-    if not isinstance(self._ofind(base).get('obj'), (AttributeManager, HLObject)):
+    try:
+        obj = self._ofind(base).obj
+    except AttributeError:
+        obj = self._ofind(base).get('obj')
+
+    if not isinstance(obj, (AttributeManager, HLObject)):
         raise TryNext
 
     try:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py/tests/test_attribute_create.py new/h5py-3.9.0/h5py/tests/test_attribute_create.py
--- old/h5py-3.8.0/h5py/tests/test_attribute_create.py  2021-05-09 13:55:36.000000000 +0200
+++ new/h5py-3.9.0/h5py/tests/test_attribute_create.py  2023-06-07 13:39:16.000000000 +0200
@@ -89,3 +89,7 @@
         dt = np.dtype('()i')
         with self.assertRaises(ValueError):
             self.f.attrs.create('x', data=array, shape=(5,), dtype=dt)
+
+    def test_key_type(self):
+        with self.assertRaises(TypeError):
+            self.f.attrs.create(1, data=('a', 'b'))
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py/tests/test_dataset.py new/h5py-3.9.0/h5py/tests/test_dataset.py
--- old/h5py-3.8.0/h5py/tests/test_dataset.py   2023-01-10 18:06:37.000000000 +0100
+++ new/h5py-3.9.0/h5py/tests/test_dataset.py   2023-06-07 13:39:16.000000000 +0200
@@ -27,7 +27,7 @@
 from .common import ut, TestCase
 from .data_files import get_data_file_path
 from h5py import File, Group, Dataset
-from h5py._hl.base import is_empty_dataspace
+from h5py._hl.base import is_empty_dataspace, product
 from h5py import h5f, h5t
 from h5py.h5py_warnings import H5pyDeprecationWarning
 from h5py import version
@@ -231,7 +231,7 @@
             ((5, 7, 9), (6,), np.s_[2, :6, 3], np.s_[:]),
         ])
     def test_read_direct(self, writable_file, source_shape, dest_shape, source_sel, dest_sel):
-        source_values = np.arange(np.product(source_shape), dtype="int64").reshape(source_shape)
+        source_values = np.arange(product(source_shape), dtype="int64").reshape(source_shape)
         dset = writable_file.create_dataset("dset", source_shape, data=source_values)
         arr = np.full(dest_shape, -1, dtype="int64")
         expected = arr.copy()
@@ -280,7 +280,7 @@
         ])
     def test_write_direct(self, writable_file, source_shape, dest_shape, source_sel, dest_sel):
         dset = writable_file.create_dataset('dset', dest_shape, dtype='int32', fillvalue=-1)
-        arr = np.arange(np.product(source_shape)).reshape(source_shape)
+        arr = np.arange(product(source_shape)).reshape(source_shape)
         expected = np.full(dest_shape, -1, dtype='int32')
         expected[dest_sel] = arr[source_sel]
         dset.write_direct(arr, source_sel, dest_sel)
@@ -1257,12 +1257,15 @@
         # len of ds
         self.assertEqual(10, len(ds.asstr()))
 
-
         # Array output
         np.testing.assert_array_equal(
             ds.asstr()[:1], np.array([data], dtype=object)
         )
 
+        np.testing.assert_array_equal(
+            np.asarray(ds.asstr())[:1], np.array([data], dtype=object)
+        )
+
     def test_asstr_fixed(self):
         dt = h5py.string_dtype(length=5)
         ds = self.f.create_dataset('x', (10,), dtype=dt)
@@ -1569,21 +1572,6 @@
 class TestAstype(BaseDataset):
     """.astype() wrapper & context manager
     """
-    def test_astype_ctx(self):
-        dset = self.f.create_dataset('x', (100,), dtype='i2')
-        dset[...] = np.arange(100)
-
-        with warnings.catch_warnings(record=True) as warn_rec:
-            warnings.simplefilter("always")
-
-            with dset.astype('f8'):
-                self.assertArrayEqual(dset[...], np.arange(100, dtype='f8'))
-
-            with dset.astype('f4') as f4ds:
-                self.assertArrayEqual(f4ds[...], np.arange(100, dtype='f4'))
-
-        assert [w.category for w in warn_rec] == [H5pyDeprecationWarning] * 2
-
     def test_astype_wrapper(self):
         dset = self.f.create_dataset('x', (100,), dtype='i2')
         dset[...] = np.arange(100)
@@ -1596,6 +1584,12 @@
         dset[...] = np.arange(100)
         self.assertEqual(100, len(dset.astype('f4')))
 
+    def test_astype_wrapper_asarray(self):
+        dset = self.f.create_dataset('x', (100,), dtype='i2')
+        dset[...] = np.arange(100)
+        arr = np.asarray(dset.astype('f4'), dtype='i2')
+        self.assertArrayEqual(arr, np.arange(100, dtype='i2'))
+
 
 class TestScalarCompound(BaseDataset):
 
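The `np.product` → `product` substitution in the hunks above tracks NumPy's deprecation of the `np.product` alias; for code outside h5py, the stdlib `math.prod` (Python 3.8+, matching this release's minimum Python) computes the same element count:

```python
import math

import numpy as np

shape = (5, 7, 9)

# math.prod replaces the deprecated np.product for computing the
# number of elements implied by a shape tuple.
count = math.prod(shape)
assert count == 5 * 7 * 9 == 315

# The count is what np.arange(...).reshape(...) needs to round-trip a shape.
assert np.arange(count).reshape(shape).shape == shape
```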
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py/tests/test_filters.py new/h5py-3.9.0/h5py/tests/test_filters.py
--- old/h5py-3.8.0/h5py/tests/test_filters.py   2022-12-21 17:13:47.000000000 +0100
+++ new/h5py-3.9.0/h5py/tests/test_filters.py   2023-04-26 17:08:31.000000000 +0200
@@ -14,9 +14,8 @@
 import os
 import numpy as np
 import h5py
-import pytest
 
-from .common import ut, TestCase, insubprocess
+from .common import ut, TestCase
 
 
 class TestFilters(TestCase):
@@ -87,14 +86,6 @@
     assert gzip8 != h5py.filters.Gzip(level=7)
 
 
-@pytest.mark.mpi_skip
-@insubprocess
-def test_unregister_filter(request):
-    if h5py.h5z.filter_avail(h5py.h5z.FILTER_LZF):
-        res = h5py.h5z.unregister_filter(h5py.h5z.FILTER_LZF)
-        assert res
-
-
 @ut.skipIf(not os.getenv('H5PY_TEST_CHECK_FILTERS'),  "H5PY_TEST_CHECK_FILTERS not set")
 def test_filters_available():
     assert 'gzip' in h5py.filters.decode
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py/tests/test_group.py new/h5py-3.9.0/h5py/tests/test_group.py
--- old/h5py-3.8.0/h5py/tests/test_group.py     2022-12-21 17:13:47.000000000 +0100
+++ new/h5py-3.9.0/h5py/tests/test_group.py     2023-06-07 13:39:16.000000000 +0200
@@ -94,6 +94,11 @@
         self.assertEqual(group.name, name)
         self.assertEqual(group.id.links.get_info(name.encode('utf8')).cset, h5t.CSET_ASCII)
 
+    def test_type(self):
+        """ Names should be strings or bytes """
+        with self.assertRaises(TypeError):
+            self.f.create_group(1.)
+
     def test_appropriate_low_level_id(self):
         " Binding a group to a non-group identifier fails with ValueError "
         dset = self.f.create_dataset('foo', [1])
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py/tests/test_h5d_direct_chunk.py new/h5py-3.9.0/h5py/tests/test_h5d_direct_chunk.py
--- old/h5py-3.8.0/h5py/tests/test_h5d_direct_chunk.py  2022-12-21 17:13:47.000000000 +0100
+++ new/h5py-3.9.0/h5py/tests/test_h5d_direct_chunk.py  2023-04-26 17:08:31.000000000 +0200
@@ -1,6 +1,7 @@
 import h5py
 import numpy
 import numpy.testing
+import pytest
 
 from .common import ut, TestCase
 
@@ -113,3 +114,81 @@
         with h5py.File(filename, "r") as filehandle:
             dataset = filehandle["created"][...]
             numpy.testing.assert_array_equal(dataset, frame)
+
+
+@pytest.mark.skipif(
+    h5py.version.hdf5_version_tuple < (1, 10, 2),
+    reason="Direct chunk read requires HDF5 >= 1.10.2"
+)
+class TestReadDirectChunkToOut:
+
+    def test_uncompressed_data(self, writable_file):
+        ref_data = numpy.arange(16).reshape(4, 4)
+        dataset = writable_file.create_dataset(
+            "uncompressed", data=ref_data, chunks=ref_data.shape)
+
+        out = bytearray(ref_data.nbytes)
+        filter_mask, chunk = dataset.id.read_direct_chunk((0, 0), out=out)
+
+        assert numpy.array_equal(
+            numpy.frombuffer(out, dtype=ref_data.dtype).reshape(ref_data.shape),
+            ref_data,
+        )
+        assert filter_mask == 0
+        assert len(chunk) == ref_data.nbytes
+
+    @pytest.mark.skipif(
+        h5py.version.hdf5_version_tuple < (1, 10, 5),
+        reason="chunk info requires HDF5 >= 1.10.5",
+    )
+    @pytest.mark.skipif(
+        'gzip' not in h5py.filters.encode,
+        reason="DEFLATE is not installed",
+    )
+    def test_compressed_data(self, writable_file):
+        ref_data = numpy.arange(16).reshape(4, 4)
+        dataset = writable_file.create_dataset(
+            "gzip",
+            data=ref_data,
+            chunks=ref_data.shape,
+            compression="gzip",
+            compression_opts=9,
+        )
+        chunk_info = dataset.id.get_chunk_info(0)
+
+        out = bytearray(chunk_info.size)
+        filter_mask, chunk = dataset.id.read_direct_chunk(
+            chunk_info.chunk_offset,
+            out=out,
+        )
+        assert filter_mask == chunk_info.filter_mask
+        assert len(chunk) == chunk_info.size
+        assert out == dataset.id.read_direct_chunk(chunk_info.chunk_offset)[1]
+
+    def test_fail_buffer_too_small(self, writable_file):
+        ref_data = numpy.arange(16).reshape(4, 4)
+        dataset = writable_file.create_dataset(
+            "uncompressed", data=ref_data, chunks=ref_data.shape)
+
+        out = bytearray(ref_data.nbytes // 2)
+        with pytest.raises(ValueError):
+            dataset.id.read_direct_chunk((0, 0), out=out)
+
+    def test_fail_buffer_readonly(self, writable_file):
+        ref_data = numpy.arange(16).reshape(4, 4)
+        dataset = writable_file.create_dataset(
+            "uncompressed", data=ref_data, chunks=ref_data.shape)
+
+        out = bytes(ref_data.nbytes)
+        with pytest.raises(BufferError):
+            dataset.id.read_direct_chunk((0, 0), out=out)
+
+    def test_fail_buffer_not_contiguous(self, writable_file):
+        ref_data = numpy.arange(16).reshape(4, 4)
+        dataset = writable_file.create_dataset(
+            "uncompressed", data=ref_data, chunks=ref_data.shape)
+
+        array = numpy.empty(ref_data.shape + (2,), dtype=ref_data.dtype)
+        out = array[:, :, ::2]  # Array is not contiguous
+        with pytest.raises(ValueError):
+            dataset.id.read_direct_chunk((0, 0), out=out)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py/tests/test_h5z.py new/h5py-3.9.0/h5py/tests/test_h5z.py
--- old/h5py-3.8.0/h5py/tests/test_h5z.py       1970-01-01 01:00:00.000000000 +0100
+++ new/h5py-3.9.0/h5py/tests/test_h5z.py       2023-04-26 17:08:31.000000000 +0200
@@ -0,0 +1,85 @@
+from ctypes import (
+    addressof,
+    c_char_p,
+    c_int,
+    c_long,
+    c_uint,
+    c_void_p,
+    CFUNCTYPE,
+    POINTER,
+    Structure,
+)
+import pytest
+import h5py
+from h5py import h5z
+
+from .common import insubprocess
+
+
+# Type of filter callback function of H5Z_class2_t
+H5ZFuncT = CFUNCTYPE(
+    c_long,  # restype
+    # argtypes
+    c_uint,  # flags
+    c_long,  # cd_nelemts
+    POINTER(c_uint),  # cd_values
+    c_long,  # nbytes
+    POINTER(c_long),  # buf_size
+    POINTER(c_void_p),  # buf
+)
+
+
+class H5ZClass2T(Structure):
+    """H5Z_class2_t structure defining a filter"""
+
+    _fields_ = [
+        ("version", c_int),
+        ("id_", c_int),
+        ("encoder_present", c_uint),
+        ("decoder_present", c_uint),
+        ("name", c_char_p),
+        ("can_apply", c_void_p),
+        ("set_local", c_void_p),
+        ("filter_", H5ZFuncT),
+    ]
+
+
+def test_register_filter():
+    filter_id = 256  # Test ID
+
+    @H5ZFuncT
+    def failing_filter_callback(flags, cd_nelemts, cd_values, nbytes, buf_size, buf):
+        return 0
+
+    dummy_filter_class = H5ZClass2T(
+        version=h5z.CLASS_T_VERS,
+        id_=filter_id,
+        encoder_present=1,
+        decoder_present=1,
+        name=b"dummy filter",
+        can_apply=None,
+        set_local=None,
+        filter_=failing_filter_callback,
+    )
+
+    h5z.register_filter(addressof(dummy_filter_class))
+
+    try:
+        assert h5z.filter_avail(filter_id)
+        filter_flags = h5z.get_filter_info(filter_id)
+        assert (
+            filter_flags
+            == h5z.FILTER_CONFIG_ENCODE_ENABLED | h5z.FILTER_CONFIG_DECODE_ENABLED
+        )
+    finally:
+        h5z.unregister_filter(filter_id)
+
+    assert not h5z.filter_avail(filter_id)
+
+
+@pytest.mark.mpi_skip
+@insubprocess
+def test_unregister_filter(request):
+    if h5py.h5z.filter_avail(h5py.h5z.FILTER_LZF):
+        res = h5py.h5z.unregister_filter(h5py.h5z.FILTER_LZF)
+        assert res
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py/version.py new/h5py-3.9.0/h5py/version.py
--- old/h5py-3.8.0/h5py/version.py      2023-01-23 11:18:46.000000000 +0100
+++ new/h5py-3.9.0/h5py/version.py      2023-06-20 10:56:46.000000000 +0200
@@ -23,7 +23,7 @@
 
 hdf5_built_version_tuple = _h5.HDF5_VERSION_COMPILED_AGAINST
 
-version_tuple = _H5PY_VERSION_CLS(3, 8, 0, None, None, None)
+version_tuple = _H5PY_VERSION_CLS(3, 9, 0, None, None, None)
 
 version = "{0.major:d}.{0.minor:d}.{0.bugfix:d}".format(version_tuple)
 if version_tuple.pre is not None:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py.egg-info/PKG-INFO new/h5py-3.9.0/h5py.egg-info/PKG-INFO
--- old/h5py-3.8.0/h5py.egg-info/PKG-INFO       2023-01-23 11:21:35.000000000 +0100
+++ new/h5py-3.9.0/h5py.egg-info/PKG-INFO       2023-06-20 11:06:12.000000000 +0200
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: h5py
-Version: 3.8.0
+Version: 3.9.0
 Summary: Read and write HDF5 files from Python
 Author-email: Andrew Collette <andrew.colle...@gmail.com>
Maintainer-email: Thomas Kluyver <tho...@kluyver.me.uk>, Thomas A Caswell <tcasw...@bnl.gov>
@@ -26,7 +26,7 @@
 Classifier: Topic :: Scientific/Engineering
 Classifier: Topic :: Database
 Classifier: Topic :: Software Development :: Libraries :: Python Modules
-Requires-Python: >=3.7
+Requires-Python: >=3.8
 Description-Content-Type: text/x-rst
 License-File: LICENSE
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py.egg-info/SOURCES.txt new/h5py-3.9.0/h5py.egg-info/SOURCES.txt
--- old/h5py-3.8.0/h5py.egg-info/SOURCES.txt    2023-01-23 11:21:35.000000000 +0100
+++ new/h5py-3.9.0/h5py.egg-info/SOURCES.txt    2023-06-20 11:06:12.000000000 +0200
@@ -1,4 +1,3 @@
-.gitignore
 LICENSE
 MANIFEST.in
 README.rst
@@ -60,6 +59,7 @@
 docs/whatsnew/3.6.rst
 docs/whatsnew/3.7.rst
 docs/whatsnew/3.8.rst
+docs/whatsnew/3.9.rst
 docs/whatsnew/index.rst
 docs_api/Makefile
 docs_api/automod.py
@@ -202,6 +202,7 @@
 h5py/tests/test_h5p.py
 h5py/tests/test_h5pl.py
 h5py/tests/test_h5t.py
+h5py/tests/test_h5z.py
 h5py/tests/test_objects.py
 h5py/tests/test_ros3.py
 h5py/tests/test_selections.py
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/h5py.egg-info/requires.txt new/h5py-3.9.0/h5py.egg-info/requires.txt
--- old/h5py-3.8.0/h5py.egg-info/requires.txt   2023-01-23 11:21:35.000000000 +0100
+++ new/h5py-3.9.0/h5py.egg-info/requires.txt   2023-06-20 11:06:12.000000000 +0200
@@ -1 +1 @@
-numpy>=1.14.5
+numpy>=1.17.3
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/pyproject.toml new/h5py-3.9.0/pyproject.toml
--- old/h5py-3.8.0/pyproject.toml       2023-01-20 17:24:30.000000000 +0100
+++ new/h5py-3.9.0/pyproject.toml       2023-06-07 13:39:16.000000000 +0200
@@ -1,6 +1,6 @@
 [build-system]
 requires = [
-    "Cython ~=0.29",
+    "Cython >=0.29.31,<1",
     "oldest-supported-numpy",
     "pkgconfig",
     "setuptools >=61",
@@ -36,7 +36,7 @@
     "Topic :: Database",
     "Topic :: Software Development :: Libraries :: Python Modules",
 ]
-requires-python = ">=3.7"
+requires-python = ">=3.8"
 dynamic = ["dependencies", "version"]
 
 [project.readme]
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/setup.py new/h5py-3.9.0/setup.py
--- old/h5py-3.8.0/setup.py     2023-01-23 11:18:46.000000000 +0100
+++ new/h5py-3.9.0/setup.py     2023-06-20 10:56:46.000000000 +0200
@@ -20,18 +20,18 @@
 import setup_build, setup_configure
 
 
-VERSION = '3.8.0'
+VERSION = '3.9.0'
 
 
 # these are required to use h5py
 RUN_REQUIRES = [
     # We only really aim to support NumPy & Python combinations for which
-    # there are wheels on PyPI (e.g. NumPy >=1.17.5 for Python 3.8).
+    # there are wheels on PyPI (e.g. NumPy >=1.23.2 for Python 3.11).
     # But we don't want to duplicate the information in oldest-supported-numpy
     # here, and if you can build an older NumPy on a newer Python, h5py probably
     # works (assuming you build it from source too).
-    # NumPy 1.14.5 is the first with wheels for Python 3.7, our minimum Python.
-    "numpy >=1.14.5",
+    # NumPy 1.17.3 is the first with wheels for Python 3.8, our minimum Python.
+    "numpy >=1.17.3",
 ]
 
 # Packages needed to build h5py (in addition to static list in pyproject.toml)
@@ -43,10 +43,10 @@
 SETUP_REQUIRES = []
 
 if setup_configure.mpi_enabled():
-    RUN_REQUIRES.append('mpi4py >=3.0.2')
-    SETUP_REQUIRES.append("mpi4py ==3.0.2; python_version<'3.8'")
-    SETUP_REQUIRES.append("mpi4py ==3.0.3; python_version=='3.8.*'")
-    SETUP_REQUIRES.append("mpi4py ==3.1.0; python_version=='3.9.*' or python_version=='3.10.*'")
+    # mpi4py 3.1.1 fixed a typo in python_requires, which made older versions
+    # incompatible with newer setuptools.
+    RUN_REQUIRES.append('mpi4py >=3.1.1')
+    SETUP_REQUIRES.append("mpi4py ==3.1.1; python_version<'3.11'")
     SETUP_REQUIRES.append("mpi4py ==3.1.4; python_version>='3.11'")
 
 # Set the environment variable H5PY_SETUP_REQUIRES=0 if we need to skip
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/setup_configure.py new/h5py-3.9.0/setup_configure.py
--- old/h5py-3.8.0/setup_configure.py   2023-01-04 18:26:33.000000000 +0100
+++ new/h5py-3.9.0/setup_configure.py   2023-06-07 13:39:16.000000000 +0200
@@ -288,6 +288,7 @@
             lib = ctypes.CDLL(path, **load_kw)
         except Exception:
             print("error: Unable to load dependency HDF5, make sure HDF5 is installed properly")
+            print("Library dirs checked:", libdirs)
             raise
 
         self._lib = lib
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/h5py-3.8.0/tox.ini new/h5py-3.9.0/tox.ini
--- old/h5py-3.8.0/tox.ini      2023-01-20 17:24:30.000000000 +0100
+++ new/h5py-3.9.0/tox.ini      2023-06-07 13:39:16.000000000 +0200
@@ -2,7 +2,7 @@
 # We want an envlist like
 # envlist = {py36,py37,pypy3}-{test}-{deps,mindeps}-{,mpi4py}-{,pre},nightly,docs,checkreadme,pre-commit
 # but we want to skip mpi and pre by default, so this envlist is below
-envlist = {py37,py38,py39,py310,py311,pypy3}-{test}-{deps,mindeps},nightly,docs,apidocs,checkreadme,pre-commit,rever
+envlist = {py38,py39,py310,py311,pypy3}-{test}-{deps,mindeps},nightly,docs,apidocs,checkreadme,pre-commit,rever
 isolated_build = True
 
 [testenv]
@@ -11,7 +11,6 @@
     test: pytest-cov
     test: pytest-mpi>=0.2
 
-    py37-deps: numpy>=1.14.5
     py38-deps: numpy>=1.17.5
     py39-deps: numpy>=1.19.3
     py310-deps: numpy>=1.21.3
@@ -19,7 +18,7 @@
 
     mindeps: oldest-supported-numpy
 
-    mpi4py: mpi4py>=3.0.2
+    mpi4py: mpi4py>=3.1.1
 
     tables-deps: tables>=3.4.4
     tables-mindeps: tables==3.4.4
