Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package python-fsspec for openSUSE:Factory 
checked in at 2022-10-17 14:57:30
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-fsspec (Old)
 and      /work/SRC/openSUSE:Factory/.python-fsspec.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-fsspec"

Mon Oct 17 14:57:30 2022 rev:21 rq:1010953 version:2022.8.2

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-fsspec/python-fsspec.changes      
2022-07-04 11:32:45.140011278 +0200
+++ /work/SRC/openSUSE:Factory/.python-fsspec.new.2275/python-fsspec.changes    
2022-10-17 14:57:33.222075881 +0200
@@ -1,0 +2,52 @@
+Fri Oct 14 11:11:45 UTC 2022 - Ben Greiner <c...@bnavigator.de>
+
+- Don't test with python-s3fs: It is pinning aiobotocore which
+  does not play well with a rolling distro
+  gh#fsspec/s3fs#615, gh#aio-libs/aiobotocore#971
+
+-------------------------------------------------------------------
+Wed Sep 28 19:37:07 UTC 2022 - Yogalakshmi Arunachalam <yarunacha...@suse.com>
+
+- Update to 2022.8.2
+  * don't close OpenFile on del (#1035)
+
+- Update to 2022.8.1
+  * revert #1024 (#1029), with stricter requirements on OpenFile usage
+
+- Update to 2022.8.0
+  Enhancements
+  * writable ZipFileSystem (#1017)
+  * make OpenFile behave like files and remove dynamic closer in .open() 
(#1024)
+  * use isal gunzip (#1008)
+  Fixes
+  * remove strip from _parent (#1022)
+  * disallow aiohttp prereleases (#1018)
+  * be sure to close cached file (#1016)
+  * async rm in reverse order (#1014)
+  * expose fileno in LocalFileOpener (#1010, #1005)
+  * remove temp files with simplecache writing (#1006)
+  * azure paths (#1003)
+  * copy dircache keys before iter
+
+- Update to 2022.7.1
+  Fixes
+  * Remove fspath from LocalFileOpener (#1005)
+  * Revert 988 (#1003)
+
+- Update to 2022.7.0
+  Enhancements
+  * added fsspec-xrootd implementation to registry (#1000)
+  * memory file not to copy bytes (#999)
+  * File details passed to FUSE (#972)
+  Fixes
+  * Return info for root path of archives (#996)
+  * arbitrary kwargs passed through in pipe_file (#993)
+  * special cases for host in URLs for azure (#988)
+  * unstrip protocol criterion (#980)
+  * HTTPFile serialisation (#973)
+  Other
+  * Show erroring path in FileNotFounds (#989)
+  * Reference file info without searching directory tree (#985)
+  * Truncate for local files (#975) 
+
+-------------------------------------------------------------------

Old:
----
  fsspec-2022.5.0.tar.gz

New:
----
  fsspec-2022.8.2.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-fsspec.spec ++++++
--- /var/tmp/diff_new_pack.wN4oFQ/_old  2022-10-17 14:57:33.990077356 +0200
+++ /var/tmp/diff_new_pack.wN4oFQ/_new  2022-10-17 14:57:33.998077371 +0200
@@ -27,13 +27,13 @@
 %endif
 %define         skip_python2 1
 Name:           python-fsspec%{psuffix}
-Version:        2022.5.0
+Version:        2022.8.2
 Release:        0
 Summary:        Filesystem specification package
 License:        BSD-3-Clause
 URL:            https://github.com/fsspec/filesystem_spec
 # the tests are only in the GitHub archive
-Source:         %{url}/archive/%{version}.tar.gz#/fsspec-%{version}.tar.gz
+Source:         https://github.com/fsspec/filesystem_spec/archive/%{version}.tar.gz#/fsspec-%{version}.tar.gz
 BuildRequires:  %{python_module base >= 3.6}
 BuildRequires:  %{python_module importlib_metadata if %python-base < 3.8}
 BuildRequires:  %{python_module setuptools}
@@ -60,7 +60,7 @@
 %if %{with test}
 BuildRequires:  %{python_module aiohttp}
 BuildRequires:  %{python_module cloudpickle}
-BuildRequires:  %{python_module distributed if %python-base < 3.10}
+BuildRequires:  %{python_module distributed}
 BuildRequires:  %{python_module fusepy}
 BuildRequires:  %{python_module gcsfs}
 BuildRequires:  %{python_module notebook}
@@ -72,7 +72,8 @@
 BuildRequires:  %{python_module pytest}
 BuildRequires:  %{python_module python-snappy}
 BuildRequires:  %{python_module requests}
-BuildRequires:  %{python_module s3fs}
+# Too tight of an aiobotocore pinning: gh#fsspec/s3fs#615, gh#aio-libs/aiobotocore#971
+#BuildRequires:  %%{python_module s3fs}
 BuildRequires:  %{python_module smbprotocol}
 BuildRequires:  %{python_module zstandard}
 # cannot test git and http in the same installation (?)

++++++ fsspec-2022.5.0.tar.gz -> fsspec-2022.8.2.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/.github/workflows/main.yaml 
new/filesystem_spec-2022.8.2/.github/workflows/main.yaml
--- old/filesystem_spec-2022.5.0/.github/workflows/main.yaml    2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/.github/workflows/main.yaml    2022-09-01 
03:03:54.000000000 +0200
@@ -13,7 +13,7 @@
     strategy:
       fail-fast: false
       matrix:
-        TOXENV: [py37, py38, py39, s3fs, gcsfs]
+        TOXENV: [py38, py39, py310, s3fs, gcsfs]
 
     env:
       TOXENV: ${{ matrix.TOXENV }}
@@ -42,7 +42,7 @@
     strategy:
       fail-fast: false
       matrix:
-        TOXENV: [py38]
+        TOXENV: [py39]
 
     env:
       TOXENV: ${{ matrix.TOXENV }}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/docs/source/changelog.rst 
new/filesystem_spec-2022.8.2/docs/source/changelog.rst
--- old/filesystem_spec-2022.5.0/docs/source/changelog.rst      2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/docs/source/changelog.rst      2022-09-01 
03:03:54.000000000 +0200
@@ -1,6 +1,69 @@
 Changelog
 =========
 
+2022.8.2
+--------
+
+- don't close OpenFile on del (#1035)
+
+2022.8.1
+--------
+
+- revert #1024 (#1029), with stricter requirements on OpenFile usage
+
+2022.8.0
+--------
+
+Enhancements
+
+- writable ZipFileSystem (#1017)
+- make OpenFile behave like files and remove dynamic closer in .open() (#1024)
+- use isal gunzip (#1008)
+
+Fixes
+
+- remove strip from _parent (#1022)
+- disallow aiohttp prereleases (#1018)
+- be sure to close cached file (#1016)
+- async rm in reverse order (#1014)
+- expose fileno in LocalFileOpener (#1010, #1005)
+- remove temp files with simplecache writing (#1006)
+- azure paths (#1003)
+- copy dircache keys before iter
+
+
+2022.7.1
+--------
+
+Fixes
+
+- Remove fspath from LocalFileOpener (#1005)
+- Revert 988 (#1003)
+
+2022.7.0
+--------
+
+Enhancements
+
+- added fsspec-xrootd implementation to registry (#1000)
+- memory file not to copy bytes (#999)
+- File details passed to FUSE (#972)
+
+Fixes
+
+- Return info for root path of archives (#996)
+- arbitrary kwargs passed through in pipe_file (#993)
+- special cases for host in URLs for azure (#988)
+- unstrip protocol criterion (#980)
+- HTTPFile serialisation (#973)
+
+Other
+
+- Show erroring path in FileNotFounds (#989)
+- Reference file info without searching directory tree (#985)
+- Truncate for local files (#975)
+
+
 2022.5.0
 --------
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/docs/source/developer.rst 
new/filesystem_spec-2022.8.2/docs/source/developer.rst
--- old/filesystem_spec-2022.5.0/docs/source/developer.rst      2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/docs/source/developer.rst      2022-09-01 
03:03:54.000000000 +0200
@@ -24,7 +24,7 @@
 In cases where the caller wants to control the context directly, they can use 
the
 ``open`` method of the ``OpenFile``, or get the filesystem object directly,
 skipping the ``OpenFile`` route. In the latter case, text encoding and 
compression
-or **not** handled for you. The file-like object can also be used as a context
+are **not** handled for you. The file-like object can also be used as a context
 manager, or the ``close()`` method must be called explicitly to release 
resources.
 
 .. code-block:: python
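
The paragraph above distinguishes the OpenFile route (text encoding and
compression handled for you) from going to the filesystem directly (raw bytes
only). A minimal sketch of both, assuming a local gzip-compressed path purely
for illustration:

    import fsspec

    # Route 1: OpenFile.open() keeps the text/compression layers, but the
    # OpenFile itself must stay alive and be closed explicitly.
    of = fsspec.open("data.csv.gz", mode="rt", compression="gzip")
    f = of.open()
    header = f.readline()
    of.close()

    # Route 2: ask the filesystem directly; no compression or text handling,
    # and the file-like is closed via the with-block (or close()).
    fs = fsspec.filesystem("file")
    with fs.open("data.csv.gz", mode="rb") as raw:
        payload = raw.read()
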
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/_version.py 
new/filesystem_spec-2022.8.2/fsspec/_version.py
--- old/filesystem_spec-2022.5.0/fsspec/_version.py     2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/_version.py     2022-09-01 
03:03:54.000000000 +0200
@@ -22,9 +22,9 @@
     # setup.py/versioneer.py will grep for the variable names, so they must
     # each be defined on a line of their own. _version.py will just call
     # get_keywords().
-    git_refnames = " (tag: 2022.5.0)"
-    git_full = "148a6861481f824afb88c7c50955aa6ed4e25d32"
-    git_date = "2022-05-19 14:13:38 -0400"
+    git_refnames = " (tag: 2022.8.2)"
+    git_full = "025d846db553f7498c106326ac005656ebfa3bb7"
+    git_date = "2022-08-31 21:03:54 -0400"
     keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
     return keywords
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/archive.py 
new/filesystem_spec-2022.8.2/fsspec/archive.py
--- old/filesystem_spec-2022.5.0/fsspec/archive.py      2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/archive.py      2022-09-01 
03:03:54.000000000 +0200
@@ -34,6 +34,8 @@
     def info(self, path, **kwargs):
         self._get_dirs()
         path = self._strip_protocol(path)
+        if path in {"", "/"} and self.dir_cache:
+            return {"name": "/", "type": "directory", "size": 0}
         if path in self.dir_cache:
             return self.dir_cache[path]
         elif path + "/" in self.dir_cache:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/asyn.py 
new/filesystem_spec-2022.8.2/fsspec/asyn.py
--- old/filesystem_spec-2022.5.0/fsspec/asyn.py 2022-05-19 20:13:38.000000000 
+0200
+++ new/filesystem_spec-2022.8.2/fsspec/asyn.py 2022-09-01 03:03:54.000000000 
+0200
@@ -16,6 +16,33 @@
 from .utils import is_exception, other_paths
 
 private = re.compile("_[^_]")
+iothread = [None]  # dedicated fsspec IO thread
+loop = [None]  # global event loop for any non-async instance
+_lock = None  # global lock placeholder
+
+
+def get_lock():
+    """Allocate or return a threading lock.
+
+    The lock is allocated on first use to allow setting one lock per forked 
process.
+    """
+    global _lock
+    if not _lock:
+        _lock = threading.Lock()
+    return _lock
+
+
+def reset_lock():
+    """Reset the global lock.
+
+    This should be called only on the init of a forked process to reset the 
lock to
+    None, enabling the new forked process to get a new lock.
+    """
+    global _lock
+
+    iothread[0] = None
+    loop[0] = None
+    _lock = None
 
 
 async def _runner(event, coro, result, timeout=None):
@@ -33,6 +60,9 @@
 def sync(loop, func, *args, timeout=None, **kwargs):
     """
     Make loop run coroutine until it returns. Runs in other thread
+
+    Example usage:
+        fsspec.asyn.sync(fsspec.asyn.get_loop(), func, *args, timeout=timeout, 
**kwargs)
     """
     timeout = timeout if timeout else None  # convert 0 or 0.0 to None
     # NB: if the loop is not running *yet*, it is OK to submit work
@@ -68,11 +98,6 @@
         return return_result
 
 
-iothread = [None]  # dedicated fsspec IO thread
-loop = [None]  # global event loop for any non-async instance
-lock = threading.Lock()  # for setting exactly one thread
-
-
 def sync_wrapper(func, obj=None):
     """Given a function, make so can be called in async or blocking contexts
 
@@ -121,7 +146,7 @@
     The loop will be running on a separate thread.
     """
     if loop[0] is None:
-        with lock:
+        with get_lock():
             # repeat the check just in case the loop got filled between the
             # previous two calls from another thread
             if loop[0] is None:
@@ -308,7 +333,7 @@
         batch_size = batch_size or self.batch_size
         path = await self._expand_path(path, recursive=recursive)
         return await _run_coros_in_chunks(
-            [self._rm_file(p, **kwargs) for p in path],
+            [self._rm_file(p, **kwargs) for p in reversed(path)],
             batch_size=batch_size,
             nofiles=True,
         )
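
The get_lock()/reset_lock() pair above lets a forked process drop the event
loop, IO thread and lock inherited from its parent. A hedged sketch of wiring
that up with the stdlib fork hook (the hook choice is an assumption, not
something the diff prescribes):

    import os

    import fsspec.asyn

    # In the child of a fork, clear the inherited loop/iothread/lock so the
    # next filesystem call allocates fresh ones for this process.
    os.register_at_fork(after_in_child=fsspec.asyn.reset_lock)
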
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/compression.py 
new/filesystem_spec-2022.8.2/fsspec/compression.py
--- old/filesystem_spec-2022.5.0/fsspec/compression.py  2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/compression.py  2022-09-01 
03:03:54.000000000 +0200
@@ -73,12 +73,10 @@
 try:  # pragma: no cover
     from isal import igzip
 
-    # igzip is meant to be used as a faster drop in replacement to gzip
-    # so its api and functions are the same as the stdlib's module. Except
-    # where ISA-L does not support the same calls as zlib
-    # (See https://python-isal.readthedocs.io/).
+    def isal(infile, mode="rb", **kwargs):
+        return igzip.IGzipFile(fileobj=infile, mode=mode, **kwargs)
 
-    register_compression("gzip", igzip.IGzipFile, "gz")
+    register_compression("gzip", isal, "gz")
 except ImportError:
     from gzip import GzipFile
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/core.py 
new/filesystem_spec-2022.8.2/fsspec/core.py
--- old/filesystem_spec-2022.5.0/fsspec/core.py 2022-05-19 20:13:38.000000000 
+0200
+++ new/filesystem_spec-2022.8.2/fsspec/core.py 2022-09-01 03:03:54.000000000 
+0200
@@ -18,7 +18,6 @@
 from .compression import compr
 from .registry import filesystem, get_filesystem_class
 from .utils import (
-    IOWrapper,
     _unstrip_protocol,
     build_name_function,
     infer_compression,
@@ -29,7 +28,7 @@
 logger = logging.getLogger("fsspec")
 
 
-class OpenFile(object):
+class OpenFile:
     """
     File-like object to be used in a context
 
@@ -56,6 +55,10 @@
         How to handle encoding errors if opened in text mode.
     newline: None or str
         Passed to TextIOWrapper in text mode, how to handle line endings.
+    autoopen: bool
+        If True, calls open() immediately. Mostly used by pickle
+    pos: int
+        If given and autoopen is True, seek to this location immediately
     """
 
     def __init__(
@@ -94,10 +97,6 @@
     def __repr__(self):
         return "<OpenFile '{}'>".format(self.path)
 
-    def __fspath__(self):
-        # may raise if cannot be resolved to local file
-        return self.open().__fspath__()
-
     def __enter__(self):
         mode = self.mode.replace("t", "").replace("b", "") + "b"
 
@@ -112,7 +111,7 @@
 
         if "b" not in self.mode:
             # assume, for example, that 'r' is equivalent to 'rt' as in builtin
-            f = io.TextIOWrapper(
+            f = PickleableTextIOWrapper(
                 f, encoding=self.encoding, errors=self.errors, 
newline=self.newline
             )
             self.fobjects.append(f)
@@ -122,10 +121,6 @@
     def __exit__(self, *args):
         self.close()
 
-    def __del__(self):
-        if hasattr(self, "fobjects"):
-            self.fobjects.clear()  # may cause cleanup of objects and close 
files
-
     @property
     def full_name(self):
         return _unstrip_protocol(self.path, self.fs)
@@ -133,31 +128,19 @@
     def open(self):
         """Materialise this as a real open file without context
 
-        The file should be explicitly closed to avoid enclosed file
-        instances persisting. This code-path monkey-patches the file-like
-        objects, so they can close even if the parent OpenFile object has 
already
-        been deleted; but a with-context is better style.
+        The OpenFile object should be explicitly closed to avoid enclosed file
+        instances persisting. You must, therefore, keep a reference to the 
OpenFile
+        during the life of the file-like it generates.
         """
-        out = self.__enter__()
-        closer = out.close
-        fobjects = self.fobjects.copy()[:-1]
-        mode = self.mode
-
-        def close():
-            # this func has no reference to
-            closer()  # original close bound method of the final file-like
-            _close(fobjects, mode)  # call close on other dependent file-likes
-
-        try:
-            out.close = close
-        except AttributeError:
-            out = IOWrapper(out, lambda: _close(fobjects, mode))
-
-        return out
+        return self.__enter__()
 
     def close(self):
         """Close all encapsulated file objects"""
-        _close(self.fobjects, self.mode)
+        for f in reversed(self.fobjects):
+            if "r" not in self.mode and not f.closed:
+                f.flush()
+            f.close()
+        self.fobjects.clear()
 
 
 class OpenFiles(list):
@@ -196,18 +179,17 @@
 
     def __exit__(self, *args):
         fs = self.fs
+        [s.__exit__(*args) for s in self]
         if "r" not in self.mode:
             while True:
                 if hasattr(fs, "open_many"):
                     # check for concurrent cache upload
                     fs.commit_many(self.files)
-                    self.files.clear()
                     return
                 if hasattr(fs, "fs") and fs.fs is not None:
                     fs = fs.fs
                 else:
                     break
-        [s.__exit__(*args) for s in self]
 
     def __getitem__(self, item):
         out = super().__getitem__(item)
@@ -219,14 +201,6 @@
         return "<List of %s OpenFile instances>" % len(self)
 
 
-def _close(fobjects, mode):
-    for f in reversed(fobjects):
-        if "r" not in mode and not f.closed:
-            f.flush()
-        f.close()
-    fobjects.clear()
-
-
 def open_files(
     urlpath,
     mode="rb",
@@ -705,3 +679,26 @@
             "3. A path with a '*' in it: 'foo.*.json'"
         )
     return paths
+
+
+class PickleableTextIOWrapper(io.TextIOWrapper):
+    """TextIOWrapper cannot be pickled. This solves it.
+
+    Requires that ``buffer`` be pickleable, which all instances of
+    AbstractBufferedFile are.
+    """
+
+    def __init__(
+        self,
+        buffer,
+        encoding=None,
+        errors=None,
+        newline=None,
+        line_buffering=False,
+        write_through=False,
+    ):
+        self.args = buffer, encoding, errors, newline, line_buffering, 
write_through
+        super().__init__(*self.args)
+
+    def __reduce__(self):
+        return PickleableTextIOWrapper, self.args
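
With the dynamic closer removed, OpenFile.open() now simply enters the
context, so the caller keeps the OpenFile alive and closes it; text-mode
files are wrapped in PickleableTextIOWrapper so they survive pickling. A
small sketch against the in-memory filesystem (path illustrative):

    import fsspec

    of = fsspec.open("memory://example.txt", mode="wt")
    f = of.open()        # materialise without a with-block
    f.write("hello")
    of.close()           # flushes and closes every wrapped file object

    # The with-block remains the recommended style.
    with fsspec.open("memory://example.txt", mode="rt") as f:
        assert f.read() == "hello"
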
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/dircache.py 
new/filesystem_spec-2022.8.2/fsspec/dircache.py
--- old/filesystem_spec-2022.5.0/fsspec/dircache.py     2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/dircache.py     2022-09-01 
03:03:54.000000000 +0200
@@ -87,7 +87,9 @@
         del self._cache[key]
 
     def __iter__(self):
-        return (k for k in self._cache if k in self)
+        entries = list(self._cache)
+
+        return (k for k in entries if k in self)
 
     def __reduce__(self):
         return (
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/fuse.py 
new/filesystem_spec-2022.8.2/fsspec/fuse.py
--- old/filesystem_spec-2022.5.0/fsspec/fuse.py 2022-05-19 20:13:38.000000000 
+0200
+++ new/filesystem_spec-2022.8.2/fsspec/fuse.py 2022-09-01 03:03:54.000000000 
+0200
@@ -46,9 +46,9 @@
             data["st_size"] = info["size"]
             data["st_blksize"] = 5 * 2**20
             data["st_nlink"] = 1
-        data["st_atime"] = time.time()
-        data["st_ctime"] = time.time()
-        data["st_mtime"] = time.time()
+        data["st_atime"] = info["atime"] if "atime" in info else time.time()
+        data["st_ctime"] = info["ctime"] if "ctime" in info else time.time()
+        data["st_mtime"] = info["mtime"] if "mtime" in info else time.time()
         return data
 
     def readdir(self, path, fh):
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/cached.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/cached.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/cached.py       
2022-05-19 20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/cached.py       
2022-09-01 03:03:54.000000000 +0200
@@ -497,8 +497,7 @@
             self._mkcache()
         else:
             return [
-                LocalTempFile(self.fs, path, mode=open_files.mode, 
autocommit=False)
-                for path in paths
+                LocalTempFile(self.fs, path, mode=open_files.mode) for path in 
paths
             ]
 
         if self.compression:
@@ -541,6 +540,13 @@
 
     def commit_many(self, open_files):
         self.fs.put([f.fn for f in open_files], [f.path for f in open_files])
+        [f.close() for f in open_files]
+        for f in open_files:
+            # in case autocommit is off, and so close did not already delete
+            try:
+                os.remove(f.name)
+            except FileNotFoundError:
+                pass
 
     def _make_local_details(self, path):
         hash = self.hash_name(path, self.same_names)
@@ -586,7 +592,8 @@
         out = {}
         callback.set_size(len(paths))
         for p, fn in zip(paths, fns):
-            out[p] = open(fn, "rb").read()
+            with open(fn, "rb") as f:
+                out[p] = f.read()
             callback.relative_update(1)
         if isinstance(path, str) and len(paths) == 1 and recursive is False:
             out = out[paths[0]]
@@ -595,7 +602,7 @@
     def _open(self, path, mode="rb", **kwargs):
         path = self._strip_protocol(path)
         if "r" not in mode:
-            return self.fs._open(path, mode=mode, **kwargs)
+            return LocalTempFile(self, path, mode=mode)
         detail = self._check_file(path)
         if detail:
             detail, fn = detail
@@ -755,6 +762,8 @@
         self.close()
 
     def close(self):
+        if self.closed:
+            return
         self.fh.close()
         self.closed = True
         if self.autocommit:
@@ -766,13 +775,15 @@
 
     def commit(self):
         self.fs.put(self.fn, self.path)
+        try:
+            os.remove(self.fn)
+        except (PermissionError, FileNotFoundError):
+            # file path may be held by new version of the file on windows
+            pass
 
     @property
     def name(self):
-        if isinstance(self.fh.name, str):
-            return self.fh.name  # initialized by open()
-        else:
-            return self.fn  # initialized by tempfile.mkstemp()
+        return self.fn
 
     def __getattr__(self, item):
         return getattr(self.fh, item)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/ftp.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/ftp.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/ftp.py  2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/ftp.py  2022-09-01 
03:03:54.000000000 +0200
@@ -110,7 +110,7 @@
                     if info["type"] == "file":
                         out = [(path, info)]
                 except (Error, IndexError):
-                    raise FileNotFoundError
+                    raise FileNotFoundError(path)
         files = self.dircache.get(path, out)
         if not detail:
             return sorted([fn for fn, details in files])
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/http.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/http.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/http.py 2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/http.py 2022-09-01 
03:03:54.000000000 +0200
@@ -642,7 +642,7 @@
                 self.url,
                 self.mode,
                 self.blocksize,
-                self.cache.name,
+                self.cache.name if self.cache else "none",
                 self.size,
             ),
         )
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/libarchive.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/libarchive.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/libarchive.py   
2022-05-19 20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/libarchive.py   
2022-09-01 03:03:54.000000000 +0200
@@ -128,6 +128,7 @@
                     'one file: "{}"'.format(fo, files)
                 )
             fo = files[0]
+        self.of = fo
         self.fo = fo.__enter__()  # the whole instance is a context
         self.block_size = block_size
         self.dir_cache = None
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/local.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/local.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/local.py        
2022-05-19 20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/local.py        
2022-09-01 03:03:54.000000000 +0200
@@ -1,5 +1,6 @@
 import datetime
 import io
+import logging
 import os
 import os.path as osp
 import posixpath
@@ -13,6 +14,8 @@
 from fsspec.core import get_compression
 from fsspec.utils import isfilelike, stringify_path
 
+logger = logging.getLogger("fsspec.local")
+
 
 class LocalFileSystem(AbstractFileSystem):
     """Interface to files on local storage
@@ -118,7 +121,7 @@
         elif self.isdir(path1):
             self.mkdirs(path2, exist_ok=True)
         else:
-            raise FileNotFoundError
+            raise FileNotFoundError(path1)
 
     def get_file(self, path1, path2, callback=None, **kwargs):
         if isfilelike(path2):
@@ -158,7 +161,7 @@
             self.makedirs(self._parent(path), exist_ok=True)
         return LocalFileOpener(path, mode, fs=self, **kwargs)
 
-    def touch(self, path, **kwargs):
+    def touch(self, path, truncate=True, **kwargs):
         path = self._strip_protocol(path)
         if self.auto_mkdir:
             self.makedirs(self._parent(path), exist_ok=True)
@@ -166,6 +169,8 @@
             os.utime(path, None)
         else:
             open(path, "a").close()
+        if truncate:
+            os.truncate(path, 0)
 
     def created(self, path):
         info = self.info(path=path)
@@ -244,6 +249,7 @@
     def __init__(
         self, path, mode, autocommit=True, fs=None, compression=None, **kwargs
     ):
+        logger.debug("open file: %s", path)
         self.path = path
         self.mode = mode
         self.fs = fs
@@ -342,9 +348,8 @@
     def closed(self):
         return self.f.closed
 
-    def __fspath__(self):
-        # uniquely among fsspec implementations, this is a real, local path
-        return self.path
+    def fileno(self):
+        return self.raw.fileno()
 
     def __iter__(self):
         return self.f.__iter__()
@@ -354,7 +359,7 @@
 
     def __enter__(self):
         self._incontext = True
-        return self.f.__enter__()
+        return self
 
     def __exit__(self, exc_type, exc_value, traceback):
         self._incontext = False
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/memory.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/memory.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/memory.py       
2022-05-19 20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/memory.py       
2022-09-01 03:03:54.000000000 +0200
@@ -38,7 +38,7 @@
             return [
                 {
                     "name": path,
-                    "size": self.store[path].getbuffer().nbytes,
+                    "size": self.store[path].size,
                     "type": "file",
                     "created": self.store[path].created,
                 }
@@ -53,7 +53,7 @@
                     out.append(
                         {
                             "name": p2,
-                            "size": self.store[p2].getbuffer().nbytes,
+                            "size": self.store[p2].size,
                             "type": "file",
                             "created": self.store[p2].created,
                         }
@@ -96,7 +96,7 @@
     def mkdir(self, path, create_parents=True, **kwargs):
         path = self._strip_protocol(path)
         if path in self.store or path in self.pseudo_dirs:
-            raise FileExistsError
+            raise FileExistsError(path)
         if self._parent(path).strip("/") and self.isfile(self._parent(path)):
             raise NotADirectoryError(self._parent(path))
         if create_parents and self._parent(path).strip("/"):
@@ -114,6 +114,13 @@
             if not exist_ok:
                 raise
 
+    def pipe_file(self, path, value, **kwargs):
+        """Set the bytes of given file
+
+        Avoids copies of the data if possible
+        """
+        self.open(path, "wb", data=value)
+
     def rmdir(self, path):
         path = self._strip_protocol(path)
         if path == "":
@@ -145,9 +152,7 @@
             filelike = self.store[path]
             return {
                 "name": path,
-                "size": filelike.size
-                if hasattr(filelike, "size")
-                else filelike.getbuffer().nbytes,
+                "size": filelike.size,
                 "type": "file",
                 "created": getattr(filelike, "created", None),
             }
@@ -165,7 +170,7 @@
     ):
         path = self._strip_protocol(path)
         if path in self.pseudo_dirs:
-            raise IsADirectoryError
+            raise IsADirectoryError(path)
         parent = path
         while len(parent) > 1:
             parent = self._parent(parent)
@@ -184,7 +189,7 @@
             else:
                 raise FileNotFoundError(path)
         if mode == "wb":
-            m = MemoryFile(self, path)
+            m = MemoryFile(self, path, kwargs.get("data"))
             if not self._intrans:
                 m.commit()
             return m
@@ -193,17 +198,19 @@
         path1 = self._strip_protocol(path1)
         path2 = self._strip_protocol(path2)
         if self.isfile(path1):
-            self.store[path2] = MemoryFile(self, path2, 
self.store[path1].getbuffer())
+            self.store[path2] = MemoryFile(
+                self, path2, self.store[path1].getvalue()
+            )  # implicit copy
         elif self.isdir(path1):
             if path2 not in self.pseudo_dirs:
                 self.pseudo_dirs.append(path2)
         else:
-            raise FileNotFoundError
+            raise FileNotFoundError(path1)
 
     def cat_file(self, path, start=None, end=None, **kwargs):
         path = self._strip_protocol(path)
         try:
-            return self.store[path].getvalue()[start:end]
+            return bytes(self.store[path].getbuffer()[start:end])
         except KeyError:
             raise FileNotFoundError(path)
 
@@ -212,7 +219,7 @@
         try:
             del self.store[path]
         except KeyError as e:
-            raise FileNotFoundError from e
+            raise FileNotFoundError(path) from e
 
     def rm(self, path, recursive=False, maxdepth=None):
         if isinstance(path, str):
@@ -242,21 +249,23 @@
     """
 
     def __init__(self, fs=None, path=None, data=None):
+        logger.debug("open file %s", path)
         self.fs = fs
         self.path = path
         self.created = datetime.utcnow().timestamp()
         if data:
-            self.write(data)
-            self.size = len(data)
+            super().__init__(data)
             self.seek(0)
 
+    @property
+    def size(self):
+        return self.getbuffer().nbytes
+
     def __enter__(self):
         return self
 
     def close(self):
-        position = self.tell()
-        self.size = self.seek(0, 2)
-        self.seek(position)
+        pass
 
     def discard(self):
         pass
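
The pipe_file() override above hands the payload straight to MemoryFile
instead of going through the generic write path. A tiny sketch (path
illustrative):

    import fsspec

    m = fsspec.filesystem("memory")
    payload = b"x" * 1024

    m.pipe_file("/blob.bin", payload)   # bytes handed directly to MemoryFile
    assert m.cat_file("/blob.bin") == payload
    assert m.info("/blob.bin")["size"] == len(payload)
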
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/reference.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/reference.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/reference.py    
2022-05-19 20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/reference.py    
2022-09-01 03:03:54.000000000 +0200
@@ -490,7 +490,7 @@
             self._dircache_from_items()
         out = self._ls_from_cache(path)
         if out is None:
-            raise FileNotFoundError
+            raise FileNotFoundError(path)
         if detail:
             return out
         return [o["name"] for o in out]
@@ -531,10 +531,20 @@
             return r
 
     def info(self, path, **kwargs):
-        out = self.ls(path, True)
-        out0 = [o for o in out if o["name"] == path]
-        if not out0:
-            return {"name": path, "type": "directory", "size": 0}
+        if path in self.references:
+            out = self.references[path]
+            if isinstance(out, (str, bytes)):
+                # decode base64 here
+                return {"name": path, "type": "file", "size": len(out)}
+            elif len(out) > 1:
+                return {"name": path, "type": "file", "size": out[2]}
+            else:
+                out0 = [{"name": path, "type": "file", "size": None}]
+        else:
+            out = self.ls(path, True)
+            out0 = [o for o in out if o["name"] == path]
+            if not out0:
+                return {"name": path, "type": "directory", "size": 0}
         if out0[0]["size"] is None:
             # if this is a whole remote file, update size using remote FS
             prot, _ = split_protocol(self.references[path][0])
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/sftp.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/sftp.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/sftp.py 2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/sftp.py 2022-09-01 
03:03:54.000000000 +0200
@@ -64,9 +64,15 @@
         out.pop("protocol", None)
         return out
 
-    def mkdir(self, path, mode=511):
+    def mkdir(self, path, create_parents=False, mode=511):
         logger.debug("Creating folder %s" % path)
-        self.ftp.mkdir(path, mode)
+        if self.exists(path):
+            raise FileExistsError("File exists: {}".format(path))
+
+        if create_parents:
+            self.makedirs(path)
+        else:
+            self.ftp.mkdir(path, mode)
 
     def makedirs(self, path, exist_ok=False, mode=511):
         if self.exists(path) and not exist_ok:
@@ -78,7 +84,7 @@
         for part in parts:
             path += "/" + part
             if not self.exists(path):
-                self.mkdir(path, mode)
+                self.ftp.mkdir(path, mode)
 
     def rmdir(self, path):
         logger.debug("Removing folder %s" % path)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/tar.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/tar.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/tar.py  2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/tar.py  2022-09-01 
03:03:54.000000000 +0200
@@ -1,7 +1,6 @@
 import copy
 import logging
 import tarfile
-import weakref
 from io import BufferedReader
 
 import fsspec
@@ -38,7 +37,8 @@
         target_options = target_options or {}
 
         if isinstance(fo, str):
-            fo = fsspec.open(fo, protocol=target_protocol, 
**target_options).open()
+            self.of = fsspec.open(fo, protocol=target_protocol, 
**target_options)
+            fo = self.of.open()  # keep the reference
 
         # Try to infer compression.
         if compression is None:
@@ -82,8 +82,7 @@
             fo = compr[compression](fo)
 
         self._fo_ref = fo
-        weakref.finalize(self, fo.close)
-        self.fo = fo.__enter__()  # the whole instance is a context
+        self.fo = fo  # the whole instance is a context
         self.tar = tarfile.TarFile(fileobj=self.fo)
         self.dir_cache = None
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_arrow.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_arrow.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_arrow.py     
2022-05-19 20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_arrow.py     
2022-09-01 03:03:54.000000000 +0200
@@ -108,7 +108,7 @@
     fs.touch(remote_dir + "/dir/a")
     fs.touch(remote_dir + "/dir/b")
     fs.mkdir(remote_dir + "/dir/c/")
-    fs.touch(remote_dir + "/dir/c/a/")
+    fs.touch(remote_dir + "/dir/c/a")
     fs.rm(remote_dir + "/dir", recursive=True)
     assert not fs.exists(remote_dir + "/dir")
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_cached.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_cached.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_cached.py    
2022-05-19 20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_cached.py    
2022-09-01 03:03:54.000000000 +0200
@@ -8,7 +8,7 @@
 import fsspec
 from fsspec.compression import compr
 from fsspec.exceptions import BlocksizeMismatchError
-from fsspec.implementations.cached import CachingFileSystem
+from fsspec.implementations.cached import CachingFileSystem, LocalTempFile
 
 from .test_ftp import FTPFileSystem
 
@@ -191,13 +191,15 @@
     tmp = str(tempfile.mkdtemp())
     fn = tmp + "afile"
     url = "simplecache::file://" + fn
-    f = fsspec.open(url, "wb").open()
-    f.write(b"hello ")
-    f.flush()
-    with pickle.loads(pickle.dumps(f)) as f2:
-        f2.write(b"world")
+    with fsspec.open(url, "wb") as f:
+        pickle.loads(pickle.dumps(f))
+        f.write(b"hello ")
+        pickle.dumps(f)
+
+    with pytest.raises(ValueError):
+        pickle.dumps(f)
 
-    assert open(fn, "rb").read() == b"hello world"
+    assert open(fn, "rb").read() == b"hello "
 
 
 def test_blocksize(ftp_writable):
@@ -724,11 +726,15 @@
 @pytest.mark.parametrize("protocol", ["simplecache", "filecache"])
 def test_cached_write(protocol):
     d = tempfile.mkdtemp()
-    with fsspec.open_files(f"{protocol}::file://{d}/*.out", mode="wb", num=2) 
as files:
+    ofs = fsspec.open_files(f"{protocol}::file://{d}/*.out", mode="wb", num=2)
+    with ofs as files:
         for f in files:
+            assert isinstance(f, LocalTempFile)
             f.write(b"data")
+            fn = f.name
 
     assert sorted(os.listdir(d)) == ["0.out", "1.out"]
+    assert not os.path.exists(fn)
 
 
 def test_expiry():
@@ -806,18 +812,10 @@
     assert hash(cfs2) == hash(cfs3)
 
 
-@pytest.mark.xfail
-def test_json():
-    """Test that the JSON representation refers to correct class.
-
-    Make sure that the JSON representation of a CachingFileSystem refers to the
-    CachingFileSystem, not to the underlying filesystem.
-    """
-    import json
-
+def test_str():
+    """Test that the str representation refers to correct class."""
     from fsspec.implementations.local import LocalFileSystem
 
     lfs = LocalFileSystem()
-    cfs = CachingFileSystem(fs=lfs, cache_storage="raspberry")
-    D = json.loads(cfs.to_json())
-    assert D["cls"].endswith("CachingFileSystem")
+    cfs = CachingFileSystem(fs=lfs)
+    assert "CachingFileSystem" in str(cfs)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_http.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_http.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_http.py      
2022-05-19 20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_http.py      
2022-09-01 03:03:54.000000000 +0200
@@ -163,6 +163,11 @@
     # via HTTPFile
     h = fsspec.filesystem("http", headers={"give_length": "true", "head_ok": 
"true"})
     out = server + "/index/realfile"
+
+    with fsspec.open(out, headers={"give_length": "true", "head_ok": "true"}) 
as f:
+        pic = pickle.loads(pickle.dumps(f))
+        assert pic.read() == data
+
     with h.open(out, "rb") as f:
         pic = pickle.dumps(f)
         assert f.read() == data
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_local.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_local.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_local.py     
2022-05-19 20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_local.py     
2022-09-01 03:03:54.000000000 +0200
@@ -342,6 +342,18 @@
         assert info2["mtime"] > info["mtime"]
 
 
+def test_touch_truncate(tmpdir):
+    fn = str(tmpdir + "/tfile")
+    fs = fsspec.filesystem("file")
+    fs.touch(fn, truncate=True)
+    fs.pipe(fn, b"a")
+    fs.touch(fn, truncate=True)
+    assert fs.cat(fn) == b""
+    fs.pipe(fn, b"a")
+    fs.touch(fn, truncate=False)
+    assert fs.cat(fn) == b"a"
+
+
 def test_directories(tmpdir):
     tmpdir = make_path_posix(str(tmpdir))
     fs = LocalFileSystem()
@@ -587,6 +599,20 @@
     with pytest.raises(ValueError):
         pickle.dumps(f)
 
+    # with context
+    with fs.open(fn0, "rb") as f:
+        f.seek(1)
+        f2 = pickle.loads(pickle.dumps(f))
+        assert f2.tell() == 1
+        assert f2.read() == f.read()
+
+    # with fsspec.open https://github.com/fsspec/filesystem_spec/issues/579
+    with fsspec.open(fn0, "rb") as f:
+        f.seek(1)
+        f2 = pickle.loads(pickle.dumps(f))
+        assert f2.tell() == 1
+        assert f2.read() == f.read()
+
 
 def test_strip_protocol_expanduser():
     path = "file://~\\foo\\bar" if WIN else "file://~/foo/bar"
@@ -749,3 +775,13 @@
     f.seek(1)
     assert f.read(1) == "a"
     assert f.tell() == 2
+
+
+def test_numpy_fromfile(tmpdir):
+    # Regression test for #1005.
+    np = pytest.importorskip("numpy")
+    fn = str(tmpdir / "test_arr.npy")
+    dt = np.int64
+    arr = np.arange(10, dtype=dt)
+    arr.tofile(fn)
+    assert np.array_equal(np.fromfile(fn, dtype=dt), arr)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_sftp.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_sftp.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_sftp.py      
2022-05-19 20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_sftp.py      
2022-09-01 03:03:54.000000000 +0200
@@ -162,6 +162,21 @@
         f.rm(root_path, recursive=True)
 
 
+def test_mkdir_create_parent(ssh):
+    f = fsspec.get_filesystem_class("sftp")(**ssh)
+
+    with pytest.raises(FileNotFoundError):
+        f.mkdir("/a/b/c")
+
+    f.mkdir("/a/b/c", create_parents=True)
+    assert f.exists("/a/b/c")
+
+    with pytest.raises(FileExistsError, match="/a/b/c"):
+        f.mkdir("/a/b/c")
+
+    f.rm("/a/b/c", recursive=True)
+
+
 def test_makedirs_exist_ok(ssh):
     f = fsspec.get_filesystem_class("sftp")(**ssh)
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_zip.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_zip.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/tests/test_zip.py       
2022-05-19 20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/tests/test_zip.py       
2022-09-01 03:03:54.000000000 +0200
@@ -39,3 +39,21 @@
         fs = fsspec.filesystem("zip", fo=z)
         fs2 = fsspec.filesystem("zip", fo=z)
         assert fs is not fs2
+
+
+def test_root_info():
+    with tempzip(archive_data) as z:
+        fs = fsspec.filesystem("zip", fo=z)
+        assert fs.info("/") == {"name": "/", "type": "directory", "size": 0}
+        assert fs.info("") == {"name": "/", "type": "directory", "size": 0}
+
+
+def test_write_seek(m):
+    with m.open("afile.zip", "wb") as f:
+        fs = fsspec.filesystem("zip", fo=f, mode="w")
+        fs.pipe("another", b"hi")
+        fs.zip.close()
+
+    with m.open("afile.zip", "rb") as f:
+        fs = fsspec.filesystem("zip", fo=f)
+        assert fs.cat("another") == b"hi"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/filesystem_spec-2022.5.0/fsspec/implementations/zip.py 
new/filesystem_spec-2022.8.2/fsspec/implementations/zip.py
--- old/filesystem_spec-2022.5.0/fsspec/implementations/zip.py  2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/implementations/zip.py  2022-09-01 
03:03:54.000000000 +0200
@@ -1,5 +1,6 @@
 from __future__ import absolute_import, division, print_function
 
+import datetime
 import zipfile
 
 from fsspec import open_files
@@ -44,18 +45,19 @@
             a string.
         """
         super().__init__(self, **kwargs)
-        if mode != "r":
-            raise ValueError("Only read from zip files accepted")
         if isinstance(fo, str):
-            files = open_files(fo, protocol=target_protocol, **(target_options 
or {}))
+            files = open_files(
+                fo, mode=mode + "b", protocol=target_protocol, 
**(target_options or {})
+            )
             if len(files) != 1:
                 raise ValueError(
                     'Path "{}" did not resolve to exactly'
                     'one file: "{}"'.format(fo, files)
                 )
             fo = files[0]
+        self.of = fo
         self.fo = fo.__enter__()  # the whole instance is a context
-        self.zip = zipfile.ZipFile(self.fo)
+        self.zip = zipfile.ZipFile(self.fo, mode=mode)
         self.block_size = block_size
         self.dir_cache = None
 
@@ -64,6 +66,9 @@
         # zip file paths are always relative to the archive root
         return super()._strip_protocol(path).lstrip("/")
 
+    def __del__(self):
+        self.zip.close()
+
     def _get_dirs(self):
         if self.dir_cache is None:
             files = self.zip.infolist()
@@ -82,6 +87,13 @@
                 )
                 self.dir_cache[f["name"]] = f
 
+    def pipe_file(self, path, value, **kwargs):
+        # override upstream, because we know the exact file size in this case
+        info = zipfile.ZipInfo(path, datetime.datetime.now().timetuple())
+        info.file_size = len(value)
+        with self.zip.open(path, "w") as f:
+            f.write(value)
+
     def _open(
         self,
         path,
@@ -92,10 +104,9 @@
         **kwargs,
     ):
         path = self._strip_protocol(path)
-        if mode != "rb":
-            raise NotImplementedError
-        info = self.info(path)
-        out = self.zip.open(path, "r")
-        out.size = info["size"]
-        out.name = info["name"]
+        out = self.zip.open(path, mode.strip("b"))
+        if "r" in mode:
+            info = self.info(path)
+            out.size = info["size"]
+            out.name = info["name"]
         return out
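
ZipFileSystem now accepts mode="w", as exercised by test_write_seek above. A
hedged sketch writing a small archive to a local file and reading it back
(file name illustrative):

    import fsspec

    with open("example.zip", "wb") as target:
        zfs = fsspec.filesystem("zip", fo=target, mode="w")
        zfs.pipe_file("hello.txt", b"hi")   # exact-size member via ZipInfo
        zfs.zip.close()                      # finalise the central directory

    zfs2 = fsspec.filesystem("zip", fo="example.zip")
    assert zfs2.cat("hello.txt") == b"hi"
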
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/registry.py 
new/filesystem_spec-2022.8.2/fsspec/registry.py
--- old/filesystem_spec-2022.5.0/fsspec/registry.py     2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/registry.py     2022-09-01 
03:03:54.000000000 +0200
@@ -202,6 +202,12 @@
         "class": "webdav4.fsspec.WebdavFileSystem",
         "err": "Install webdav4 to access WebDAV",
     },
+    "root": {
+        "class": "fsspec_xrootd.XRootDFileSystem",
+        "err": "Install fsspec-xrootd to access xrootd storage system."
+        + " Note: 'root' is the protocol name for xrootd storage systems,"
+        + " not refering to root directories",
+    },
 }
 
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/spec.py 
new/filesystem_spec-2022.8.2/fsspec/spec.py
--- old/filesystem_spec-2022.5.0/fsspec/spec.py 2022-05-19 20:13:38.000000000 
+0200
+++ new/filesystem_spec-2022.8.2/fsspec/spec.py 2022-09-01 03:03:54.000000000 
+0200
@@ -187,14 +187,11 @@
 
     def unstrip_protocol(self, name):
         """Format FS-specific path to generic, including protocol"""
-        if isinstance(self.protocol, str):
-            if name.startswith(self.protocol):
+        protos = (self.protocol,) if isinstance(self.protocol, str) else 
self.protocol
+        for protocol in protos:
+            if name.startswith(f"{protocol}://"):
                 return name
-            return self.protocol + "://" + name
-        else:
-            if name.startswith(tuple(self.protocol)):
-                return name
-            return self.protocol[0] + "://" + name
+        return f"{protos[0]}://{name}"
 
     @staticmethod
     def _get_kwargs_from_urls(path):
@@ -679,7 +676,7 @@
 
     def pipe_file(self, path, value, **kwargs):
         """Set the bytes of given file"""
-        with self.open(path, "wb") as f:
+        with self.open(path, "wb", **kwargs) as f:
             f.write(value)
 
     def pipe(self, path, value=None, **kwargs):
@@ -952,7 +949,7 @@
 
     @classmethod
     def _parent(cls, path):
-        path = cls._strip_protocol(path.rstrip("/"))
+        path = cls._strip_protocol(path)
         if "/" in path:
             parent = path.rsplit("/", 1)[0].lstrip(cls.root_marker)
             return cls.root_marker + parent
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/tests/test_core.py 
new/filesystem_spec-2022.8.2/fsspec/tests/test_core.py
--- old/filesystem_spec-2022.5.0/fsspec/tests/test_core.py      2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/tests/test_core.py      2022-09-01 
03:03:54.000000000 +0200
@@ -83,7 +83,7 @@
     assert f.read() == b"data"
     f.close()
     with OpenFile(m, "somepath", mode="rt") as f:
-        f.read() == "data"
+        assert f.read() == "data"
 
 
 def test_openfile_open(m):
@@ -91,9 +91,7 @@
     f = of.open()
     f.write("hello")
     assert m.size("somepath") == 0  # no flush yet
-    del of
-    assert m.size("somepath") == 0  # still no flush
-    f.close()
+    of.close()
     assert m.size("somepath") == 5
 
 
@@ -183,6 +181,18 @@
     assert test.newline == restored.newline
 
 
+def test_pickle_after_open_open():
+    of = fsspec.open(__file__, mode="rt")
+    test = of.open()
+    of2 = pickle.loads(pickle.dumps(of))
+    test2 = of2.open()
+    test.close()
+
+    assert not test2.closed
+    of.close()
+    of2.close()
+
+
 def test_mismatch():
     with pytest.raises(ValueError, match="protocol"):
         open_files(["s3://test/path.csv", "/other/path.csv"])
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/tests/test_utils.py 
new/filesystem_spec-2022.8.2/fsspec/tests/test_utils.py
--- old/filesystem_spec-2022.5.0/fsspec/tests/test_utils.py     2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/tests/test_utils.py     2022-09-01 
03:03:54.000000000 +0200
@@ -4,6 +4,7 @@
 
 import pytest
 
+import fsspec.utils
 from fsspec.utils import (
     can_be_local,
     common_prefix,
@@ -377,3 +378,9 @@
     assert expect_paths == result_paths
     assert expect_starts == result_starts
     assert expect_ends == result_ends
+
+
+def test_size():
+    f = io.BytesIO(b"hello")
+    assert fsspec.utils.file_size(f) == 5
+    assert f.tell() == 0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/fsspec/utils.py 
new/filesystem_spec-2022.8.2/fsspec/utils.py
--- old/filesystem_spec-2022.5.0/fsspec/utils.py        2022-05-19 
20:13:38.000000000 +0200
+++ new/filesystem_spec-2022.8.2/fsspec/utils.py        2022-09-01 
03:03:54.000000000 +0200
@@ -7,8 +7,6 @@
 from contextlib import contextmanager
 from functools import partial
 from hashlib import md5
-from types import TracebackType
-from typing import IO, AnyStr, Callable, Iterable, Iterator, List, Optional, 
Type
 from urllib.parse import urlsplit
 
 DEFAULT_BLOCK_SIZE = 5 * 2**20
@@ -544,78 +542,10 @@
     return paths, starts, ends
 
 
-class IOWrapper(IO):
-    """Wrapper for a file-like object that can be used in situations where we 
might
-    want to, e.g., monkey-patch the close method but can't.
-    (cf https://github.com/fsspec/filesystem_spec/issues/725)
-    """
-
-    def __init__(self, fp: IO, closer: Callable[[], None]):
-        self.fp = fp
-        self.closer = closer
-
-    def close(self) -> None:
-        self.fp.close()
-
-    def fileno(self) -> int:
-        return self.fp.fileno()
-
-    def flush(self) -> None:
-        self.fp.flush()
-
-    def isatty(self) -> bool:
-        return self.fp.isatty()
-
-    def read(self, n: int = ...) -> AnyStr:
-        return self.fp.read(n)
-
-    def readable(self) -> bool:
-        return self.fp.readable()
-
-    def readline(self, limit: int = ...) -> AnyStr:
-        return self.fp.readline(limit)
-
-    def readlines(self, hint: int = ...) -> List[AnyStr]:
-        return self.fp.readlines(hint)
-
-    def seek(self, offset: int, whence: int = ...) -> int:
-        return self.fp.seek(offset, whence)
-
-    def seekable(self) -> bool:
-        return self.fp.seekable()
-
-    def tell(self) -> int:
-        return self.fp.tell()
-
-    def truncate(self, size: Optional[int] = ...) -> int:
-        return self.fp.truncate(size)
-
-    def writable(self) -> bool:
-        return self.fp.writable()
-
-    def write(self, s: AnyStr) -> int:
-        return self.fp.write(s)
-
-    def writelines(self, lines: Iterable[AnyStr]) -> None:
-        self.fp.writelines(lines)
-
-    def __next__(self) -> AnyStr:
-        return next(self.fp)
-
-    def __iter__(self) -> Iterator[AnyStr]:
-        return iter(self.fp)
-
-    def __enter__(self) -> IO[AnyStr]:
-        return self.fp.__enter__()
-
-    def __exit__(
-        self,
-        t: Optional[Type[BaseException]],
-        value: Optional[BaseException],
-        traceback: Optional[TracebackType],
-    ) -> Optional[bool]:
-        return self.fp.__exit__(t, value, traceback)
-
-    # forward anything else too
-    def __getattr__(self, name):
-        return getattr(self.fp, name)
+def file_size(filelike):
+    """Find length of any open read-mode file-like"""
+    pos = filelike.tell()
+    try:
+        return filelike.seek(0, 2)
+    finally:
+        filelike.seek(pos)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/setup.py 
new/filesystem_spec-2022.8.2/setup.py
--- old/filesystem_spec-2022.5.0/setup.py       2022-05-19 20:13:38.000000000 
+0200
+++ new/filesystem_spec-2022.8.2/setup.py       2022-09-01 03:03:54.000000000 
+0200
@@ -27,6 +27,10 @@
     long_description=long_description,
     long_description_content_type="text/markdown",
     url="http://github.com/fsspec/filesystem_spec";,
+    project_urls={
+        "Changelog": 
"https://filesystem-spec.readthedocs.io/en/latest/changelog.html";,
+        "Documentation": "https://filesystem-spec.readthedocs.io/en/latest/";,
+    },
     maintainer="Martin Durant",
     maintainer_email="mdur...@anaconda.com",
     license="BSD",
@@ -46,7 +50,7 @@
         "gs": ["gcsfs"],
         "hdfs": ["pyarrow >= 1"],
         "arrow": ["pyarrow >= 1"],
-        "http": ["requests", "aiohttp"],
+        "http": ["requests", "aiohttp !=4.0.0a0, !=4.0.0a1"],
         "sftp": ["paramiko"],
         "s3": ["s3fs"],
         "oci": ["ocifs"],
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/filesystem_spec-2022.5.0/tox.ini 
new/filesystem_spec-2022.8.2/tox.ini
--- old/filesystem_spec-2022.5.0/tox.ini        2022-05-19 20:13:38.000000000 
+0200
+++ new/filesystem_spec-2022.8.2/tox.ini        2022-09-01 03:03:54.000000000 
+0200
@@ -1,6 +1,6 @@
 # content of: tox.ini , put in same dir as setup.py
 [tox]
-envlist = {py37,py38,py39}
+envlist = {py38,py39,py310}
 
 [core]
 conda_channels=
@@ -41,7 +41,6 @@
 deps=
     hadoop-test-cluster==0.1.0
     smbprotocol
-    py37: importlib_metadata
 
 [testenv]
 description=Run test suite against target versions.
@@ -84,22 +83,26 @@
 description=Run gcsfs (@master) test suite against fsspec.
 extras=gcs
 conda_channels=
-    defaults
     conda-forge
+    defaults
 conda_deps=
-    {[core]conda_deps}
-deps=
-    {[core]deps}
-    vcrpy
+    pytest
     ujson
+    requests
+    decorator
+    google-auth
+    aiohttp
     google-auth-oauthlib
-    crcmod
+    flake8
+    black
+    google-cloud-core
+    google-api-core
+    google-api-python-client
 changedir=.tox/gcsfs/tmp
 whitelist_externals=
     rm
     git
 setenv=
-    GCSFS_RECORD_MODE=none
     GOOGLE_APPLICATION_CREDENTIALS=gcsfs/gcsfs/tests/fake-secret.json
 commands=
     rm -rf gcsfs

Reply via email to