Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package python-devpi-server for 
openSUSE:Factory checked in at 2026-05-04 12:54:30
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-devpi-server (Old)
 and      /work/SRC/openSUSE:Factory/.python-devpi-server.new.30200 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-devpi-server"

Mon May  4 12:54:30 2026 rev:20 rq:1350559 version:6.20.0

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-devpi-server/python-devpi-server.changes  2026-04-20 16:11:10.684858384 +0200
+++ /work/SRC/openSUSE:Factory/.python-devpi-server.new.30200/python-devpi-server.changes       2026-05-04 12:58:02.320645210 +0200
@@ -1,0 +2,13 @@
+Sun May  3 17:55:54 UTC 2026 - Dirk Müller <[email protected]>
+
+- update to 6.20.0:
+  * Add experimental bare bones core-metadata ([PEP
+    658](https://peps.python.org/pep-0658/), [PEP
+    714](https://peps.python.org/pep-0714/)) support with
+    ``--enable-core-metadata`` command line option and
+    ``mirror_provides_core_metadata`` mirror index option. Refs
+    #1018
+  * Update replica status when the replica is waiting for new
+    serials using the streaming changelog endpoint.
+
+-------------------------------------------------------------------

Old:
----
  devpi_server-6.19.3.tar.gz

New:
----
  devpi_server-6.20.0.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-devpi-server.spec ++++++
--- /var/tmp/diff_new_pack.UvIogv/_old  2026-05-04 12:58:02.828666118 +0200
+++ /var/tmp/diff_new_pack.UvIogv/_new  2026-05-04 12:58:02.832666282 +0200
@@ -26,7 +26,7 @@
 
 %{?sle15_python_module_pythons}
 Name:           python-devpi-server
-Version:        6.19.3
+Version:        6.20.0
 Release:        0
 Summary:        Private PyPI caching server
 License:        MIT

++++++ devpi_server-6.19.3.tar.gz -> devpi_server-6.20.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/CHANGELOG 
new/devpi_server-6.20.0/CHANGELOG
--- old/devpi_server-6.19.3/CHANGELOG   2026-04-13 17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/CHANGELOG   2026-04-30 11:01:49.000000000 +0200
@@ -2,6 +2,20 @@
 
 .. towncrier release notes start
 
+6.20.0 (2026-04-30)
+===================
+
+Features
+--------
+
+- Add experimental bare bones core-metadata ([PEP 658](https://peps.python.org/pep-0658/), [PEP 714](https://peps.python.org/pep-0714/)) support with ``--enable-core-metadata`` command line option and ``mirror_provides_core_metadata`` mirror index option. Refs #1018
+
+Bug Fixes
+---------
+
+- Update replica status when the replica is waiting for new serials using the streaming changelog endpoint.
+
+
 6.19.3 (2026-04-13)
 ===================
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/CHANGELOG.short.rst 
new/devpi_server-6.20.0/CHANGELOG.short.rst
--- old/devpi_server-6.19.3/CHANGELOG.short.rst 2026-04-13 17:20:53.000000000 +0200
+++ new/devpi_server-6.20.0/CHANGELOG.short.rst 2026-04-30 11:02:47.000000000 +0200
@@ -9,6 +9,20 @@
 
 .. towncrier release notes start
 
+6.20.0 (2026-04-30)
+===================
+
+Features
+--------
+
+- Add experimental bare bones core-metadata ([PEP 658](https://peps.python.org/pep-0658/), [PEP 714](https://peps.python.org/pep-0714/)) support with ``--enable-core-metadata`` command line option and ``mirror_provides_core_metadata`` mirror index option. Refs #1018
+
+Bug Fixes
+---------
+
+- Update replica status when the replica is waiting for new serials using the streaming changelog endpoint.
+
+
 6.19.3 (2026-04-13)
 ===================
 
@@ -81,34 +95,3 @@
 
 - Fix #1110: a list for the ``listen`` option in a config file stopped working in 6.18.0.
 
-
-6.18.0 (2026-01-27)
-===================
-
-Features
---------
-
-- Store all available hashes of files.
-
-- Validate hashes of all files during devpi-import, not only releases.
-
-Bug Fixes
----------
-
-- Apply argparse transformations on values read from config file or environment.
-
-- Restore Python and platform info in user agent string after switch to httpx.
-
-- Remove all database entries on project deletion instead of only emptying them.
-
-- Fix error at end of replica streaming caused by changed behavior from switch to httpx.
-
-- Fix #1102: The data stream was cut off after 64k when proxying from replica to primary after switching to httpx.
-
-- Fix #1107: retry file downloads if there has been an error during download.
-
-Other Changes
--------------
-
-- The filenames of some exported doczip files change due to normalization of the project name caused by changing the internals during export to allow ``--hard-links`` to work.
-
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/PKG-INFO 
new/devpi_server-6.20.0/PKG-INFO
--- old/devpi_server-6.19.3/PKG-INFO    2026-04-13 17:20:53.670446200 +0200
+++ new/devpi_server-6.20.0/PKG-INFO    2026-04-30 11:02:47.852710000 +0200
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: devpi-server
-Version: 6.19.3
+Version: 6.20.0
 Summary: devpi-server: backend for hosting private package indexes and PyPI on-demand mirrors
 Maintainer-email: Florian Schulze <[email protected]>
 License-Expression: MIT
@@ -121,6 +121,20 @@
 
 .. towncrier release notes start
 
+6.20.0 (2026-04-30)
+===================
+
+Features
+--------
+
+- Add experimental bare bones core-metadata ([PEP 658](https://peps.python.org/pep-0658/), [PEP 714](https://peps.python.org/pep-0714/)) support with ``--enable-core-metadata`` command line option and ``mirror_provides_core_metadata`` mirror index option. Refs #1018
+
+Bug Fixes
+---------
+
+- Update replica status when the replica is waiting for new serials using the streaming changelog endpoint.
+
+
 6.19.3 (2026-04-13)
 ===================
 
@@ -193,34 +207,3 @@
 
 - Fix #1110: a list for the ``listen`` option in a config file stopped working in 6.18.0.
 
-
-6.18.0 (2026-01-27)
-===================
-
-Features
---------
-
-- Store all available hashes of files.
-
-- Validate hashes of all files during devpi-import, not only releases.
-
-Bug Fixes
----------
-
-- Apply argparse transformations on values read from config file or environment.
-
-- Restore Python and platform info in user agent string after switch to httpx.
-
-- Remove all database entries on project deletion instead of only emptying them.
-
-- Fix error at end of replica streaming caused by changed behavior from switch to httpx.
-
-- Fix #1102: The data stream was cut off after 64k when proxying from replica to primary after switching to httpx.
-
-- Fix #1107: retry file downloads if there has been an error during download.
-
-Other Changes
--------------
-
-- The filenames of some exported doczip files change due to normalization of the project name caused by changing the internals during export to allow ``--hard-links`` to work.
-
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server/__init__.py 
new/devpi_server-6.20.0/devpi_server/__init__.py
--- old/devpi_server-6.19.3/devpi_server/__init__.py    2026-04-13 
17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/devpi_server/__init__.py    2026-04-30 
11:01:49.000000000 +0200
@@ -1 +1 @@
-__version__ = "6.19.3"
+__version__ = "6.20.0"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server/config.py 
new/devpi_server-6.20.0/devpi_server/config.py
--- old/devpi_server-6.19.3/devpi_server/config.py      2026-04-13 
17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/devpi_server/config.py      2026-04-30 
11:01:49.000000000 +0200
@@ -246,6 +246,12 @@
              "This will become the default at some point.")
 
     parser.addoption(
+        "--enable-core-metadata",
+        action="store_true",
+        help="(experimental) Enable minimal core-metadata support in simple API.",
+    )
+
+    parser.addoption(
         "--profile-requests", type=int, metavar="NUM", default=0,
         help="profile NUM requests and print out cumulative stats. "
              "After print profiling is restarted. "
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server/filestore.py 
new/devpi_server-6.20.0/devpi_server/filestore.py
--- old/devpi_server-6.19.3/devpi_server/filestore.py   2026-04-13 
17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/devpi_server/filestore.py   2026-04-30 
11:01:49.000000000 +0200
@@ -700,7 +700,7 @@
         headers = {}
         headers["last-modified"] = str(self.last_modified)
         m = mimetypes.guess_type(self.basename)[0]
-        headers["content-type"] = str(m)
+        headers["content-type"] = "application/octet-stream" if m is None else m
         headers["content-length"] = str(self.file_size())
         headers["cache-control"] = "max-age=365000000, immutable, public"
         return headers
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server/keyfs.py 
new/devpi_server-6.20.0/devpi_server/keyfs.py
--- old/devpi_server-6.19.3/devpi_server/keyfs.py       2026-04-13 
17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/devpi_server/keyfs.py       2026-04-30 
11:01:49.000000000 +0200
@@ -386,7 +386,10 @@
                 relpaths: Iterable[RelPath],
             ) -> Iterable[FilePathInfo]:
                 for relpath in relpaths:
-                    (_, _, val) = conn.get_relpath_at(relpath, serial)
+                    (_, back_serial, val) = conn.get_relpath_at(relpath, serial)
+                    if val is None:
+                        # the file was deleted, get the data from before
+                        (_, _, val) = conn.get_relpath_at(relpath, back_serial)
                     if (
                         isinstance(val, (dict, DictViewReadonly))
                         and "hash_spec" in val
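The ``keyfs.py`` hunk above handles files that were deleted at the requested serial by re-reading the value at ``back_serial``. A toy model of that lookup, using a hypothetical in-memory history in place of the real connection API:

```python
# Toy model of the change above: get_relpath_at returns
# (serial, back_serial, value); a None value means the entry was deleted
# at that serial, and back_serial points at the previous change, where
# the old data can still be read. HISTORY is invented for illustration.
HISTORY = {
    # relpath -> {serial: (back_serial, value)}
    "root/pypi/+f/abc/pkg.whl": {3: (1, None), 1: (-1, {"hash_spec": "sha256=.."})},
}

def get_relpath_at(relpath, serial):
    changes = HISTORY[relpath]
    # latest change at or before the requested serial
    at = max(s for s in changes if s <= serial)
    back_serial, value = changes[at]
    return (at, back_serial, value)

def value_with_deletion_fallback(relpath, serial):
    _, back_serial, val = get_relpath_at(relpath, serial)
    if val is None:  # deleted: read the data from before the deletion
        _, _, val = get_relpath_at(relpath, back_serial)
    return val
```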
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server/keyfs_sqlite.py 
new/devpi_server-6.20.0/devpi_server/keyfs_sqlite.py
--- old/devpi_server-6.19.3/devpi_server/keyfs_sqlite.py        2026-04-13 
17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/devpi_server/keyfs_sqlite.py        2026-04-30 
11:01:49.000000000 +0200
@@ -464,6 +464,7 @@
         self._execute_conn_pragmas(sqlconn)
         if write:
             start_time = time.monotonic()
+            log_delay = 2
             thread = current_thread()
             while 1:
                 try:
@@ -475,6 +476,11 @@
                     if hasattr(thread, "exit_if_shutdown"):
                         thread.exit_if_shutdown()
                     elapsed = time.monotonic() - start_time
+                    if elapsed >= log_delay:
+                        threadlog.warn(
+                            "Waiting on database connection for %s seconds", log_delay
+                        )
+                        log_delay = log_delay * 1.5
                     if elapsed > timeout:
                         # if it takes this long, something is wrong
                         raise KeyfsTimeoutError(
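The new ``log_delay`` in ``keyfs_sqlite.py`` makes the write-lock wait loop warn after roughly 2 seconds and then space further warnings out by a factor of 1.5. A sketch of the resulting warning schedule (the loop actually warns on the first poll at or after each threshold, so real times are approximate):

```python
def warning_times(total_wait: float, first: float = 2.0, factor: float = 1.5) -> list:
    """Elapsed-time thresholds at which the loop above would warn while
    waiting total_wait seconds for the sqlite write lock."""
    times, delay = [], first
    while delay <= total_wait:
        times.append(delay)
        delay *= factor
    return times

print(warning_times(10.0))  # -> [2.0, 3.0, 4.5, 6.75]
```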
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server/mirror.py 
new/devpi_server-6.20.0/devpi_server/mirror.py
--- old/devpi_server-6.19.3/devpi_server/mirror.py      2026-04-13 
17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/devpi_server/mirror.py      2026-04-30 
11:01:49.000000000 +0200
@@ -486,6 +486,10 @@
         return None
 
     @property
+    def provides_core_metadata(self) -> bool:
+        return self.ixconfig.get("mirror_provides_core_metadata", False)
+
+    @property
     def no_project_list(self) -> bool:
         return self.ixconfig.get('mirror_no_project_list', False)
 
@@ -501,6 +505,7 @@
             "mirror_cache_expiry",
             "mirror_ignore_serial_header",
             "mirror_no_project_list",
+            "mirror_provides_core_metadata",
             "mirror_url",
             "mirror_use_external_urls",
             "mirror_web_url_fmt",
@@ -529,6 +534,8 @@
             return ensure_boolean(value)
         if key == "mirror_no_project_list":
             return ensure_boolean(value)
+        if key == "mirror_provides_core_metadata":
+            return ensure_boolean(value)
         if key == "mirror_use_external_urls":
             return ensure_boolean(value)
         if key in ("custom_data", "description", "mirror_web_url_fmt", "title"):
@@ -747,6 +754,9 @@
             return (self.cache_projectnames.get(), False)
         lock = self._list_projects_perstage_lock
         projects_timeout = self.get_projects_timeout(timeout)
+        threadlog.debug(
+            "Acquiring projects list lock (%r) with timeout %s", lock, timeout
+        )
         if lock.acquire(timeout=projects_timeout):
             try:
                 # retry in case it was updated in another thread
@@ -756,6 +766,7 @@
                 return self._update_projects(timeout=timeout)
             finally:
                 lock.release()
+                threadlog.debug("Released projects list lock (%r)", lock)
         return (self._stale_list_projects_perstage(), True)
 
     def list_projects_perstage(self) -> dict[str, NormalizedName | str]:
@@ -968,7 +979,9 @@
                 self.keyfs.tx.on_commit_success(
+                    partial(self.cache_retrieve_times.refresh, project, info.etag)
                 )
-                return self.SimpleLinks(links)
+                return self.SimpleLinks(
+                    links, core_metadata=self.provides_core_metadata
+                )
             raise self.UpstreamError("no cache links from primary for %s" %
                                      project)
 
@@ -992,7 +1005,7 @@
                 info.serial,
                 info.etag,
             )
-            return self.SimpleLinks(newlinks)
+            return self.SimpleLinks(newlinks, core_metadata=self.provides_core_metadata)
 
     async def _update_simplelinks_in_future(
         self,
@@ -1038,7 +1051,9 @@
                 threadlog.warn(
                     "serving stale links for %r, waiting for existing request timed out after %s seconds",
                     project, self.timeout)
-                return self.SimpleLinks(links, stale=True)
+                return self.SimpleLinks(
+                    links, core_metadata=self.provides_core_metadata, stale=True
+                )
             raise self.UpstreamError(
                 f"timeout after {self.timeout} seconds while getting data for {project!r}")
 
@@ -1049,7 +1064,9 @@
                 threadlog.debug(
                     "using stale links for %r due to offline mode", project)
                 self._offline_logging.add(project)
-            return self.SimpleLinks(links, stale=True)
+            return self.SimpleLinks(
+                links, core_metadata=self.provides_core_metadata, stale=True
+            )
 
         if links is None:
             is_retrieval_expired = self.cache_retrieve_times.is_expired(
@@ -1090,14 +1107,18 @@
                 threadlog.warn(
                     "serving stale links for %r, getting data timed out after %s seconds",
                     project, self.timeout)
-                return self.SimpleLinks(links, stale=True)
+                return self.SimpleLinks(
+                    links, core_metadata=self.provides_core_metadata, stale=True
+                )
             raise self.UpstreamError(
                 f"timeout after {self.timeout} seconds while getting data for {project!r}")
         except self.UpstreamNotModified as e:
             if links is not None:
                 # immediately update the cache
                 self.cache_retrieve_times.refresh(project, e.etag)
-                return self.SimpleLinks(links)
+                return self.SimpleLinks(
+                    links, core_metadata=self.provides_core_metadata
+                )
             if e.etag is None:
                 threadlog.error(
                     "server returned 304 Not Modified, but we have no links")
@@ -1113,7 +1134,9 @@
                 threadlog.warn(
                     "serving stale links, because of exception %s",
                     lazy_format_exception(e))
-                return self.SimpleLinks(links, stale=True)
+                return self.SimpleLinks(
+                    links, core_metadata=self.provides_core_metadata, stale=True
+                )
             raise
 
         info = newlinks_future.result()
@@ -1122,7 +1145,7 @@
         if links is not None and set(links) == set(newlinks):
             # no changes
             self.cache_retrieve_times.refresh(project, info.etag)
-            return self.SimpleLinks(links)
+            return self.SimpleLinks(links, core_metadata=self.provides_core_metadata)
 
         return self._update_simplelinks(project, info, links, newlinks)
 
@@ -1300,7 +1323,12 @@
         self.project = project
 
     def acquire(self, timeout: float) -> bool:
-        threadlog.debug("Acquiring lock (%r) for %r", self.lock, self.project)
+        threadlog.debug(
+            "Acquiring lock (%r) for %r with timeout %s",
+            self.lock,
+            self.project,
+            timeout,
+        )
         assert self.lock is not None
         return self.lock.acquire(timeout=timeout)
 
@@ -1317,6 +1345,7 @@
     def release(self) -> None:
         if self.lock is not None:
             self.lock.release()
+            threadlog.debug("Released lock (%r) for %r", self.lock, self.project)
             self.lock = None
 
     def __repr__(self) -> str:
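``mirror.py`` registers the new ``mirror_provides_core_metadata`` option and runs it through ``ensure_boolean`` like the other mirror flags. A rough stand-in for that normalization, assuming typical truthy spellings (devpi's actual helper may accept a different set):

```python
def ensure_boolean(value):
    # Accept real booleans as-is and normalize common string spellings;
    # this mirrors the intent of devpi's helper, not its exact code.
    if isinstance(value, bool):
        return value
    if isinstance(value, str):
        return value.strip().lower() in {"1", "true", "yes", "on"}
    return bool(value)

print(ensure_boolean("True"), ensure_boolean("no"))
```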
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server/model.py 
new/devpi_server-6.20.0/devpi_server/model.py
--- old/devpi_server-6.19.3/devpi_server/model.py       2026-04-13 
17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/devpi_server/model.py       2026-04-30 
11:01:49.000000000 +0200
@@ -675,14 +675,20 @@
     stale: bool
 
     def __init__(
-        self, links: Sequence[JoinedLink] | SimpleLinks, *, stale: bool = False
+        self,
+        links: Sequence[JoinedLink] | SimpleLinks,
+        *,
+        core_metadata: bool = False,
+        stale: bool = False,
     ) -> None:
         assert links is not None
         if isinstance(links, SimpleLinks):
             self._links = links._links
             self.stale = links.stale or stale
         else:
-            self._links = [SimplelinkMeta(x) for x in links]
+            self._links = [
+                SimplelinkMeta(x, core_metadata=core_metadata) for x in links
+            ]
             self.stale = stale
 
     def __hash__(self):
@@ -1596,7 +1602,8 @@
         requires_python = cast("RequiresPythonList", data.get("requires_python", []))
         yanked: YankedList = []  # PEP 592 isn't supported for private stages yet
         return self.SimpleLinks(
-            join_links_data(links, requires_python, yanked))
+            join_links_data(links, requires_python, yanked), core_metadata=True
+        )
 
     def _regen_simplelinks(self, project_input):
         project = normalize_name(project_input)
@@ -2129,6 +2136,7 @@
         "__path",
         "__url",
         "__version",
+        "core_metadata",
         "href",
         "key",
         "require_python",
@@ -2137,7 +2145,12 @@
     __cmpval: tuple | NotSet
     __hashes: Digests | NotSet
 
-    def __init__(self, link_info: tuple[str, str, RequiresPython, Yanked]) -> None:
+    def __init__(
+        self,
+        link_info: tuple[str, str, RequiresPython, Yanked],
+        *,
+        core_metadata: bool = False,
+    ) -> None:
         self.__basename = notset
         self.__cmpval = notset
         self.__ext = notset
@@ -2147,7 +2160,10 @@
         self.__path = notset
         self.__url = notset
         self.__version = notset
+        self.core_metadata = False
         (self.key, self.href, self.require_python, self.yanked) = link_info
+        if core_metadata and self.basename.endswith(".whl"):
+            self.core_metadata = True
 
     def __hash__(self) -> int:
         return hash(
@@ -2161,6 +2177,7 @@
                 self.__path,
                 self.__url,
                 self.__version,
+                self.core_metadata,
                 self.href,
                 self.key,
                 self.require_python,
@@ -2283,8 +2300,10 @@
             f"<{clsname} "
             f"key={self.key!r} "
             f"href={self.href!r} "
+            f"core_metadata={self.core_metadata!r} "
             f"require_python={self.require_python!r} "
-            f"yanked={self.yanked!r}>")
+            f"yanked={self.yanked!r}>"
+        )
 
 
 def make_key_and_href(entry):
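In ``model.py``, ``SimplelinkMeta`` only sets its new ``core_metadata`` flag when the index provides core metadata at all and the release file is a wheel, since only wheels carry a PEP 658 metadata file. The rule boils down to this (function name invented for illustration):

```python
def link_core_metadata(basename: str, index_provides: bool) -> bool:
    # Per-link flag: the index must provide core metadata AND the release
    # file must be a wheel (sdists have no .dist-info/METADATA member).
    return index_provides and basename.endswith(".whl")
```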
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server/proxy.py 
new/devpi_server-6.20.0/devpi_server/proxy.py
--- old/devpi_server-6.19.3/devpi_server/proxy.py       1970-01-01 
01:00:00.000000000 +0100
+++ new/devpi_server-6.20.0/devpi_server/proxy.py       2026-04-30 
11:01:49.000000000 +0200
@@ -0,0 +1,46 @@
+from __future__ import annotations
+
+from typing import TYPE_CHECKING
+from webob.headers import EnvironHeaders
+from webob.headers import ResponseHeaders
+
+
+if TYPE_CHECKING:
+    from httpx import Response
+    from pyramid.request import Request
+
+
+hop_by_hop = frozenset(
+    {
+        "connection",
+        "keep-alive",
+        "proxy-authenticate",
+        "proxy-authorization",
+        "te",
+        "trailers",
+        "transfer-encoding",
+        "upgrade",
+    }
+)
+
+
+def clean_request_headers(request: Request) -> EnvironHeaders:
+    result = EnvironHeaders({})
+    result.update(request.headers)
+    result.pop("host", None)
+    return result
+
+
+def clean_response_headers(response: Response) -> ResponseHeaders:
+    headers = ResponseHeaders()
+    # remove hop by hop headers, see:
+    # https://www.mnot.net/blog/2011/07/11/what_proxies_must_do
+    hop_keys = set(hop_by_hop)
+    connection = response.headers.get("connection")
+    if connection and connection.lower() != "close":
+        hop_keys.update(x.strip().lower() for x in connection.split(","))
+    for k, v in response.headers.items():
+        if k.lower() in hop_keys:
+            continue
+        headers[k] = v
+    return headers
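The effect of the new ``clean_response_headers`` helper can be shown with plain dicts, without webob or httpx objects (``clean_headers`` below is a simplified stand-in, not the real function):

```python
# Hop-by-hop headers are dropped, plus anything the Connection header
# itself names (unless that header is just "close").
HOP_BY_HOP = {"connection", "keep-alive", "proxy-authenticate",
              "proxy-authorization", "te", "trailers",
              "transfer-encoding", "upgrade"}

def clean_headers(headers: dict) -> dict:
    hop = set(HOP_BY_HOP)
    connection = headers.get("connection", "")
    if connection and connection.lower() != "close":
        hop.update(x.strip().lower() for x in connection.split(","))
    return {k: v for k, v in headers.items() if k.lower() not in hop}

print(clean_headers({
    "content-type": "text/html",
    "connection": "keep-alive, x-internal-trace",
    "x-internal-trace": "abc",
    "transfer-encoding": "chunked",
}))  # only content-type survives
```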
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server/replica.py 
new/devpi_server-6.20.0/devpi_server/replica.py
--- old/devpi_server-6.19.3/devpi_server/replica.py     2026-04-13 
17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/devpi_server/replica.py     2026-04-30 
11:01:49.000000000 +0200
@@ -19,6 +19,8 @@
 from .markers import absent
 from .model import UpstreamError
 from .normalized import normalize_name
+from .proxy import clean_request_headers
+from .proxy import clean_response_headers
 from .views import FileStreamer
 from .views import H_MASTER_UUID
 from .views import H_PRIMARY_UUID
@@ -33,8 +35,6 @@
 from pyramid.view import view_config
 from repoze.lru import LRUCache
 from typing import TYPE_CHECKING
-from webob.headers import EnvironHeaders
-from webob.headers import ResponseHeaders
 import contextlib
 import io
 import itsdangerous
@@ -330,7 +330,8 @@
         start_serial = int(self.request.matchdict["serial"])
 
         keyfs = self.xom.keyfs
-        self._wait_for_serial(start_serial)
+        with self.update_replica_status(start_serial):
+            self._wait_for_serial(start_serial)
         devpi_serial = keyfs.tx.conn.last_changelog_serial
         threadlog.info("Streaming from %s to %s", start_serial, devpi_serial)
 
@@ -1441,40 +1442,6 @@
                     cache_projectnames.add(project)
 
 
-hop_by_hop = frozenset((
-    'connection',
-    'keep-alive',
-    'proxy-authenticate',
-    'proxy-authorization',
-    'te',
-    'trailers',
-    'transfer-encoding',
-    'upgrade'
-))
-
-
-def clean_request_headers(request):
-    result = EnvironHeaders({})
-    result.update(request.headers)
-    result.pop('host', None)
-    return result
-
-
-def clean_response_headers(response):
-    headers = ResponseHeaders()
-    # remove hop by hop headers, see:
-    # https://www.mnot.net/blog/2011/07/11/what_proxies_must_do
-    hop_keys = set(hop_by_hop)
-    connection = response.headers.get('connection')
-    if connection and connection.lower() != 'close':
-        hop_keys.update(x.strip().lower() for x in connection.split(','))
-    for k, v in response.headers.items():
-        if k.lower() in hop_keys:
-            continue
-        headers[k] = v
-    return headers
-
-
 class BodyFileWrapper:
     # required to provide length to prevent transfer-encoding: chunked
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server/views.py 
new/devpi_server-6.20.0/devpi_server/views.py
--- old/devpi_server-6.19.3/devpi_server/views.py       2026-04-13 
17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/devpi_server/views.py       2026-04-30 
11:01:49.000000000 +0200
@@ -20,6 +20,7 @@
 from .model import ReadonlyIndex
 from .model import RemoveValue
 from .normalized import normalize_name
+from .proxy import clean_response_headers
 from .readonly import get_mutable_deepcopy
 from collections import defaultdict
 from devpi_common.metadata import get_pyversion_filetype
@@ -56,6 +57,7 @@
 from typing import TYPE_CHECKING
 from typing import cast
 from urllib.parse import urlparse
+from zipfile import ZipFile
 import attrs
 import contextlib
 import devpi_server
@@ -626,6 +628,9 @@
         (orig_request.matchdict, orig_request.matched_route) = (
             info['match'], info['route'])
         if orig_request.matched_route is None:
+            threadlog.debug(
+                "Authcheck Forbidden for %s (%s)", url, request.matched_route.name
+            )
             return HTTPForbidden()
         root_factory = orig_request.matched_route.factory or root_factory
         orig_request.context = root_factory(orig_request)
@@ -782,24 +787,29 @@
                    "<strong>%s</strong> are included.</p>"
                    % blocked_index).encode('utf-8')
 
+        core_metadata = self.xom.config.args.enable_core_metadata
         make_url = self._makeurl_factory()
 
         for link in result:
             stage = "/".join(link.href.split("/", 2)[:2])
-            attribs = 'href="%s"' % make_url(link.href).url
+            attribs = [f'href="{make_url(link.href).url}"']
+            if core_metadata and link.core_metadata:
+                attribs.append(
+                    'data-dist-info-metadata="true" data-core-metadata="true"'
+                )
             if link.require_python is not None:
-                attribs += ' data-requires-python="%s"' % escape(link.require_python)
+                attribs.append(f'data-requires-python="{escape(link.require_python)}"')
             if link.yanked is not None and link.yanked is not False:
                 yanked = "" if link.yanked is True else link.yanked
-                attribs += ' data-yanked="%s"' % escape(yanked)
-            data = dict(stage=stage, attribs=attribs, key=link.key)
-            yield "{stage} <a {attribs}>{key}</a><br>\n".format(**data).encode("utf-8")
+                attribs.append(f'data-yanked="{escape(yanked)}"')
+            yield f"{stage} <a {' '.join(attribs)}>{link.key}</a><br>\n".encode()
 
-        yield "</body></html>".encode("utf-8")
+        yield b"</body></html>"
 
     def _simple_list_project_json_v1(self, stage, project, result, embed_form, blocked_index):
         yield (f'{{"meta":{{"api-version":"1.0"}},"name":"{project}","files":[').encode("utf-8")
 
+        core_metadata = self.xom.config.args.enable_core_metadata
         make_url = self._makeurl_factory()
 
         first = True
@@ -809,6 +819,8 @@
                 "filename": link.key,
                 "url": url.url_nofrag,
                 "hashes": {url.hash_type: url.hash_value} if url.hash_type else {}}
+            if core_metadata and link.core_metadata:
+                data["core-metadata"] = True
             if link.require_python is not None:
                 data["requires-python"] = link.require_python
             if link.yanked is not None and link.yanked is not False:
@@ -1549,21 +1561,38 @@
             relpath = relpath.split("#", 1)[0]
         return relpath
 
-    def _pkgserv(self, entry):
+    def _pkgserv(self, entry, *, is_metadata=False):  # noqa: PLR0911,PLR0912
         request = self.request
         if not entry.meta:
             abort(request, 410, "file existed, deleted in later serial")
 
-        if json_preferred(request):
+        if json_preferred(request) and not is_metadata:
             entry_data = get_mutable_deepcopy(entry.meta)
             apireturn(200, type="releasefilemeta", result=entry_data)
 
         # getting the stage from context will cause 404 if stage is deleted
         stage = self.context.stage
 
+        if not request.has_permission("pkg_read"):
+            return apireturn(403, "package read forbidden")
+
         url = URL(entry.url)
 
         file_exists = entry.file_exists()
+
+        if is_metadata and stage.ixconfig["type"] == "mirror" and not file_exists:
+            if not stage.provides_core_metadata:
+                return apireturn(404, "mirror_provides_core_metadata disabled")
+            url = url.replace(path=f"{url.path}.metadata")
+            try:
+                app_iter = iter_stream_remote_file(stage, url)
+                headers = next(app_iter)
+            except BadGateway as e:
+                if e.code == 404:
+                    return apireturn(404, e.args[0])
+                return apireturn(502, e.args[0])
+            return Response(app_iter=app_iter, headers=headers)
+
         if entry.last_modified is None or not file_exists:
             # We check whether we should serve the file directly
             # or redirect to the external URL
@@ -1575,6 +1604,12 @@
                 # we do it in _pkgserv now to avoid storing the credentials
                 # in the database and avoid changes in the db when mirror_url changes.
                 mirror_url_auth = getattr(stage, "mirror_url_auth", {})
+                if mirror_url_auth:
+                    threadlog.debug(
+                        "Returning external URL with authentication: %s", url
+                    )
+                else:
+                    threadlog.debug("Returning external URL: %s", url)
                 url = url.replace(**mirror_url_auth)
                 return HTTPFound(location=url.url)
             if stage.ixconfig['type'] != "mirror" and not file_exists and not self.xom.is_replica():
@@ -1582,9 +1617,6 @@
                 # replica mode, otherwise fall through to fetch file
                 abort(self.request, 404, "no such file")
 
-        if not request.has_permission("pkg_read"):
-            abort(request, 403, "package read forbidden")
-
         try:
             if should_fetch_remote_file(entry, request.headers):
                 app_iter = iter_fetch_remote_file(stage, entry, url)
@@ -1595,6 +1627,21 @@
                 return apireturn(404, e.args[0])
             return apireturn(502, e.args[0])
 
+        if is_metadata:
+            metadata_filename = (
+                f"{entry.project.replace('-', '_')}-{entry.version}.dist-info/METADATA"
+            )
+            with entry.file_open_read() as f, ZipFile(f) as zf:
+                wheel_metadata_contents = zf.read(metadata_filename)
+            return Response(
+                body=wheel_metadata_contents,
+                content_type="application/octet-stream",
+                headers={
+                    "cache-control": "max-age=365000000, immutable, public",
+                    "last-modified": str(entry.last_modified),
+                },
+            )
+
         headers = entry.gethttpheaders()
         if self.request.method == "HEAD":
             return Response(headers=headers)
@@ -1606,6 +1653,11 @@
     @view_config(route_name="/{user}/{index}/+e/{relpath:.*}")
     def mirror_pkgserv(self):
         relpath = self._relpath_from_request()
+        is_metadata = relpath.endswith(".metadata")
+        if is_metadata:
+            if not self.xom.config.args.enable_core_metadata:
+                return apireturn(404, "core-metadata is disabled")
+            relpath = relpath.removesuffix(".metadata")
         # when a release is deleted from a mirror, we update the metadata,
         # hence the key won't exist anymore, but we don't delete the file.
         # We want people to notice that condition by returning a 404, but
@@ -1615,15 +1667,20 @@
         if key is None or not key.exists():
             abort(self.request, 404, "no such file")
         entry = self.xom.filestore.get_file_entry_from_key(key)
-        return self._pkgserv(entry)
+        return self._pkgserv(entry, is_metadata=is_metadata)
 
     @view_config(route_name="/{user}/{index}/+f/{relpath:.*}")
     def stage_pkgserv(self):
         relpath = self._relpath_from_request()
+        is_metadata = relpath.endswith(".metadata")
+        if is_metadata:
+            if not self.xom.config.args.enable_core_metadata:
+                return apireturn(404, "core-metadata is disabled")
+            relpath = relpath.removesuffix(".metadata")
         entry = self.xom.filestore.get_file_entry(relpath)
         if entry is None:
             abort(self.request, 404, "no such file")
-        return self._pkgserv(entry)
+        return self._pkgserv(entry, is_metadata=is_metadata)
 
     @view_config(route_name="/{user}/{index}/+e/{relpath:.*}",
                  permission="del_entry",
@@ -1844,6 +1901,22 @@
                 raise err
 
 
+def iter_stream_remote_file(stage, url):
+    with contextlib.ExitStack() as cstack:
+        r = stage.http.stream(cstack, "GET", url, allow_redirects=True)
+        if r.status_code != 200:
+            r.close()
+            msg = f"error {r.status_code} getting {url}"
+            threadlog.error(msg)
+            raise BadGateway(msg, code=r.status_code, url=url)
+        try:
+            yield clean_response_headers(r)
+            yield from r.iter_raw(10240)
+        except Exception as err:
+            threadlog.error(str(err))
+            raise
+
+
 def iter_cache_remote_file(stage, entry, url):
     # we get and cache the file and some http headers from remote
     xom = stage.xom
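The new `iter_stream_remote_file` helper above uses a headers-first generator protocol: its first `yield` delivers the cleaned response headers (consumed with `headers = next(app_iter)` in `_pkgserv`), and every later yield is a raw body chunk passed through to the WSGI response. A minimal self-contained sketch of that producer/consumer contract (names are hypothetical, not taken from devpi-server):

```python
def stream_with_headers(headers, chunks):
    # First yield: the headers mapping for the response.
    yield headers
    # Remaining yields: raw body chunks, streamed lazily.
    yield from chunks

# Consumer side, mirroring the view code: pull the headers off the
# iterator, then hand the rest of it on as the response body.
app_iter = stream_with_headers({"content-type": "text/plain"}, [b"hel", b"lo"])
headers = next(app_iter)
body = b"".join(app_iter)
```

Keeping headers and body in one generator lets the view open the upstream connection once, fail early (raising `BadGateway` before anything is yielded) on a non-200 status, and still stream the body without buffering it.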
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server.egg-info/PKG-INFO new/devpi_server-6.20.0/devpi_server.egg-info/PKG-INFO
--- old/devpi_server-6.19.3/devpi_server.egg-info/PKG-INFO      2026-04-13 17:20:53.000000000 +0200
+++ new/devpi_server-6.20.0/devpi_server.egg-info/PKG-INFO      2026-04-30 11:02:47.000000000 +0200
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: devpi-server
-Version: 6.19.3
+Version: 6.20.0
 Summary: devpi-server: backend for hosting private package indexes and PyPI on-demand mirrors
 Maintainer-email: Florian Schulze <[email protected]>
 License-Expression: MIT
@@ -121,6 +121,20 @@
 
 .. towncrier release notes start
 
+6.20.0 (2026-04-30)
+===================
+
+Features
+--------
+
+- Add experimental bare bones core-metadata ([PEP 658](https://peps.python.org/pep-0658/), [PEP 714](https://peps.python.org/pep-0714/)) support with ``--enable-core-metadata`` command line option and ``mirror_provides_core_metadata`` mirror index option. Refs #1018
+
+Bug Fixes
+---------
+
+- Update replica status when the replica is waiting for new serials using the streaming changelog endpoint.
+
+
 6.19.3 (2026-04-13)
 ===================
 
@@ -193,34 +207,3 @@
 
 - Fix #1110: a list for the ``listen`` option in a config file stopped working in 6.18.0.
 
-
-6.18.0 (2026-01-27)
-===================
-
-Features
---------
-
-- Store all available hashes of files.
-
-- Validate hashes of all files during devpi-import, not only releases.
-
-Bug Fixes
----------
-
-- Apply argparse transformations on values read from config file or environment.
-
-- Restore Python and platform info in user agent string after switch to httpx.
-
-- Remove all database entries on project deletion instead of only emptying them.
-
-- Fix error at end of replica streaming caused by changed behavior from switch to httpx.
-
-- Fix #1102: The data stream was cut off after 64k when proxying from replica to primary after switching to httpx.
-
-- Fix #1107: retry file downloads if there has been an error during download.
-
-Other Changes
--------------
-
-- The filenames of some exported doczip files change due to normalization of the project name caused by changing the internals during export to allow ``--hard-links`` to work.
-
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.3/devpi_server.egg-info/SOURCES.txt new/devpi_server-6.20.0/devpi_server.egg-info/SOURCES.txt
--- old/devpi_server-6.19.3/devpi_server.egg-info/SOURCES.txt   2026-04-13 17:20:53.000000000 +0200
+++ new/devpi_server-6.20.0/devpi_server.egg-info/SOURCES.txt   2026-04-30 11:02:47.000000000 +0200
@@ -42,6 +42,7 @@
 devpi_server/mythread.py
 devpi_server/normalized.py
 devpi_server/passwd.py
+devpi_server/proxy.py
 devpi_server/py.typed
 devpi_server/readonly.py
 devpi_server/replica.py
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.3/test_devpi_server/plugin.py new/devpi_server-6.20.0/test_devpi_server/plugin.py
--- old/devpi_server-6.19.3/test_devpi_server/plugin.py 2026-04-13 17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/test_devpi_server/plugin.py 2026-04-30 11:01:49.000000000 +0200
@@ -1499,16 +1499,26 @@
     port = get_open_port(host)
     args = [
         "devpi-server",
-        "--role", "primary",
-        "--secretfile", secretfile,
-        "--argon2-memory-cost", str(LOWER_ARGON2_MEMORY_COST),
-        "--argon2-parallelism", str(LOWER_ARGON2_PARALLELISM),
-        "--argon2-time-cost", str(LOWER_ARGON2_TIME_COST),
-        "--host", host,
-        "--port", str(port),
+        "--debug",
+        "--role",
+        "primary",
+        "--secretfile",
+        secretfile,
+        "--argon2-memory-cost",
+        str(LOWER_ARGON2_MEMORY_COST),
+        "--argon2-parallelism",
+        str(LOWER_ARGON2_PARALLELISM),
+        "--argon2-time-cost",
+        str(LOWER_ARGON2_TIME_COST),
+        "--host",
+        host,
+        "--port",
+        str(port),
         "--requests-only",
-        "--serverdir", str(primary_server_path),
-        *storage_args(primary_server_path)]
+        "--serverdir",
+        str(primary_server_path),
+        *storage_args(primary_server_path),
+    ]
     if not primary_server_path.joinpath('.nodeinfo').exists():
         subprocess.check_call(
             [  # noqa: S607
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.3/test_devpi_server/test_keyfs.py new/devpi_server-6.20.0/test_devpi_server/test_keyfs.py
--- old/devpi_server-6.19.3/test_devpi_server/test_keyfs.py     2026-04-13 17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/test_devpi_server/test_keyfs.py     2026-04-30 11:01:49.000000000 +0200
@@ -1045,3 +1045,20 @@
         (relpath_info,) = list(tx.iter_relpaths_at([key], tx.at_serial))
     assert relpath_info.keyname == "NAME1"
     assert relpath_info.value == 1
+
+
+def test_finalize_init(mapp):
+    api1 = mapp.create_and_use()
+    mapp.xom.keyfs.finalize_init()
+    content1 = mapp.makepkg("hello-1.0.tar.gz", b"content1", "hello", "1.0")
+    mapp.upload_file_pypi("hello-1.0.tar.gz", content1, "hello", "1.0")
+    mapp.xom.keyfs.finalize_init()
+    api2 = mapp.create_index("prod")
+    mapp.push("hello", "1.0", api2.stagename, api1.stagename)
+    mapp.xom.keyfs.finalize_init()
+    mapp.use(api1.stagename)
+    mapp.delete_project("hello")
+    mapp.xom.keyfs.finalize_init()
+    mapp.use(api2.stagename)
+    mapp.delete_project("hello")
+    mapp.xom.keyfs.finalize_init()
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.3/test_devpi_server/test_mirror.py new/devpi_server-6.20.0/test_devpi_server/test_mirror.py
--- old/devpi_server-6.19.3/test_devpi_server/test_mirror.py    2026-04-13 17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/test_devpi_server/test_mirror.py    2026-04-30 11:01:49.000000000 +0200
@@ -791,6 +791,40 @@
             assert len(pypistage.get_releaselinks("foo")) == 1
             assert pypistage.list_versions("foo") == {"1.0"}
 
+    @pytest.mark.notransaction
+    def test_core_metadata(self, monkeypatch, pypistage, testapp):
+        # the command-line option alone isn't enough for mirrors
+        monkeypatch.setattr(pypistage.xom.config.args, "enable_core_metadata", True)
+        pypistage.mock_simple(
+            "foo", text='<a href="foo-1.0-py3-none-any.whl#sha256=1234"></a>'
+        )
+        pypistage.url2response[
+            "https://pypi.org/simple/foo/foo-1.0-py3-none-any.whl.metadata"
+        ] = dict(status_code=200, content=b"metadata")
+        r = testapp.xget(200, "/root/pypi/+simple/foo/")
+        assert "core-metadata" not in r.text
+        r = testapp.xget(
+            200,
+            "/root/pypi/+simple/foo/",
+            headers={"Accept": "application/vnd.pypi.simple.v1+json"},
+        )
+        (file_info,) = r.json["files"]
+        assert "core-metadata" not in file_info
+        r = testapp.xget(404, "/root/pypi/+f/123/4/foo-1.0-py3-none-any.whl.metadata")
+        with pypistage.xom.keyfs.write_transaction():
+            pypistage.modify(mirror_provides_core_metadata=True)
+        r = testapp.xget(200, "/root/pypi/+simple/foo/")
+        assert "core-metadata" in r.text
+        r = testapp.xget(
+            200,
+            "/root/pypi/+simple/foo/",
+            headers={"Accept": "application/vnd.pypi.simple.v1+json"},
+        )
+        (file_info,) = r.json["files"]
+        assert file_info["core-metadata"] is True
+        r = testapp.xget(200, "/root/pypi/+f/123/4/foo-1.0-py3-none-any.whl.metadata")
+        assert r.body == b"metadata"
+
 
 class TestMirrorStageprojects:
     @pytest.mark.asyncio
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.3/test_devpi_server/test_views.py new/devpi_server-6.20.0/test_devpi_server/test_views.py
--- old/devpi_server-6.19.3/test_devpi_server/test_views.py     2026-04-13 17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/test_devpi_server/test_views.py     2026-04-30 11:01:49.000000000 +0200
@@ -460,6 +460,41 @@
     assert 'yanked' not in item2
 
 
+def test_project_pep_691_core_metadata(mapp, monkeypatch, testapp):
+    from zipfile import ZipFile
+
+    api = mapp.create_and_use()
+    content_io = BytesIO()
+    with ZipFile(content_io, "w") as zf:
+        zf.writestr("pkg1-2.6.dist-info/METADATA", b"metadata")
+    content = content_io.getvalue()
+    mapp.upload_file_pypi("pkg1-2.6.whl", content, "pkg1", "2.6")
+    r = testapp.xget(
+        200,
+        f"/{api.stagename}/+simple/pkg1",
+        headers={"Accept": "application/vnd.pypi.simple.v1+json"},
+        follow=False,
+    )
+    assert r.headers["content-type"] == "application/vnd.pypi.simple.v1+json"
+    (file_info,) = r.json["files"]
+    assert "core-metadata" not in file_info
+    metadata_url = URL(api.simpleindex).joinpath(file_info["url"] + ".metadata")
+    testapp.xget(404, metadata_url)
+    monkeypatch.setattr(mapp.xom.config.args, "enable_core_metadata", True)
+    r = testapp.xget(
+        200,
+        f"/{api.stagename}/+simple/pkg1",
+        headers={"Accept": "application/vnd.pypi.simple.v1+json"},
+        follow=False,
+    )
+    assert r.headers["content-type"] == "application/vnd.pypi.simple.v1+json"
+    (file_info,) = r.json["files"]
+    assert file_info["core-metadata"] is True
+    metadata_url = URL(api.simpleindex).joinpath(file_info["url"] + ".metadata")
+    r = testapp.xget(200, metadata_url)
+    assert r.body == b"metadata"
+
+
 def test_projects_redirect(pypistage, testapp):
     # test non installer html requests which should go nowhere,
     # because devpi-server only returns json
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.3/tox.ini new/devpi_server-6.20.0/tox.ini
--- old/devpi_server-6.19.3/tox.ini     2026-04-13 17:20:02.000000000 +0200
+++ new/devpi_server-6.20.0/tox.ini     2026-04-30 11:01:49.000000000 +0200
@@ -100,4 +100,4 @@
     -W error:"master_serverdir fixture is deprecated":DeprecationWarning
 asyncio_default_fixture_loop_scope = function
 timeout = 60
-norecursedirs = .tox build
+collect_ignore = .tox build
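The tests in this diff exercise the new core-metadata support end to end: the index advertises a ``core-metadata`` key in its PEP 691 JSON response, and the metadata file is served at the file URL with ``.metadata`` appended. As a rough client-side sketch of that PEP 658/714 discovery step (function name and the relative-URL assumption are mine, not devpi-server's):

```python
def core_metadata_url(file_info, base_url):
    """Return the PEP 658 metadata URL for one PEP 691 file entry,
    or None if the index does not advertise core metadata for it.

    `file_info` is one element of the "files" list in an
    application/vnd.pypi.simple.v1+json project page; `file_info["url"]`
    is assumed to be relative to the project page, as in the tests above.
    """
    # PEP 714 names the key "core-metadata"; a truthy value (True or a
    # dict of hashes) means <file-url>.metadata is served by the index.
    if not file_info.get("core-metadata"):
        return None
    return base_url.rstrip("/") + "/" + file_info["url"] + ".metadata"
```

This mirrors what the tests do with ``file_info["url"] + ".metadata"`` after checking the ``core-metadata`` flag.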
