Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-devpi-server for
openSUSE:Factory checked in at 2026-04-20 16:11:04
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-devpi-server (Old)
and /work/SRC/openSUSE:Factory/.python-devpi-server.new.11940 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-devpi-server"
Mon Apr 20 16:11:04 2026 rev:19 rq:1347777 version:6.19.3
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-devpi-server/python-devpi-server.changes 2026-04-07 16:49:18.013510983 +0200
+++ /work/SRC/openSUSE:Factory/.python-devpi-server.new.11940/python-devpi-server.changes 2026-04-20 16:11:10.684858384 +0200
@@ -1,0 +2,11 @@
+Fri Apr 17 17:49:21 UTC 2026 - Dirk Müller <[email protected]>
+
+- update to 6.19.3:
+  * Fix #1112: Parse simple JSON reply even with wrong content-
+    type in reply if the body seems to contain JSON.
+  * Return stale project list for mirrors when the lock can't be
+    acquired within the timeout.
+  * Fix importing of toxresults from devpi-server 6.5.0 to 6.9.0
+    where the wrong hash was stored.
+
+-------------------------------------------------------------------
Old:
----
devpi_server-6.19.2.tar.gz
New:
----
devpi_server-6.19.3.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-devpi-server.spec ++++++
--- /var/tmp/diff_new_pack.YH0tea/_old 2026-04-20 16:11:11.472890830 +0200
+++ /var/tmp/diff_new_pack.YH0tea/_new 2026-04-20 16:11:11.476890995 +0200
@@ -26,7 +26,7 @@
%{?sle15_python_module_pythons}
Name: python-devpi-server
-Version: 6.19.2
+Version: 6.19.3
Release: 0
Summary: Private PyPI caching server
License: MIT
++++++ devpi_server-6.19.2.tar.gz -> devpi_server-6.19.3.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/.flake8 new/devpi_server-6.19.3/.flake8
--- old/devpi_server-6.19.2/.flake8 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/.flake8 2026-04-13 17:20:02.000000000 +0200
@@ -4,5 +4,4 @@
per-file-ignores =
plugin.py:E127,E128,E231
model.py:E128,E231
- test_importexport.py:E121
test*.py:E126,E127,E128,E225,E226,E231,E251
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/CHANGELOG new/devpi_server-6.19.3/CHANGELOG
--- old/devpi_server-6.19.2/CHANGELOG 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/CHANGELOG 2026-04-13 17:20:02.000000000 +0200
@@ -2,6 +2,19 @@
.. towncrier release notes start
+6.19.3 (2026-04-13)
+===================
+
+Bug Fixes
+---------
+
+- Fix #1112: Parse simple JSON reply even with wrong content-type in reply if
+  the body seems to contain JSON.
+
+- Return stale project list for mirrors when the lock can't be acquired within
+  the timeout.
+
+- Fix importing of toxresults from devpi-server 6.5.0 to 6.9.0 where the wrong
+  hash was stored.
+
+
6.19.2 (2026-03-17)
===================
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/CHANGELOG.short.rst new/devpi_server-6.19.3/CHANGELOG.short.rst
--- old/devpi_server-6.19.2/CHANGELOG.short.rst 2026-03-17 16:02:16.000000000 +0100
+++ new/devpi_server-6.19.3/CHANGELOG.short.rst 2026-04-13 17:20:53.000000000 +0200
@@ -9,6 +9,19 @@
.. towncrier release notes start
+6.19.3 (2026-04-13)
+===================
+
+Bug Fixes
+---------
+
+- Fix #1112: Parse simple JSON reply even with wrong content-type in reply if
+  the body seems to contain JSON.
+
+- Return stale project list for mirrors when the lock can't be acquired within
+  the timeout.
+
+- Fix importing of toxresults from devpi-server 6.5.0 to 6.9.0 where the wrong
+  hash was stored.
+
+
6.19.2 (2026-03-17)
===================
@@ -99,51 +112,3 @@
- The filenames of some exported doczip files change due to normalization of
the project name caused by changing the internals during export to allow
``--hard-links`` to work.
-
-6.17.0 (2025-08-27)
-===================
-
-Deprecations and Removals
--------------------------
-
-- Dropped support for migrating old password hashes that were replaced in
-  devpi-server 4.2.0.
-
-- Removed support for basic authorization in primary URL. The connection is
-  already secured by a bearer token header.
-
-- Removed the experimental ``--replica-cert`` option. The replica is already
-  using a token via a shared secret, so this is redundant.
-
-- Removed ``--replica-max-retries`` option. It wasn't implemented for
-  async_httpget and didn't work correctly when streaming data.
-
-Features
---------
-
-- Use httpx for all data fetching for mirrors and fetch projects list
-  asynchronously to allow update in background even after a timeout.
-
-- Use httpx instead of requests when proxying from replicas to primary.
-
-- Use httpx for all requests from replicas to primary.
-
-- Use httpx when pushing releases to external index.
-
-- Added ``mirror_ignore_serial_header`` mirror index option, which allows
-  switching from PyPI to a mirror without serials header when set to ``True``,
-  otherwise only stale links will be served and no updates be stored.
-
-- The HTTP cache information for mirrored projects is persisted and re-used on
-  server restarts.
-
-- Added ``--file-replication-skip-indexes`` option to skip file replication
-  for ``all``, by index type (i.e. ``mirror``) or index name (i.e. ``root/pypi``).
-
-Bug Fixes
----------
-
-- Correctly handle lists for ``Provides-Extra`` and ``License-File`` metadata
-  in database.
-
-- Fix traceback by returning 401 error code when using wrong password with a
-  user that was created using an authentication plugin like devpi-ldap which
-  passes authentication through in that case.
-
-- Fix #1053: allow users to update their passwords when ``--restrict-modify``
-  is used.
-
-- Fix #1097: return 404 when trying to POST to +simple.
-
-Other Changes
--------------
-
-- Changed User-Agent when fetching data for mirrors from just "server" to
-  "devpi-server".
-
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/PKG-INFO new/devpi_server-6.19.3/PKG-INFO
--- old/devpi_server-6.19.2/PKG-INFO 2026-03-17 16:02:16.354810000 +0100
+++ new/devpi_server-6.19.3/PKG-INFO 2026-04-13 17:20:53.670446200 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 2.4
Name: devpi-server
-Version: 6.19.2
+Version: 6.19.3
Summary: devpi-server: backend for hosting private package indexes and PyPI on-demand mirrors
Maintainer-email: Florian Schulze <[email protected]>
License-Expression: MIT
@@ -121,6 +121,19 @@
.. towncrier release notes start
+6.19.3 (2026-04-13)
+===================
+
+Bug Fixes
+---------
+
+- Fix #1112: Parse simple JSON reply even with wrong content-type in reply if
+  the body seems to contain JSON.
+
+- Return stale project list for mirrors when the lock can't be acquired within
+  the timeout.
+
+- Fix importing of toxresults from devpi-server 6.5.0 to 6.9.0 where the wrong
+  hash was stored.
+
+
6.19.2 (2026-03-17)
===================
@@ -211,51 +224,3 @@
- The filenames of some exported doczip files change due to normalization of
the project name caused by changing the internals during export to allow
``--hard-links`` to work.
-
-6.17.0 (2025-08-27)
-===================
-
-Deprecations and Removals
--------------------------
-
-- Dropped support for migrating old password hashes that were replaced in
-  devpi-server 4.2.0.
-
-- Removed support for basic authorization in primary URL. The connection is
-  already secured by a bearer token header.
-
-- Removed the experimental ``--replica-cert`` option. The replica is already
-  using a token via a shared secret, so this is redundant.
-
-- Removed ``--replica-max-retries`` option. It wasn't implemented for
-  async_httpget and didn't work correctly when streaming data.
-
-Features
---------
-
-- Use httpx for all data fetching for mirrors and fetch projects list
-  asynchronously to allow update in background even after a timeout.
-
-- Use httpx instead of requests when proxying from replicas to primary.
-
-- Use httpx for all requests from replicas to primary.
-
-- Use httpx when pushing releases to external index.
-
-- Added ``mirror_ignore_serial_header`` mirror index option, which allows
-  switching from PyPI to a mirror without serials header when set to ``True``,
-  otherwise only stale links will be served and no updates be stored.
-
-- The HTTP cache information for mirrored projects is persisted and re-used on
-  server restarts.
-
-- Added ``--file-replication-skip-indexes`` option to skip file replication
-  for ``all``, by index type (i.e. ``mirror``) or index name (i.e. ``root/pypi``).
-
-Bug Fixes
----------
-
-- Correctly handle lists for ``Provides-Extra`` and ``License-File`` metadata
-  in database.
-
-- Fix traceback by returning 401 error code when using wrong password with a
-  user that was created using an authentication plugin like devpi-ldap which
-  passes authentication through in that case.
-
-- Fix #1053: allow users to update their passwords when ``--restrict-modify``
-  is used.
-
-- Fix #1097: return 404 when trying to POST to +simple.
-
-Other Changes
--------------
-
-- Changed User-Agent when fetching data for mirrors from just "server" to
-  "devpi-server".
-
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/devpi_server/__init__.py new/devpi_server-6.19.3/devpi_server/__init__.py
--- old/devpi_server-6.19.2/devpi_server/__init__.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/devpi_server/__init__.py 2026-04-13 17:20:02.000000000 +0200
@@ -1 +1 @@
-__version__ = "6.19.2"
+__version__ = "6.19.3"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/devpi_server/config.py new/devpi_server-6.19.3/devpi_server/config.py
--- old/devpi_server-6.19.2/devpi_server/config.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/devpi_server/config.py 2026-04-13 17:20:02.000000000 +0200
@@ -4,6 +4,7 @@
from . import hookspecs
from .interfaces import IIOFileFactory
from .log import threadlog
+from copy import deepcopy
from devpi_common.types import cached_property
from devpi_common.url import URL
from operator import itemgetter
@@ -776,16 +777,11 @@
_io_file_factory: Callable
db_filestore = storage_info.setdefault("db_filestore", True)
settings = storage_info.setdefault("settings", {})
- if db_filestore:
- from .filestore_db import DBIOFile
-
- _io_file_factory = DBIOFile
- else:
- fsbackend = settings.setdefault("fsbackend", "fs")
- _io_file_factory = __import__(
- f"filestore_{fsbackend}", globals=globals(), level=1
- ).fsiofile_factory
-
+    fsbackend = settings.setdefault("fsbackend", "db" if db_filestore else "fs")
+ _io_file_factory = __import__(
+ f"filestore_{fsbackend}", globals=globals(), level=1
+ ).fsiofile_factory
+ if not db_filestore:
storage_info.setdefault("_test_markers",
[]).append("storage_with_filesystem")
verifyObject(IIOFileFactory, _io_file_factory)
@@ -1147,7 +1143,7 @@
def _storage_info(self):
name = self.storage_info["name"]
settings = self.storage_info["settings"]
- return self._storage_info_from_name(name, settings)
+ return deepcopy(self._storage_info_from_name(name, settings))
@property
def io_file_factory(self) -> IIOFileFactory:
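The new `deepcopy` in `_storage_info` above keeps callers from mutating the shared nested `settings` dict. A small generic illustration of the aliasing hazard it guards against (not devpi-server code):

```python
from copy import deepcopy

# A shared config structure like the one _storage_info returns
storage_info = {"name": "sqlite", "settings": {"fsbackend": "fs"}}

# Returning the dict directly aliases the nested settings: a caller's
# mutation leaks back into the shared state.
aliased = storage_info
aliased["settings"]["fsbackend"] = "hash_hl"
assert storage_info["settings"]["fsbackend"] == "hash_hl"

# Returning a deepcopy keeps each caller's view isolated.
storage_info["settings"]["fsbackend"] = "fs"
isolated = deepcopy(storage_info)
isolated["settings"]["fsbackend"] = "hash_hl"
assert storage_info["settings"]["fsbackend"] == "fs"
```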
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/devpi_server/filestore_db.py new/devpi_server-6.19.3/devpi_server/filestore_db.py
--- old/devpi_server-6.19.2/devpi_server/filestore_db.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/devpi_server/filestore_db.py 2026-04-13 17:20:02.000000000 +0200
@@ -80,3 +80,8 @@
def _rollback(self) -> None:
pass
+
+
+@provider(IIOFileFactory)
+def fsiofile_factory(conn: Any, settings: dict) -> DBIOFile:
+ return DBIOFile(conn, settings)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/devpi_server/filestore_hash_hl.py new/devpi_server-6.19.3/devpi_server/filestore_hash_hl.py
--- old/devpi_server-6.19.2/devpi_server/filestore_hash_hl.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/devpi_server/filestore_hash_hl.py 2026-04-13 17:20:02.000000000 +0200
@@ -43,7 +43,7 @@
basedir: Path = field(kw_only=True)
file_path_info: FilePathInfo = field(kw_only=True)
path: Path = field(kw_only=True)
- digest_path: Path = field(kw_only=True)
+ digest_path: Path | None = field(kw_only=True)
def __repr__(self) -> str:
return f"<{self.__class__.__name__} {self.path}>"
@@ -70,6 +70,7 @@
with suppress(OSError):
self.path.unlink()
digest_path = self.digest_path
+ assert digest_path is not None
if digest_path.exists() and digest_path.stat().st_nlink == 1:
# if nothing else links to the digest file anymore, then remove it
digest_path.unlink()
@@ -87,10 +88,13 @@
def commit(self) -> list[str]:
assert tmpsuffix_for_path(self.path) is not None
digest_path = self.digest_path
+ assert digest_path is not None
if digest_path.exists():
# additional check besides the content digest
assert getsize(digest_path) == self.getsize()
# link to the existing digest file
+ if self.dst_path.exists():
+ raise RuntimeError(f"{self.dst_path} -> {digest_path}")
hardlink(digest_path, self.dst_path)
# drop the temporary file
self.drop()
@@ -112,8 +116,11 @@
@provider(IDirtyFileFactory, IFileFactory)
class HashHLFactory:
@classmethod
-    def _make_digest_path(cls, basedir: Path, file_path_info: FilePathInfo) -> Path:
- assert file_path_info.hash_digest is not None
+ def _make_digest_path(
+ cls, basedir: Path, file_path_info: FilePathInfo
+ ) -> Path | None:
+ if file_path_info.hash_digest is None:
+ return None
return basedir.joinpath("+h",
*split_digest(file_path_info.hash_digest))
@classmethod
@@ -184,6 +191,7 @@
path.unlink()
threadlog.warn("completed file-del from crashed tx: %s",
path)
digest_path = HashHLFactory._make_digest_path(basedir,
file_path_info)
+ assert digest_path is not None
with suppress(OSError):
digest_path.unlink()
threadlog.warn(
@@ -193,6 +201,7 @@
dst = HashHLFactory._make_path(basedir, relpath)
src = basedir / rel_rename
digest_path = HashHLFactory._make_digest_path(basedir,
file_path_info)
+ assert digest_path is not None
if digest_path.exists() and src.exists():
# additional check besides the content digest
assert getsize(digest_path) == getsize(src)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/devpi_server/importexport.py new/devpi_server-6.19.3/devpi_server/importexport.py
--- old/devpi_server-6.19.2/devpi_server/importexport.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/devpi_server/importexport.py 2026-04-13 17:20:02.000000000 +0200
@@ -9,12 +9,16 @@
from .main import init_default_indexes
from .main import set_state_version
from .main import xom_from_config
+from .markers import absent
from .model import Rel
from .normalized import normalize_name
from .readonly import ReadonlyView
from .readonly import get_mutable_deepcopy
+from attrs import define
+from attrs import field
from collections import defaultdict
from devpi_common.metadata import BasenameMeta
+from devpi_common.types import parse_hash_spec
from devpi_common.url import URL
from devpi_server import __version__ as server_version
from devpi_server.model import get_stage_customizer_classes
@@ -350,6 +354,81 @@
self.exporter.completed(f"{type}: {relpath} ")
+@define(kw_only=True)
+class Migrator:
+ dumpversion: int = field(converter=int)
+
+ @dumpversion.validator
+ def _validate_dumpversion(self, _attribute, value):
+ if value not in {1, 2}:
+ msg = f"incompatible dumpversion: {self.dumpversion}"
+ raise Fatal(msg)
+
+ def migrate(self, data: dict) -> dict:
+ data["indexes"] = {
+ k: self.migrate_index(v) for k, v in data.pop("indexes").items()
+ }
+        data["users"] = {k: self.migrate_user(v) for k, v in data.pop("users").items()}
+ return data
+
+ def migrate_file(self, data: dict) -> dict:
+ if self.dumpversion < 2:
+ # previous versions would not add a version attribute
+ data["version"] = BasenameMeta(Path(data["relpath"]).name).version
+ if "entrymapping" in data:
+ mapping = data["entrymapping"]
+ hashes = Digests(mapping.pop("hashes", {}))
+ # devpi-server-2.1 exported with md5 checksums
+ if "md5" in mapping:
+ hashes["md5"] = mapping.pop("md5")
+            # docs and toxresults didn't always have hashes stored in export dump
+ if "hash_spec" in mapping:
+ hashes.add_spec(mapping.pop("hash_spec"))
+ mapping["hashes"] = dict(hashes)
+ if "for_entrypath" in data:
+ self.migrate_toxresult(data)
+ return data
+
+ def migrate_index(self, data: dict) -> dict:
+ if "files" in data:
+ data["files"] = [self.migrate_file(v) for v in data.pop("files")]
+ indexconfig = data["indexconfig"]
+ if (
+ "uploadtrigger_jenkins" in indexconfig
+ and not indexconfig["uploadtrigger_jenkins"]
+ ):
+ # remove if not set, so if the trigger was never
+ # used, you don't need to install the plugin
+ del indexconfig["uploadtrigger_jenkins"]
+ if "pypi_whitelist" in indexconfig:
+ # this was renamed in 3.0.0
+ whitelist = indexconfig.pop("pypi_whitelist")
+ if "mirror_whitelist" not in indexconfig:
+ indexconfig["mirror_whitelist"] = whitelist
+ return data
+
+ def migrate_toxresult(self, data: dict) -> dict:
+ hash_type = None
+ hash_value = None
+ parts = Path(data["relpath"]).parts
+ if len(parts) == 5:
+ hash_spec = parse_hash_spec(parts[3])
+ if hash_spec[0] is not None:
+ hash_type = hash_spec[0]().name
+ hash_value = hash_spec[1]
+ if (entrymapping := data.get("entrymapping")) is not None:
+ hashes = entrymapping.get("hashes", {})
+ if hashes.get(hash_type, absent) == hash_value:
+                # from 6.5.0 until it was fixed in 6.9.0 the hash for toxresults
+                # was for the linked file, not for the contents, so we remove them
+ # here to prevent mismatch errors
+ hashes.clear()
+ return data
+
+ def migrate_user(self, data: dict) -> dict:
+ return data
+
+
class Importer:
import_indexes: dict[str, Any]
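The `migrate_toxresult` method added above clears stored hashes that match the digest embedded in the relpath. A simplified standalone version of that check — it substitutes `str.partition` plus `hashlib` for devpi's `parse_hash_spec`, so treat it as a sketch rather than the real implementation:

```python
import hashlib
from pathlib import PurePosixPath


def migrate_toxresult(data: dict) -> dict:
    """Clear stored hashes equal to the digest in the toxresult relpath.

    From devpi-server 6.5.0 until the fix in 6.9.0 the exported hash was
    computed for the linked release file instead of the toxresult contents,
    so a stored hash matching the one in the path is known to be wrong.
    """
    parts = PurePosixPath(data["relpath"]).parts
    if len(parts) == 5:
        # the 4th path segment carries a hash spec like "sha256=<digest>"
        hash_type, sep, hash_value = parts[3].partition("=")
        if sep and hash_type in hashlib.algorithms_available:
            hashes = data.get("entrymapping", {}).get("hashes", {})
            if hashes.get(hash_type) == hash_value:
                hashes.clear()
    return data
```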
@@ -440,11 +519,10 @@
def import_all(self, path: Path) -> None: # noqa: PLR0912
self.import_rootdir = path
json_path = path / "dataindex.json"
- self.import_data = self.read_json(json_path)
- self.dumpversion = self.import_data["dumpversion"]
- if self.dumpversion not in ("1", "2"):
- msg = f"incompatible dumpversion: {self.dumpversion!r}"
- raise Fatal(msg)
+ import_data = self.read_json(json_path)
+        self.import_data = Migrator(dumpversion=import_data["dumpversion"]).migrate(
+ import_data
+ )
self.import_users = self.import_data["users"]
self.import_indexes = self.import_data["indexes"]
self.display_import_header(path)
@@ -495,16 +573,6 @@
indexconfig = dict(import_index["indexconfig"])
if indexconfig['type'] in self.types_to_skip:
continue
- if 'uploadtrigger_jenkins' in indexconfig:
- if not indexconfig['uploadtrigger_jenkins']:
- # remove if not set, so if the trigger was never
- # used, you don't need to install the plugin
- del indexconfig['uploadtrigger_jenkins']
- if 'pypi_whitelist' in indexconfig:
- # this was renamed in 3.0.0
- whitelist = indexconfig.pop('pypi_whitelist')
- if 'mirror_whitelist' not in indexconfig:
- indexconfig['mirror_whitelist'] = whitelist
username, index = stagename.split("/")
user = self.xom.model.get_user(username)
assert user is not None
@@ -605,22 +673,12 @@
# docs and toxresults didn't always have entrymapping in export dump
mapping = filedesc.get("entrymapping", {})
hashes = Digests(mapping.get("hashes", {}))
- # devpi-server-2.1 exported with md5 checksums
- if "md5" in mapping:
- hashes["md5"] = mapping["md5"]
- # docs and toxresults didn't always have hashes stored in export dump
- if "hash_spec" in mapping:
- hashes.add_spec(mapping['hash_spec'])
# note that the actual hash_type used within devpi-server is not
# determined here but in
store_releasefile/store_doczip/store_toxresult etc
hashes.update(get_hashes(f,
hash_types=hashes.get_missing_hash_types()))
if filedesc["type"] == Rel.ReleaseFile:
- if self.dumpversion == "1":
- # previous versions would not add a version attribute
- version = BasenameMeta(p.name).version
- else:
- version = filedesc["version"]
+ version = filedesc["version"]
if hasattr(stage, 'store_releasefile'):
stage = cast("PrivateStage", stage)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/devpi_server/keyfs.py new/devpi_server-6.19.3/devpi_server/keyfs.py
--- old/devpi_server-6.19.2/devpi_server/keyfs.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/devpi_server/keyfs.py 2026-04-13 17:20:02.000000000 +0200
@@ -742,6 +742,9 @@
del self.model_cache[key]
super().delete_stage(username, index)
+    def get_index(self, user: str, index: str | None = None) -> BaseStage | None:
+ return self.getstage(user, index)
+
def get_user(self, name):
if name not in self.model_cache:
self.model_cache[name] = super().get_user(name)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/devpi_server/mirror.py new/devpi_server-6.19.3/devpi_server/mirror.py
--- old/devpi_server-6.19.2/devpi_server/mirror.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/devpi_server/mirror.py 2026-04-13 17:20:02.000000000 +0200
@@ -628,6 +628,9 @@
return self.xom.setdefault_singleton(
self.name, "project_retrieve_times", factory=ProjectUpdateCache)
+ def get_projects_timeout(self, timeout: float | None) -> float:
+ return self.projects_timeout if timeout is None else timeout
+
async def _get_remote_projects(
self, projects_future: asyncio.Future[ProjectsResult]
) -> None:
@@ -656,7 +659,9 @@
)
assert text is not None
parser: ProjectHTMLParser | ProjectJSONv1Parser
- if response.headers.get('content-type') == SIMPLE_API_V1_JSON:
+ if (
+ response.headers.get("content-type") == SIMPLE_API_V1_JSON
+ ) or text.startswith("{"):
parser = ProjectJSONv1Parser(response.url)
parser.feed(json.loads(text))
else:
@@ -675,7 +680,7 @@
def _update_projects(
self, timeout: float | None = None
) -> tuple[dict[NormalizedName, str], bool]:
-        projects_timeout = self.projects_timeout if timeout is None else timeout
+ projects_timeout = self.get_projects_timeout(timeout)
projects_future = cast(
"asyncio.Future[ProjectsResult]", self.xom.create_future()
)
@@ -740,12 +745,18 @@
# try without lock first
if not self.cache_projectnames.is_expired(self.cache_expiry):
return (self.cache_projectnames.get(), False)
- with self._list_projects_perstage_lock:
- # retry in case it was updated in another thread
- if not self.cache_projectnames.is_expired(self.cache_expiry):
- return (self.cache_projectnames.get(), False)
- # no fresh projects or None at all, let's go remote
- return self._update_projects(timeout=timeout)
+ lock = self._list_projects_perstage_lock
+ projects_timeout = self.get_projects_timeout(timeout)
+ if lock.acquire(timeout=projects_timeout):
+ try:
+ # retry in case it was updated in another thread
+ if not self.cache_projectnames.is_expired(self.cache_expiry):
+ return (self.cache_projectnames.get(), False)
+ # no fresh projects or None at all, let's go remote
+ return self._update_projects(timeout=timeout)
+ finally:
+ lock.release()
+ return (self._stale_list_projects_perstage(), True)
def list_projects_perstage(self) -> dict[str, NormalizedName | str]:
""" Return the project names. """
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/devpi_server/model.py new/devpi_server-6.19.3/devpi_server/model.py
--- old/devpi_server-6.19.2/devpi_server/model.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/devpi_server/model.py 2026-04-13 17:20:02.000000000 +0200
@@ -236,6 +236,9 @@
del indexes[index]
self.xom.del_singletons(f"{username}/{index}")
+    def get_index(self, user: str, index: str | None = None) -> BaseStage | None:
+ return self.getstage(user, index)
+
def get_user(self, name: str) -> User | None:
user = User(self, name)
if user.key.exists():
@@ -492,6 +495,9 @@
from .mirror import MirrorStage
return MirrorStage
+ def get_index(self, indexname: str) -> BaseStage | None:
+ return self.getstage(indexname)
+
def _getstage(self, indexname, index_type, ixconfig):
if index_type == "mirror":
cls = self.MirrorStage
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/devpi_server/replica.py new/devpi_server-6.19.3/devpi_server/replica.py
--- old/devpi_server-6.19.2/devpi_server/replica.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/devpi_server/replica.py 2026-04-13 17:20:02.000000000 +0200
@@ -303,7 +303,7 @@
with self.update_replica_status(start_serial):
keyfs = self.xom.keyfs
self._wait_for_serial(start_serial)
- devpi_serial = keyfs.get_current_serial()
+ devpi_serial = keyfs.tx.conn.last_changelog_serial
all_changes = []
raw_size = 0
start_time = time.time()
@@ -331,7 +331,7 @@
keyfs = self.xom.keyfs
self._wait_for_serial(start_serial)
- devpi_serial = keyfs.get_current_serial()
+ devpi_serial = keyfs.tx.conn.last_changelog_serial
threadlog.info("Streaming from %s to %s", start_serial, devpi_serial)
def iter_changelog_entries():
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/devpi_server.egg-info/PKG-INFO new/devpi_server-6.19.3/devpi_server.egg-info/PKG-INFO
--- old/devpi_server-6.19.2/devpi_server.egg-info/PKG-INFO 2026-03-17 16:02:16.000000000 +0100
+++ new/devpi_server-6.19.3/devpi_server.egg-info/PKG-INFO 2026-04-13 17:20:53.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 2.4
Name: devpi-server
-Version: 6.19.2
+Version: 6.19.3
Summary: devpi-server: backend for hosting private package indexes and PyPI on-demand mirrors
Maintainer-email: Florian Schulze <[email protected]>
License-Expression: MIT
@@ -121,6 +121,19 @@
.. towncrier release notes start
+6.19.3 (2026-04-13)
+===================
+
+Bug Fixes
+---------
+
+- Fix #1112: Parse simple JSON reply even with wrong content-type in reply if
+  the body seems to contain JSON.
+
+- Return stale project list for mirrors when the lock can't be acquired within
+  the timeout.
+
+- Fix importing of toxresults from devpi-server 6.5.0 to 6.9.0 where the wrong
+  hash was stored.
+
+
6.19.2 (2026-03-17)
===================
@@ -211,51 +224,3 @@
- The filenames of some exported doczip files change due to normalization of
the project name caused by changing the internals during export to allow
``--hard-links`` to work.
-
-6.17.0 (2025-08-27)
-===================
-
-Deprecations and Removals
--------------------------
-
-- Dropped support for migrating old password hashes that were replaced in
-  devpi-server 4.2.0.
-
-- Removed support for basic authorization in primary URL. The connection is
-  already secured by a bearer token header.
-
-- Removed the experimental ``--replica-cert`` option. The replica is already
-  using a token via a shared secret, so this is redundant.
-
-- Removed ``--replica-max-retries`` option. It wasn't implemented for
-  async_httpget and didn't work correctly when streaming data.
-
-Features
---------
-
-- Use httpx for all data fetching for mirrors and fetch projects list
-  asynchronously to allow update in background even after a timeout.
-
-- Use httpx instead of requests when proxying from replicas to primary.
-
-- Use httpx for all requests from replicas to primary.
-
-- Use httpx when pushing releases to external index.
-
-- Added ``mirror_ignore_serial_header`` mirror index option, which allows
-  switching from PyPI to a mirror without serials header when set to ``True``,
-  otherwise only stale links will be served and no updates be stored.
-
-- The HTTP cache information for mirrored projects is persisted and re-used on
-  server restarts.
-
-- Added ``--file-replication-skip-indexes`` option to skip file replication
-  for ``all``, by index type (i.e. ``mirror``) or index name (i.e. ``root/pypi``).
-
-Bug Fixes
----------
-
-- Correctly handle lists for ``Provides-Extra`` and ``License-File`` metadata
-  in database.
-
-- Fix traceback by returning 401 error code when using wrong password with a
-  user that was created using an authentication plugin like devpi-ldap which
-  passes authentication through in that case.
-
-- Fix #1053: allow users to update their passwords when ``--restrict-modify``
-  is used.
-
-- Fix #1097: return 404 when trying to POST to +simple.
-
-Other Changes
--------------
-
-- Changed User-Agent when fetching data for mirrors from just "server" to
-  "devpi-server".
-
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/devpi_server.egg-info/SOURCES.txt new/devpi_server-6.19.3/devpi_server.egg-info/SOURCES.txt
--- old/devpi_server-6.19.2/devpi_server.egg-info/SOURCES.txt 2026-03-17 16:02:16.000000000 +0100
+++ new/devpi_server-6.19.3/devpi_server.egg-info/SOURCES.txt 2026-04-13 17:20:53.000000000 +0200
@@ -112,10 +112,14 @@
test_devpi_server/importexportdata/badusername/dataindex.json
test_devpi_server/importexportdata/basescycle/dataindex.json
test_devpi_server/importexportdata/createdmodified/dataindex.json
+test_devpi_server/importexportdata/dashes_v1/dataindex.json
+test_devpi_server/importexportdata/dashes_v1/user1/dev/hello/hello-1.2_3.tar.gz
test_devpi_server/importexportdata/deletedbase/dataindex.json
test_devpi_server/importexportdata/mirrordata/dataindex.json
test_devpi_server/importexportdata/mirrordata/root/pypi/dddttt/0.1.dev1/dddttt-0.1.dev1.tar.gz
test_devpi_server/importexportdata/modifiedpypi/dataindex.json
+test_devpi_server/importexportdata/no_history_log/dataindex.json
+test_devpi_server/importexportdata/no_history_log/user1/dev/hello/hello-1.0.tar.gz
test_devpi_server/importexportdata/nocreatedmodified/dataindex.json
test_devpi_server/importexportdata/normalization/dataindex.json
test_devpi_server/importexportdata/normalization/root/dev/hello.pkg/hello.pkg-1.0.tar.gz
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/functional.py new/devpi_server-6.19.3/test_devpi_server/functional.py
--- old/devpi_server-6.19.2/test_devpi_server/functional.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/functional.py 2026-04-13 17:20:02.000000000 +0200
@@ -168,20 +168,6 @@
if m: # only server-side mapp returns messages
assert "not/exists" in m
- def test_pypi_index_attributes(self, mapp):
- mapp.login_root()
- data = mapp.getjson("/root/pypi?no_projects=")
- res = data["result"]
- res.pop("projects", None)
- assert sorted(res.keys()) == sorted([
- "type", "volatile", "title", "mirror_url", "mirror_web_url_fmt"])
- assert res["type"] == "mirror"
- assert res["volatile"] is False
- assert res["title"] == "PyPI"
- assert 'pypi' in res["mirror_url"]
- assert 'pypi' in res["mirror_web_url_fmt"]
- assert '{name}' in res["mirror_web_url_fmt"]
-
def test_create_index_base_empty(self, mapp):
indexconfig = dict(bases="")
mapp.login_root()
@@ -189,12 +175,6 @@
data = mapp.getjson("/root/empty")
assert not data["result"]["bases"]
- def test_create_index_base_normalized(self, mapp):
- indexconfig = dict(bases=("/root/pypi",))
- mapp.login_root()
- mapp.create_index("root/hello", indexconfig=indexconfig,
- code=200)
-
def test_create_index_base_invalid(self, mapp):
mapp.login_root()
indexconfig = dict(bases=("/root/dev/123",))
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/importexportdata/dashes_v1/dataindex.json new/devpi_server-6.19.3/test_devpi_server/importexportdata/dashes_v1/dataindex.json
--- old/devpi_server-6.19.2/test_devpi_server/importexportdata/dashes_v1/dataindex.json 1970-01-01 01:00:00.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/importexportdata/dashes_v1/dataindex.json 2026-04-13 17:20:02.000000000 +0200
@@ -0,0 +1,69 @@
+{
+ "dumpversion": "1",
+ "secret": "qREGpVy0mj2auDp/z/7JpQe/as9XJQl3GZGW75SSH9U=",
+ "pythonversion": [
+ 3,
+ 11,
+ 13,
+ "final",
+ 0
+ ],
+ "devpi_server": "1.2",
+ "indexes": {
+ "user1/dev": {
+ "projects": {
+ "hello": {
+ "1.2-3": {
+ "author": "",
+ "home_page": "",
+ "version": "1.2-3",
+ "keywords": "",
+ "name": "hello",
+ "classifiers": [],
+ "download_url": "",
+ "author_email": "",
+ "license": "",
+ "platform": [],
+ "summary": "",
+ "description": ""
+ }
+ }
+ },
+ "files": [
+ {
+ "entrymapping": {
+ "last_modified": "Fri, 04 Jul 2014 14:40:13 GMT",
+ "md5": "9a0364b9e99bb480dd25e1f0284c8555",
+ "size": "7"
+ },
+ "projectname": "hello",
+ "type": "releasefile",
+ "relpath": "user1/dev/hello/hello-1.2_3.tar.gz"
+ }
+ ],
+ "indexconfig": {
+ "uploadtrigger_jenkins": null,
+ "volatile": true,
+ "bases": [
+ "root/pypi"
+ ],
+ "acl_upload": [
+ "user1"
+ ],
+ "type": "stage"
+ }
+ }
+ },
+ "users": {
+ "root": {
+ "pwhash": "265ed9fb83bef361764838b7099e9627570016629db4e8e1b930817b1a4793af",
+ "username": "root",
+ "pwsalt": "A/4FsRp5oTkovbtTfhlx1g=="
+ },
+ "user1": {
+ "username": "user1",
+ "pwsalt": "RMAM7ycp8aqw4vytBOBEKA==",
+ "pwhash": "d9f98f41f8cbdeb6a30a7b6c376d0ccdd76e862ad1fa508b79d4c2098cc9d69a"
+ }
+ }
+}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/importexportdata/dashes_v1/user1/dev/hello/hello-1.2_3.tar.gz new/devpi_server-6.19.3/test_devpi_server/importexportdata/dashes_v1/user1/dev/hello/hello-1.2_3.tar.gz
--- old/devpi_server-6.19.2/test_devpi_server/importexportdata/dashes_v1/user1/dev/hello/hello-1.2_3.tar.gz 1970-01-01 01:00:00.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/importexportdata/dashes_v1/user1/dev/hello/hello-1.2_3.tar.gz 2026-04-13 17:20:02.000000000 +0200
@@ -0,0 +1 @@
+content
\ No newline at end of file
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/importexportdata/no_history_log/dataindex.json new/devpi_server-6.19.3/test_devpi_server/importexportdata/no_history_log/dataindex.json
--- old/devpi_server-6.19.2/test_devpi_server/importexportdata/no_history_log/dataindex.json 1970-01-01 01:00:00.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/importexportdata/no_history_log/dataindex.json 2026-04-13 17:20:02.000000000 +0200
@@ -0,0 +1,83 @@
+{
+ "users": {
+ "root": {
+ "username": "root",
+ "pwsalt": "ACs/Jhs5Tt7jKCV4xAjFzQ==",
+ "pwhash": "55d0627f48422ba020337d40fbabaa684be46c47a4e53f306121fd216d9bbbaf"
+ },
+ "user1": {
+ "username": "user1",
+ "email": "[email protected]",
+ "pwsalt": "NYDXeETIJmAxQhMBgg3oWw==",
+ "pwhash": "fce28cd56a2c6028a54133007fea8afe6ed8f3657722b213fcb19ef339b8efc6"
+ }
+ },
+ "devpi_server": "2.0.6",
+ "pythonversion": [
+ 2,
+ 7,
+ 6,
+ "final",
+ 0
+ ],
+ "secret": "xtOAH1d8ZPhWNTMmWUdZrp9pa0urEq4Qvc7itn5SCWE=",
+ "dumpversion": "2",
+ "indexes": {
+ "user1/dev": {
+ "files": [
+ {
+ "projectname": "hello",
+ "version": "1.0",
+ "entrymapping": {
+ "projectname": "hello",
+ "version": "1.0",
+ "last_modified": "Fri, 12 Sep 2014 13:18:55 GMT",
+ "md5": "9a0364b9e99bb480dd25e1f0284c8555"
+ },
+ "type": "releasefile",
+ "relpath": "user1/dev/hello/hello-1.0.tar.gz"
+ },
+ {
+ "projectname": "hello",
+ "version": "1.0",
+ "type": "toxresult",
+ "for_entrypath": "user1/dev/+f/9a0/364b9e99bb480/hello-1.0.tar.gz",
+ "relpath": "user1/dev/hello/9a0364b9e99bb480dd25e1f0284c8555/hello-1.0.tar.gz.toxresult0"
+ }
+ ],
+ "indexconfig": {
+ "bases": [
+ "root/pypi"
+ ],
+ "pypi_whitelist": [
+ "hello"
+ ],
+ "acl_upload": [
+ "user1"
+ ],
+ "uploadtrigger_jenkins": null,
+ "volatile": true,
+ "type": "stage"
+ },
+ "projects": {
+ "hello": {
+ "1.0": {
+ "description": "",
+ "license": "",
+ "author": "",
+ "download_url": "",
+ "summary": "",
+ "author_email": "",
+ "version": "1.0",
+ "platform": [],
+ "home_page": "",
+ "keywords": "",
+ "classifiers": [],
+ "name": "hello"
+ }
+ }
+ }
+ }
+ },
+ "uuid": "72f86a504b14446e98ba840d0f4609ec"
+}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/importexportdata/no_history_log/user1/dev/hello/hello-1.0.tar.gz new/devpi_server-6.19.3/test_devpi_server/importexportdata/no_history_log/user1/dev/hello/hello-1.0.tar.gz
--- old/devpi_server-6.19.2/test_devpi_server/importexportdata/no_history_log/user1/dev/hello/hello-1.0.tar.gz 1970-01-01 01:00:00.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/importexportdata/no_history_log/user1/dev/hello/hello-1.0.tar.gz 2026-04-13 17:20:02.000000000 +0200
@@ -0,0 +1 @@
+content
\ No newline at end of file
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/plugin.py new/devpi_server-6.19.3/test_devpi_server/plugin.py
--- old/devpi_server-6.19.2/test_devpi_server/plugin.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/plugin.py 2026-04-13 17:20:02.000000000 +0200
@@ -277,25 +277,34 @@
@pytest.fixture(scope="session")
def storage_info(request, storage_plugin):
+ settings = {}
fsbackend = getattr(request.config.option,
"devpi_server_storage_fs_backend", None)
- if fsbackend is None:
- fsbackend = "fs"
- return storage_plugin.devpiserver_storage_backend(
- settings=dict(fsbackend=fsbackend)
- )
+ if fsbackend is not None:
+ settings["fsbackend"] = fsbackend
+ return storage_plugin.devpiserver_storage_backend(settings=settings)
@pytest.fixture(scope="session")
def storage_args(storage_info):
def storage_args(basedir):
args = []
- if storage_info["name"] != "sqlite":
+ if storage_info["name"] != "sqlite" or storage_info["settings"]:
storage_option = "--storage=%s" % storage_info["name"]
_get_test_storage_options = getattr(
storage_info["storage"], "_get_test_storage_options", None)
+ got_options = False
if _get_test_storage_options:
storage_options = _get_test_storage_options(str(basedir))
+ got_options = storage_options
storage_option = storage_option + storage_options
+ settings = storage_info.get("settings", {})
+ if settings:
+ storage_options = ",".join(f"{k}={v}" for k, v in settings.items())
+ storage_option = (
+ f"{storage_option},{storage_options}"
+ if got_options
+ else f"{storage_option}:{storage_options}"
+ )
args.append(storage_option)
return args
return storage_args
@@ -317,7 +326,7 @@
def makexom(
request, gen_path, http, httpget, monkeypatch, storage_args, storage_plugin
):
- def makexom(opts=(), http=http, httpget=httpget, plugins=()): # noqa: PLR0912
+ def makexom(opts=(), http=http, httpget=httpget, plugins=()):
from devpi_server import auth_basic
from devpi_server import auth_devpi
from devpi_server import model
@@ -328,23 +337,34 @@
plugins = [
plugin[0] if isinstance(plugin, tuple) else plugin
for plugin in plugins]
- default_plugins = [
- auth_basic, auth_devpi, mirror, model, replica, view_auth, views,
- storage_plugin]
- for plugin in default_plugins:
- if plugin not in plugins:
- plugins.append(plugin)
+ plugins.extend(
+ plugin
+ for plugin in (
+ auth_basic,
+ auth_devpi,
+ mirror,
+ model,
+ replica,
+ view_auth,
+ views,
+ storage_plugin,
+ )
+ if plugin not in plugins
+ )
pm = get_pluginmanager(load_entrypoints=False)
for plugin in plugins:
pm.register(plugin)
+ no_storage_option = request.node.get_closest_marker("no_storage_option")
serverdir = gen_path()
- if "--serverdir" in opts:
- fullopts = ["devpi-server"] + list(opts)
- else:
- fullopts = ["devpi-server", "--serverdir", serverdir] + list(opts)
- if not request.node.get_closest_marker("no_storage_option"):
- fullopts.extend(storage_args(serverdir))
- fullopts = [str(x) for x in fullopts]
+ fullopts = [
+ str(x)
+ for x in (
+ "devpi-server",
+ *([] if "--serverdir" in opts else ["--serverdir", serverdir]),
+ *opts,
+ *([] if no_storage_option else storage_args(serverdir)),
+ )
+ ]
config = parseoptions(pm, fullopts)
config.init_nodeinfo()
for marker in ("storage_with_filesystem",):
@@ -360,9 +380,20 @@
add_pypistage_mocks(monkeypatch, http, httpget)
request.addfinalizer(xom.thread_pool.kill)
request.addfinalizer(xom._close_sessions)
- if not request.node.get_closest_marker("no_storage_option"):
+ if not no_storage_option:
assert storage_plugin.__name__ in {
x.__module__ for x in xom.keyfs._storage.__class__.__mro__}
+ fsbackend = getattr(
+ request.config.option, "devpi_server_storage_fs_backend", None
+ )
+ if fsbackend is not None:
+ with xom.keyfs.get_connection() as conn:
+ assert (
+ xom.keyfs.io_file_factory(conn)
+ .__module__.split(".")[-1]
+ .removeprefix("filestore_")
+ == fsbackend
+ )
# verify storage interface
with xom.keyfs.get_connection() as conn:
verify_connection_interface(conn)
@@ -1479,10 +1510,15 @@
"--serverdir", str(primary_server_path),
*storage_args(primary_server_path)]
if not primary_server_path.joinpath('.nodeinfo').exists():
- subprocess.check_call([ # noqa: S607
- "devpi-init",
- "--serverdir", str(primary_server_path),
- *storage_args(primary_server_path)])
+ subprocess.check_call(
+ [ # noqa: S607
+ "devpi-init",
+ "--serverdir",
+ str(primary_server_path),
+ "--no-root-pypi",
+ *storage_args(primary_server_path),
+ ]
+ )
p = subprocess.Popen(args)
try:
wait_for_port(host, port)
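The storage_args change in plugin.py above appends plugin settings to the --storage option, using ":" to introduce the first key=value pair and "," between subsequent ones. A minimal standalone sketch of that joining logic (the helper name is hypothetical, not part of devpi-server):

```python
def build_storage_option(name, backend_options="", settings=None):
    """Assemble a --storage=NAME[:k=v[,k2=v2]] style option string.

    `backend_options` is whatever the backend itself already contributed
    (expected to start with ':' when non-empty); extra `settings` are
    appended with ',' when options already exist, otherwise with ':'.
    """
    option = f"--storage={name}{backend_options}"
    if settings:
        extra = ",".join(f"{k}={v}" for k, v in settings.items())
        sep = "," if backend_options else ":"
        option = f"{option}{sep}{extra}"
    return option
```

For example, a sqlite backend with an fsbackend setting would yield "--storage=sqlite:fsbackend=fs", while a backend that already produced ":host=x" would get ",fsbackend=fs" appended.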
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/test_importexport.py new/devpi_server-6.19.3/test_devpi_server/test_importexport.py
--- old/devpi_server-6.19.2/test_devpi_server/test_importexport.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/test_importexport.py 2026-04-13 17:20:02.000000000 +0200
@@ -1,3 +1,4 @@
+from contextlib import contextmanager
from devpi_common.archive import Archive
from devpi_common.archive import zip_dict
from devpi_common.metadata import Version
@@ -19,6 +20,7 @@
import os
import pytest
import re
+import shutil
import sys
@@ -224,6 +226,13 @@
set_state_version(self.mapp1.xom.config)
self.options = options
+ def copy_testdata(self, name):
+ files = importlib.resources.files("test_devpi_server")
+ with importlib.resources.as_file(
+ files / "importexportdata" / name
+ ) as path:
+ shutil.copytree(path, self.exportdir, dirs_exist_ok=True)
+
def export(self):
from devpi_server.importexport import export
argv = [
@@ -268,6 +277,19 @@
if plugin is not None:
mapp2.xom.config.pluginmanager.register(plugin)
return mapp2
+
+ @contextmanager
+ def update_dataindex_json(self):
+ path = self.exportdir.joinpath("dataindex.json")
+ if path.exists():
+ with path.open() as f:
+ data = json.load(f)
+ else:
+ data = {}
+ yield data
+ with path.open("w") as f:
+ json.dump(data, f)
+
return ImpExp
@pytest.fixture
@@ -281,12 +303,13 @@
hashes = get_hashes(content)
mapp1.upload_file_pypi("hello-1.0.tar.gz", content, "hello", "1.0")
impexp.export()
- data = json.loads(impexp.exportdir.joinpath('dataindex.json').read_bytes())
- (filedata,) = data['indexes'][api1.stagename]['files']
- assert filedata["entrymapping"].pop("hash_spec") == hashes.get_default_spec()
- filedata["entrymapping"].pop("hashes")
- filedata['entrymapping']['md5'] = 'foo'
- impexp.exportdir.joinpath('dataindex.json').write_text(json.dumps(data))
+ with impexp.update_dataindex_json() as data:
+ (filedata,) = data["indexes"][api1.stagename]["files"]
+ assert (
+ filedata["entrymapping"].pop("hash_spec") == hashes.get_default_spec()
+ )
+ filedata["entrymapping"].pop("hashes")
+ filedata["entrymapping"]["md5"] = "foo"
with pytest.raises(
Fatal,
match=re.escape(
@@ -400,6 +423,39 @@
indexlist = mapp2.getindexlist(api.user)
assert indexlist[api.stagename]["mirror_whitelist"] == ["*"]
+ def test_indexes_mirror_whitelist_project(self, impexp):
+ mapp1 = impexp.mapp1
+ api = mapp1.create_and_use()
+ mapp1.upload_file_pypi("hello-1.0.tar.gz", b"content1", "hello", "1.0")
+ mapp1.set_mirror_whitelist("hello")
+ impexp.export()
+ mapp2 = impexp.new_import()
+ assert api.user in mapp2.getuserlist()
+ indexlist = mapp2.getindexlist(api.user)
+ assert indexlist[api.stagename]["mirror_whitelist"] == ["hello"]
+
+ def test_indexes_pypi_whitelist(self, impexp):
+ mapp1 = impexp.mapp1
+ api1 = mapp1.create_and_use()
+ api2 = mapp1.create_and_use()
+ impexp.export()
+ with impexp.update_dataindex_json() as data:
+ for k in list(data["indexes"][api1.stagename]["indexconfig"]):
+ if k.startswith("mirror_"):
+ data["indexes"][api1.stagename]["indexconfig"].pop(k)
+ data["indexes"][api1.stagename]["indexconfig"]["pypi_whitelist"] = []
+ for k in list(data["indexes"][api2.stagename]["indexconfig"]):
+ if k.startswith("mirror_"):
+ data["indexes"][api2.stagename]["indexconfig"].pop(k)
+ data["indexes"][api2.stagename]["indexconfig"]["pypi_whitelist"] = ["*"]
+ mapp2 = impexp.new_import()
+ assert api1.user in mapp2.getuserlist()
+ indexlist1 = mapp2.getindexlist(api1.user)
+ assert indexlist1[api1.stagename]["mirror_whitelist"] == []
+ assert api2.user in mapp2.getuserlist()
+ indexlist2 = mapp2.getindexlist(api2.user)
+ assert indexlist2[api2.stagename]["mirror_whitelist"] == ["*"]
+
@pytest.mark.parametrize('acltype', ('upload', 'toxresult_upload'))
def test_indexes_acl(self, impexp, acltype):
mapp1 = impexp.mapp1
@@ -469,6 +525,19 @@
else:
assert stage.ixconfig == _pypi_ixconfig_default
+ def test_import_user_with_extra_data(self, impexp):
+ mapp1 = impexp.mapp1
+ api1 = mapp1.create_and_use()
+ impexp.export()
+ with impexp.update_dataindex_json() as data:
+ data["users"][api1.user]["gobbledygook"] = "foo"
+ mapp2 = impexp.new_import()
+ with mapp2.xom.keyfs.read_transaction():
+ user = mapp2.xom.model.get_user(api1.user)
+ # unknown keys are not shown by default
+ assert "gobbledygook" not in user.get()
+ assert user.get(credentials=True)["gobbledygook"] == "foo"
+
@pytest.mark.parametrize("norootpypi", [False, True])
def test_import_no_root_pypi(self, impexp, norootpypi):
from devpi_server.main import _pypi_ixconfig_default
@@ -661,60 +730,42 @@
# we expect the name from the imported data
assert toxresult.basename == "hello-0.9.tar.gz.toxresult0"
+ def test_import_with_wrong_toxresult_hash(self, impexp, tox_result_data):
+ mapp1 = impexp.mapp1
+ api = mapp1.create_and_use()
+ content = b"content"
+ mapp1.upload_file_pypi("hello-1.0.tar.gz", content, "hello", "1.0")
+ (path,) = mapp1.get_release_paths("hello")
+ path = path.strip("/")
+ toxresult_dump = json.dumps(tox_result_data)
+ toxresult_hashes = get_hashes(toxresult_dump.encode())
+ toxresult_hash = toxresult_hashes.get_default_value()
+ r = mapp1.upload_toxresult(f"/{path}", toxresult_dump)
+ toxresult_path = f"/{r.json['result']}"
+ impexp.export()
+ with impexp.update_dataindex_json() as data:
+ files = data["indexes"][api.stagename]["files"]
+ (releasefile,) = (x for x in files if x["type"] == "releasefile")
+ (toxresult,) = (x for x in files if x["type"] == "toxresult")
+ toxresult["entrymapping"]["hash_spec"] = releasefile["entrymapping"][
+ "hash_spec"
+ ]
+ toxresult["entrymapping"]["hashes"] = releasefile["entrymapping"]["hashes"]
+ mapp2 = impexp.new_import()
+ toxresult_link = mapp2.getjson(toxresult_path)["result"]
+ (_hash_algo, hash_value) = parse_hash_spec(toxresult_link["hash_spec"])
+ assert toxresult_link["hashes"] == toxresult_hashes
+ assert hash_value == toxresult_hash
+
def test_import_without_history_log(self, impexp, tox_result_data):
- DUMP_FILE = {
- "users": {
- "root": {
- "username": "root",
- "pwsalt": "ACs/Jhs5Tt7jKCV4xAjFzQ==",
- "pwhash": "55d0627f48422ba020337d40fbabaa684be46c47a4e53f306121fd216d9bbbaf"
- },
- "user1": {
- "username": "user1", "email": "[email protected]",
- "pwsalt": "NYDXeETIJmAxQhMBgg3oWw==",
- "pwhash": "fce28cd56a2c6028a54133007fea8afe6ed8f3657722b213fcb19ef339b8efc6"
- }
- },
- "devpi_server": "2.0.6", "pythonversion": [2, 7, 6, "final", 0],
- "secret": "xtOAH1d8ZPhWNTMmWUdZrp9pa0urEq4Qvc7itn5SCWE=",
- "dumpversion": "2",
- "indexes": {
- "user1/dev": {
- "files": [
- {
- "projectname": "hello", "version": "1.0",
- "entrymapping": {
- "projectname": "hello", "version": "1.0",
- "last_modified": "Fri, 12 Sep 2014 13:18:55 GMT",
- "md5": "9a0364b9e99bb480dd25e1f0284c8555"},
- "type": "releasefile", "relpath": "user1/dev/hello/hello-1.0.tar.gz"},
- {
- "projectname": "hello", "version": "1.0", "type": "toxresult",
- "for_entrypath": "user1/dev/+f/9a0/364b9e99bb480/hello-1.0.tar.gz",
- "relpath": "user1/dev/hello/9a0364b9e99bb480dd25e1f0284c8555/hello-1.0.tar.gz.toxresult0"}
- ],
- "indexconfig": {
- "bases": ["root/pypi"], "pypi_whitelist": ["hello"],
- "acl_upload": ["user1"], "uploadtrigger_jenkins": None,
- "volatile": True, "type": "stage"},
- "projects": {
- "hello": {
- "1.0": {
- "description": "", "license": "", "author": "", "download_url": "",
- "summary": "", "author_email": "", "version": "1.0", "platform": [],
- "home_page": "", "keywords": "", "classifiers": [], "name": "hello"}}}
- }
- },
- "uuid": "72f86a504b14446e98ba840d0f4609ec"
- }
- impexp.exportdir.joinpath('dataindex.json').write_text(json.dumps(DUMP_FILE))
-
- filedir = impexp.exportdir
- for dir in ['user1', 'dev', 'hello']:
- filedir = filedir / dir
- filedir.mkdir()
- filedir.joinpath('hello-1.0.tar.gz').write_text('content')
- filedir = filedir / '9a0364b9e99bb480dd25e1f0284c8555'
+ impexp.copy_testdata("no_history_log")
+ filedir = (
+ impexp.exportdir
+ / "user1"
+ / "dev"
+ / "hello"
+ / "9a0364b9e99bb480dd25e1f0284c8555"
+ )
filedir.mkdir()
filedir.joinpath('hello-1.0.tar.gz.toxresult0').write_text(json.dumps(tox_result_data))
@@ -823,85 +874,9 @@
In this case the Registration entry won't match the inferred
version
data for the file.
"""
- mapp1 = impexp.mapp1
- api = mapp1.create_and_use()
-
- # This is the raw json of the data that shows up this issue.
- DUMP_FILE = {
- "dumpversion": "1",
- "secret": "qREGpVy0mj2auDp/z/7JpQe/as9XJQl3GZGW75SSH9U=",
- "pythonversion": list(sys.version_info),
- "devpi_server": "1.2",
- "indexes": {
- "user1/dev": {
- "projects": {
- "hello": {
- "1.2-3": {
- "author": "",
- "home_page": "",
- "version": "1.2-3",
- "keywords": "",
- "name": "hello",
- "classifiers": [],
- "download_url": "",
- "author_email": "",
- "license": "",
- "platform": [],
- "summary": "",
- "description": "",
- },
- },
- },
- "files": [
- {
- "entrymapping": {
- "last_modified": "Fri, 04 Jul 2014 14:40:13 GMT",
- "md5": "9a0364b9e99bb480dd25e1f0284c8555",
- "size": "7"
- },
- "projectname": "hello",
- "type": "releasefile",
- "relpath": "user1/dev/hello/hello-1.2_3.tar.gz"
- },
- ],
- "indexconfig": {
- "uploadtrigger_jenkins": None,
- "volatile": True,
- "bases": [
- "root/pypi"
- ],
- "acl_upload": [
- "user1"
- ],
- "type": "stage"
- },
- },
- },
- "users": {
- "root": {
- "pwhash": "265ed9fb83bef361764838b7099e9627570016629db4e8e1b930817b1a4793af",
- "username": "root",
- "pwsalt": "A/4FsRp5oTkovbtTfhlx1g=="
- },
- "user1": {
- "username": "user1",
- "pwsalt": "RMAM7ycp8aqw4vytBOBEKA==",
- "pwhash": "d9f98f41f8cbdeb6a30a7b6c376d0ccdd76e862ad1fa508b79d4c2098cc9d69a"
- }
- }
- }
- impexp.exportdir.joinpath('dataindex.json').write_text(json.dumps(DUMP_FILE))
-
- filedir = impexp.exportdir
- for dir in ['user1', 'dev', 'hello']:
- filedir = filedir / dir
- filedir.mkdir()
- filedir.joinpath('hello-1.2_3.tar.gz').write_text('content')
-
- # Run the import and check the version data
- mapp2 = impexp.new_import()
+ mapp2 = impexp.import_testdata("dashes_v1")
with mapp2.xom.keyfs.read_transaction():
- stage = mapp2.xom.model.getstage(api.stagename)
+ stage = mapp2.xom.model.getstage("user1/dev")
verdata = stage.get_versiondata_perstage("hello", "1.2-3")
assert verdata["version"] == "1.2-3"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/test_mirror.py new/devpi_server-6.19.3/test_devpi_server/test_mirror.py
--- old/devpi_server-6.19.2/test_devpi_server/test_mirror.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/test_mirror.py 2026-04-13 17:20:02.000000000 +0200
@@ -821,11 +821,15 @@
"django": "Django"}
@pytest.mark.asyncio
- async def test_get_remote_projects_pep691_json(self, pypistage):
+ @pytest.mark.parametrize(
+ "content_type",
+ ["application/vnd.pypi.simple.v1+json", "application/octet-stream"],
+ )
+ async def test_get_remote_projects_pep691_json(self, content_type, pypistage):
pypistage.xom.http.mockresponse(
pypistage.mirror_url,
code=200,
- content_type="application/vnd.pypi.simple.v1+json",
+ content_type=content_type,
text="""{
"meta": {"api-version": "1.0"},
"projects": [
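The parametrized test above exercises the fix for #1112: a PEP 691 simple JSON reply is parsed even when the server sends a wrong content type, as long as the body appears to contain JSON. A minimal sketch of such lenient parsing (function name and fallback heuristic are illustrative assumptions, not devpi-server's actual implementation):

```python
import json


def parse_simple_reply(body: bytes, content_type: str):
    """Parse a PEP 691 simple index reply.

    Accepts the official content type, and falls back to sniffing the
    body when a server mislabels the reply (e.g. as
    application/octet-stream) but the payload still looks like JSON.
    """
    if content_type == "application/vnd.pypi.simple.v1+json":
        return json.loads(body)
    if body.lstrip().startswith(b"{"):  # body seems to contain JSON
        return json.loads(body)
    raise ValueError(f"unexpected content type: {content_type}")
```

Under this sketch, both parametrized content types from the test would yield the same parsed project list.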
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/test_model.py new/devpi_server-6.19.3/test_devpi_server/test_model.py
--- old/devpi_server-6.19.2/test_devpi_server/test_model.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/test_model.py 2026-04-13 17:20:02.000000000 +0200
@@ -232,6 +232,25 @@
assert not stage.has_project("someproject")
assert not stage.list_projects_perstage()
+ def test_inheritance_cycle(self, model, pypistage):
+ pypistage.mock_simple("someproject", "<a href='someproject-1.0.zip' /a>")
+ user = model.create_user("user", password="123")
+ index_a = user.create_stage(index="A", bases=[])
+ index_b = user.create_stage(index="B", bases=["user/A"])
+ index_a.modify(bases=["user/B", pypistage.name])
+ rootpypi = model.getstage(pypistage.name)
+ assert rootpypi.list_projects() == [(rootpypi, {"someproject": "someproject"})]
+ assert index_a.list_projects() == [
+ (index_a, set()),
+ (index_b, set()),
+ (rootpypi, {"someproject": "someproject"}),
+ ]
+ assert index_b.list_projects() == [
+ (index_b, set()),
+ (index_a, set()),
+ (rootpypi, {"someproject": "someproject"}),
+ ]
+
def test_inheritance_simple(self, pypistage, stage):
stage.modify(bases=("root/pypi",), mirror_whitelist=['someproject'])
pypistage.mock_simple("someproject", "<a href='someproject-1.0.zip' /a>")
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/test_streaming.py new/devpi_server-6.19.3/test_devpi_server/test_streaming.py
--- old/devpi_server-6.19.2/test_devpi_server/test_streaming.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/test_streaming.py 2026-04-13 17:20:02.000000000 +0200
@@ -13,8 +13,10 @@
@pytest.fixture
-def host_port(primary_host_port):
- return primary_host_port
+def host_port(request, storage_info):
+ if "storage_with_filesystem" not in storage_info.get("_test_markers", []):
+ pytest.skip("The storage doesn't have marker 'storage_with_filesystem'.")
+ return request.getfixturevalue("primary_host_port")
@pytest.fixture
@@ -56,10 +58,17 @@
@pytest.mark.slow
@pytest.mark.parametrize("length,pkg_version,pkg_name", [
(None, '1.0', 'pkg1'), (False, '1.1', 'pkg2')])
- def test_streaming_download(self, content_digest, files_path, length, pkg_version, pkg_name, server_url_session, simpypi, storage_info):
+ def test_streaming_download(
+ self,
+ content_digest,
+ files_path,
+ length,
+ pkg_version,
+ pkg_name,
+ server_url_session,
+ simpypi,
+ ):
from time import sleep
- if "storage_with_filesystem" not in storage_info.get('_test_markers', []):
- pytest.skip("The storage doesn't have marker 'storage_with_filesystem'.")
(content, digest) = content_digest
(url, s) = server_url_session
pkgzip = f"{pkg_name}-{pkg_version}.zip"
@@ -93,9 +102,16 @@
@pytest.mark.parametrize("size_factor,pkg_version,pkg_name", [
(2, '1.2', 'pkg3'), (0.5, '1.3', 'pkg4')])
- def test_streaming_differing_content_size(self, content_digest, files_path, pkg_version, pkg_name, server_url_session, simpypi, size_factor, storage_info):
- if "storage_with_filesystem" not in storage_info.get('_test_markers', []):
- pytest.skip("The storage doesn't have marker 'storage_with_filesystem'.")
+ def test_streaming_differing_content_size(
+ self,
+ content_digest,
+ files_path,
+ pkg_version,
+ pkg_name,
+ server_url_session,
+ simpypi,
+ size_factor,
+ ):
(content, digest) = content_digest
(url, s) = server_url_session
pkgzip = f"{pkg_name}-{pkg_version}.zip"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/test_streaming_nginx.py new/devpi_server-6.19.3/test_devpi_server/test_streaming_nginx.py
--- old/devpi_server-6.19.2/test_devpi_server/test_streaming_nginx.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/test_streaming_nginx.py 2026-04-13 17:20:02.000000000 +0200
@@ -11,8 +11,10 @@
@pytest.fixture
-def host_port(nginx_host_port):
- return nginx_host_port
+def host_port(request, storage_info):
+ if "storage_with_filesystem" not in storage_info.get("_test_markers", []):
+ pytest.skip("The storage doesn't have marker 'storage_with_filesystem'.")
+ return request.getfixturevalue("nginx_host_port")
@pytest.fixture
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/test_streaming_replica.py new/devpi_server-6.19.3/test_devpi_server/test_streaming_replica.py
--- old/devpi_server-6.19.2/test_devpi_server/test_streaming_replica.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/test_streaming_replica.py 2026-04-13 17:20:02.000000000 +0200
@@ -11,8 +11,10 @@
@pytest.fixture
-def host_port(replica_host_port):
- return replica_host_port
+def host_port(request, storage_info):
+ if "storage_with_filesystem" not in storage_info.get("_test_markers", []):
+ pytest.skip("The storage doesn't have marker 'storage_with_filesystem'.")
+ return request.getfixturevalue("replica_host_port")
@pytest.fixture
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/devpi_server-6.19.2/test_devpi_server/test_streaming_replica_nginx.py new/devpi_server-6.19.3/test_devpi_server/test_streaming_replica_nginx.py
--- old/devpi_server-6.19.2/test_devpi_server/test_streaming_replica_nginx.py 2026-03-17 16:01:30.000000000 +0100
+++ new/devpi_server-6.19.3/test_devpi_server/test_streaming_replica_nginx.py 2026-04-13 17:20:02.000000000 +0200
@@ -11,8 +11,10 @@
@pytest.fixture
-def host_port(nginx_replica_host_port):
- return nginx_replica_host_port
+def host_port(request, storage_info):
+ if "storage_with_filesystem" not in storage_info.get("_test_markers", []):
+ pytest.skip("The storage doesn't have marker 'storage_with_filesystem'.")
+ return request.getfixturevalue("nginx_replica_host_port")
@pytest.fixture