Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package python-ansible-compat for 
openSUSE:Factory checked in at 2023-09-07 21:13:32
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-ansible-compat (Old)
 and      /work/SRC/openSUSE:Factory/.python-ansible-compat.new.1766 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-ansible-compat"

Thu Sep  7 21:13:32 2023 rev:19 rq:1109447 version:4.1.10

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-ansible-compat/python-ansible-compat.changes     2023-07-24 18:26:12.422234921 +0200
+++ /work/SRC/openSUSE:Factory/.python-ansible-compat.new.1766/python-ansible-compat.changes   2023-09-07 21:14:58.482096268 +0200
@@ -1,0 +2,49 @@
+Wed Sep  6 15:25:15 UTC 2023 - Johannes Kastl <ka...@b1-systems.de>
+
+- fix BuildRequires and Requires
+- ignore 4 new checks that need internet connectivity
+  * test_scan_sys_path[isolatedT-scanT-raises_not_foundT]
+  * test_scan_sys_path[isolatedT-scanF-raises_not_foundT]
+  * test_scan_sys_path[isolatedF-scanT-raises_not_foundF]
+  * test_scan_sys_path[isolatedF-scanF-raises_not_foundT]
+- update to 4.1.10:
+  * Bugfixes
+    - Catch empty collection lists (#332) @lod
+
+-------------------------------------------------------------------
+Wed Sep  6 14:55:28 UTC 2023 - Johannes Kastl <ka...@b1-systems.de>
+
+- update to 4.1.9:
+  * Bugfixes
+    - Automatically add --pre when installing collections from git
+      repositories @ssbarnea
+
+-------------------------------------------------------------------
+Wed Sep  6 06:01:26 UTC 2023 - Johannes Kastl <ka...@b1-systems.de>
+
+- update to 4.1.8:
+  * Bugfixes
+    - Revise site packages collection search test (#325) @cidrblock
+    - Add only those sys.paths which contain an ansible_collections
+      directory path (#322) @ajinkyau
+    - Allow git dependencies in galaxy.yml files (#321) @ssbarnea
+
+-------------------------------------------------------------------
+Wed Sep  6 05:44:29 UTC 2023 - Johannes Kastl <ka...@b1-systems.de>
+
+- update to 4.1.7:
+  * Bugfixes
+    - Add `sys.path` to collection paths (#318) @cidrblock
+
+-------------------------------------------------------------------
+Wed Sep  6 05:39:12 UTC 2023 - Johannes Kastl <ka...@b1-systems.de>
+
+- update to 4.1.6:
+  * Bugfixes
+    - Fix logic on prepare environment (#310) @ssbarnea
+    - Add smoke testing with ansible-lint (#312) @ssbarnea
+    - Adapt collection install test to pass with ansible-core
+      2.15.3 changes (#313) @ssbarnea
+    - Support meta main yaml extension (#304) @zhan9san
+
+-------------------------------------------------------------------

Old:
----
  ansible-compat-4.1.5.tar.gz

New:
----
  ansible-compat-4.1.10.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-ansible-compat.spec ++++++
--- /var/tmp/diff_new_pack.AyHXI3/_old  2023-09-07 21:14:59.498132589 +0200
+++ /var/tmp/diff_new_pack.AyHXI3/_new  2023-09-07 21:14:59.502132732 +0200
@@ -24,7 +24,7 @@
 %endif
 
 Name:           python-ansible-compat
-Version:        4.1.5
+Version:        4.1.10
 Release:        0
 Summary:        Compatibility shim for Ansible 2.9 and newer
 License:        MIT
@@ -36,18 +36,24 @@
 BuildRequires:  %{python_module wheel}
 BuildRequires:  python-rpm-macros
 # SECTION test
-BuildRequires:  %{python_module pytest}
+# https://github.com/ansible/ansible-compat/blob/main/pyproject.toml#L38
+BuildRequires:  ansible-core >= 2.12
 BuildRequires:  %{python_module PyYAML}
-BuildRequires:  %{python_module flaky}
 BuildRequires:  %{python_module jsonschema >= 4.17.3}
-BuildRequires:  %{python_module pytest-mock}
 BuildRequires:  %{python_module subprocess-tee >= 0.4.1}
-BuildRequires:  ansible-core >= 2.12
+# https://github.com/ansible/ansible-compat/blob/main/pyproject.toml#L56
+BuildRequires:  %{python_module pytest}
+BuildRequires:  %{python_module pytest-mock}
+BuildRequires:  %{python_module pytest-plus}
 # /SECTION
 BuildRequires:  fdupes
 BuildRequires:  python-rpm-generators
-%{?python_enable_dependency_generator}
+Requires:       ansible-core >= 2.12
+Requires:       python-PyYAML
+Requires:       python-jsonschema >= 4.17.3
+Requires:       python-packaging
 Requires:       python-subprocess-tee >= 0.4.1
+%{?python_enable_dependency_generator}
 BuildArch:      noarch
 %python_subpackages
 
@@ -66,7 +72,22 @@
 
 %check
 # excluding tests requiring internet connection
-%pytest -k 'not (test_runtime_example or test_require_collection_no_cache_dir or test_upgrade_collection or test_install_collection_dest or test_install_collection or test_require_collection or test_require_collection_wrong_version or test_prerun_reqs_v2 or test_prerun_reqs_v1 or test_prepare_environment_with_collections or test_runtime_require_module)' -W ignore:'There is no current event loop'
+IGNORED_CHECKS="test_install_collection"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_install_collection_dest"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_prepare_environment_with_collections"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_prerun_reqs_v1"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_prerun_reqs_v2"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_require_collection"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_require_collection_no_cache_dir"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_require_collection_wrong_version"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_runtime_example"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_runtime_require_module"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_scan_sys_path[isolatedF-scanF-raises_not_foundT]"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_scan_sys_path[isolatedF-scanT-raises_not_foundF]"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_scan_sys_path[isolatedT-scanF-raises_not_foundT]"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_scan_sys_path[isolatedT-scanT-raises_not_foundT]"
+IGNORED_CHECKS="${IGNORED_CHECKS} or test_upgrade_collection"
+%pytest -k "not (${IGNORED_CHECKS})"
 
 %files %{python_files}
 %{python_sitelib}/ansible_compat
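
For reference, the %check logic above only assembles a single pytest `-k`
deselection expression from the IGNORED_CHECKS variable. A minimal,
hypothetical Python equivalent (the test names here are a shortened,
illustrative subset, not the full list from the spec):

  # Sketch only: build a "-k" deselection expression and hand it to pytest,
  # mirroring what the spec's %check section does with IGNORED_CHECKS.
  import pytest

  ignored = [
      "test_install_collection",
      "test_require_collection",
      "test_scan_sys_path",  # a bare prefix also deselects parametrized variants
  ]
  expression = " or ".join(ignored)
  raise SystemExit(pytest.main(["-k", f"not ({expression})"]))

Because `-k` matching is substring based, a plain test name is enough to
deselect all of its parametrized variants.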

++++++ _service ++++++
--- /var/tmp/diff_new_pack.AyHXI3/_old  2023-09-07 21:14:59.526133590 +0200
+++ /var/tmp/diff_new_pack.AyHXI3/_new  2023-09-07 21:14:59.530133733 +0200
@@ -1,5 +1,5 @@
 <services>
-  <service name="download_files" mode="disabled">
+  <service name="download_files" mode="manual">
   </service>
 </services>
 

++++++ ansible-compat-4.1.5.tar.gz -> ansible-compat-4.1.10.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/.github/workflows/release.yml 
new/ansible-compat-4.1.10/.github/workflows/release.yml
--- old/ansible-compat-4.1.5/.github/workflows/release.yml      2023-07-21 
12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/.github/workflows/release.yml     2023-09-06 
14:55:41.000000000 +0200
@@ -30,7 +30,7 @@
       - name: Install tox
         run: python3 -m pip install --user "tox>=4.0.0"
       - name: Check out src from Git
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4
         with:
           fetch-depth: 0 # needed by setuptools-scm
       - name: Build dists
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/.github/workflows/tox.yml 
new/ansible-compat-4.1.10/.github/workflows/tox.yml
--- old/ansible-compat-4.1.5/.github/workflows/tox.yml  2023-07-21 
12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/.github/workflows/tox.yml 2023-09-06 
14:55:41.000000000 +0200
@@ -33,6 +33,7 @@
             py39-ansible214
             py39-ansible215
             py311-devel
+            smoke
           platforms: linux,macos
           macos: minmax
   build:
@@ -47,7 +48,7 @@
 
     steps:
       - name: Check out src from Git
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4
         with:
           fetch-depth: 0 # needed by setuptools-scm
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/.pre-commit-config.yaml 
new/ansible-compat-4.1.10/.pre-commit-config.yaml
--- old/ansible-compat-4.1.5/.pre-commit-config.yaml    2023-07-21 
12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/.pre-commit-config.yaml   2023-09-06 
14:55:41.000000000 +0200
@@ -20,14 +20,14 @@
     test/assets/.*
   )$
 repos:
-  - repo: https://github.com/charliermarsh/ruff-pre-commit
-    rev: "v0.0.277"
+  - repo: https://github.com/astral-sh/ruff-pre-commit
+    rev: "v0.0.287"
     hooks:
       - id: ruff
         args: [--fix, --exit-non-zero-on-fix]
   - repo: https://github.com/pre-commit/mirrors-prettier
     # keep it before yamllint
-    rev: "v3.0.0"
+    rev: "v3.0.3"
     hooks:
       - id: prettier
         additional_dependencies:
@@ -62,12 +62,12 @@
         types: [file, yaml]
         entry: yamllint --strict
   - repo: https://github.com/psf/black
-    rev: 23.3.0
+    rev: 23.7.0
     hooks:
       - id: black
         language_version: python3
   - repo: https://github.com/pre-commit/mirrors-mypy
-    rev: v1.4.1
+    rev: v1.5.1
     hooks:
       - id: mypy
         # empty args needed in order to match mypy cli behavior
@@ -84,7 +84,7 @@
           - types-pkg_resources
           - types-jsonschema>=4.4.9
   - repo: https://github.com/pycqa/pylint
-    rev: v3.0.0a6
+    rev: v3.0.0a7
     hooks:
       - id: pylint
         additional_dependencies:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/PKG-INFO 
new/ansible-compat-4.1.10/PKG-INFO
--- old/ansible-compat-4.1.5/PKG-INFO   2023-07-21 12:58:43.393725200 +0200
+++ new/ansible-compat-4.1.10/PKG-INFO  2023-09-06 14:56:06.785466000 +0200
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: ansible-compat
-Version: 4.1.5
+Version: 4.1.10
 Summary: Ansible compatibility goodies
 Author-email: Sorin Sbarnea <ssbar...@redhat.com>
 Maintainer-email: Sorin Sbarnea <ssbar...@redhat.com>
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/ansible-compat-4.1.5/examples/reqs_v2/requirements.yml 
new/ansible-compat-4.1.10/examples/reqs_v2/requirements.yml
--- old/ansible-compat-4.1.5/examples/reqs_v2/requirements.yml  2023-07-21 
12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/examples/reqs_v2/requirements.yml 2023-09-06 
14:55:41.000000000 +0200
@@ -9,3 +9,13 @@
     name: geerlingguy.mysql
 collections:
   - name: community-molecule-0.1.0.tar.gz
+  # Also needed for testing purposes as this should trigger addition of --pre
+  # argument as this is required due to
+  # https://github.com/ansible/ansible-lint/issues/3686
+  # https://github.com/ansible/ansible/issues/79109
+  - name: https://github.com/ansible-collections/amazon.aws.git
+    type: git
+    version: main
+  - name: https://github.com/ansible-collections/community.aws.git
+    type: git
+    version: main
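
The two git-sourced collections added above exercise the 4.1.9 change noted in
the changelog: when a v2 requirements file contains a collection with
`type: git`, ansible-compat adds `--pre` to its `ansible-galaxy collection
install` call (see the install_requirements() hunk in runtime.py further
below). A rough, self-contained sketch of that detection, reading the file
with PyYAML rather than the library's own helpers:

  # Illustrative only: mirrors the git-type detection shown in the runtime.py
  # diff below, not the library's actual implementation details.
  from pathlib import Path
  import yaml

  def galaxy_install_cmd(requirements: Path) -> list[str]:
      cmd = ["ansible-galaxy", "collection", "install", "-v"]
      reqs = yaml.safe_load(requirements.read_text()) or {}
      # "or []" also reflects the 4.1.10 fix for empty `collections:` lists
      for collection in reqs.get("collections") or []:
          if isinstance(collection, dict) and collection.get("type", "") == "git":
              cmd.append("--pre")  # a git source may resolve to a pre-release version
              break
      return [*cmd, "-r", str(requirements)]

  print(galaxy_install_cmd(Path("examples/reqs_v2/requirements.yml")))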
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/pyproject.toml 
new/ansible-compat-4.1.10/pyproject.toml
--- old/ansible-compat-4.1.5/pyproject.toml     2023-07-21 12:58:17.000000000 
+0200
+++ new/ansible-compat-4.1.10/pyproject.toml    2023-09-06 14:55:41.000000000 
+0200
@@ -120,6 +120,7 @@
 [tool.pytest.ini_options]
 # ensure we treat warnings as error
 filterwarnings = ["error"]
+testpaths = ["test"]
 
 [tool.ruff]
 select = ["ALL"]
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/readthedocs.yml 
new/ansible-compat-4.1.10/readthedocs.yml
--- old/ansible-compat-4.1.5/readthedocs.yml    2023-07-21 12:58:17.000000000 
+0200
+++ new/ansible-compat-4.1.10/readthedocs.yml   2023-09-06 14:55:41.000000000 
+0200
@@ -13,7 +13,6 @@
     python: "3.11"
 
 python:
-  system_packages: false
   install:
     - method: pip
       path: .
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/requirements.txt 
new/ansible-compat-4.1.10/requirements.txt
--- old/ansible-compat-4.1.5/requirements.txt   2023-07-21 12:58:17.000000000 
+0200
+++ new/ansible-compat-4.1.10/requirements.txt  2023-09-06 14:55:41.000000000 
+0200
@@ -4,25 +4,27 @@
 #
 #    pip-compile --extra=docs --extra=test --output-file=requirements.txt --strip-extras --unsafe-package=ansible-core --unsafe-package=resolvelib --unsafe-package=typing_extensions pyproject.toml
 #
-argparse-manpage==4.2
+argparse-manpage==4.3
     # via ansible-compat (pyproject.toml)
 attrs==23.1.0
-    # via jsonschema
-beautifulsoup4==4.12.1
+    # via
+    #   jsonschema
+    #   referencing
+beautifulsoup4==4.12.2
     # via
     #   mkdocs-ansible
     #   mkdocs-htmlproofer-plugin
-black==23.3.0
+black==23.7.0
     # via ansible-compat (pyproject.toml)
 build==0.10.0
     # via pip-tools
-cairocffi==1.5.0
+cairocffi==1.5.1
     # via
     #   cairosvg
     #   mkdocs-ansible
 cairosvg==2.7.0
     # via mkdocs-ansible
-certifi==2022.12.7
+certifi==2023.5.7
     # via
     #   mkdocs-ansible
     #   requests
@@ -46,9 +48,9 @@
     #   griffe
     #   mkdocs-ansible
     #   mkdocs-material
-coverage==7.2.5
+coverage==7.3.0
     # via ansible-compat (pyproject.toml)
-cryptography==40.0.2
+cryptography==41.0.3
     # via ansible-core
 csscompressor==0.9.5
     # via
@@ -62,13 +64,13 @@
     # via
     #   cairosvg
     #   mkdocs-ansible
-exceptiongroup==1.1.1
+exceptiongroup==1.1.3
     # via pytest
 ghp-import==2.1.0
     # via
     #   mkdocs
     #   mkdocs-ansible
-griffe==0.26.0
+griffe==0.29.0
     # via
     #   mkdocs-ansible
     #   mkdocstrings-python
@@ -80,11 +82,12 @@
     # via
     #   mkdocs-ansible
     #   requests
-importlib-metadata==6.1.0
+importlib-metadata==6.6.0
     # via
     #   markdown
     #   mkdocs
     #   mkdocs-ansible
+    #   mkdocstrings
 importlib-resources==5.0.7
     # via ansible-core
 iniconfig==2.0.0
@@ -100,8 +103,10 @@
     # via
     #   mkdocs-ansible
     #   mkdocs-minify-plugin
-jsonschema==4.17.3
+jsonschema==4.19.0
     # via ansible-compat (pyproject.toml)
+jsonschema-specifications==2023.7.1
+    # via jsonschema
 markdown==3.3.7
     # via
     #   markdown-include
@@ -112,7 +117,7 @@
     #   mkdocs-material
     #   mkdocstrings
     #   pymdown-extensions
-markdown-exec==1.4.0
+markdown-exec==1.6.0
     # via mkdocs-ansible
 markdown-include==0.8.1
     # via mkdocs-ansible
@@ -125,7 +130,7 @@
     # via
     #   mkdocs
     #   mkdocs-ansible
-mkdocs==1.4.2
+mkdocs==1.4.3
     # via
     #   mkdocs-ansible
     #   mkdocs-autorefs
@@ -135,17 +140,17 @@
     #   mkdocs-minify-plugin
     #   mkdocs-monorepo-plugin
     #   mkdocstrings
-mkdocs-ansible==0.1.4
+mkdocs-ansible==0.1.6
     # via ansible-compat (pyproject.toml)
 mkdocs-autorefs==0.4.1
     # via
     #   mkdocs-ansible
     #   mkdocstrings
-mkdocs-gen-files==0.4.0
+mkdocs-gen-files==0.5.0
     # via mkdocs-ansible
-mkdocs-htmlproofer-plugin==0.11.0
+mkdocs-htmlproofer-plugin==0.13.1
     # via mkdocs-ansible
-mkdocs-material==9.1.5
+mkdocs-material==9.1.15
     # via mkdocs-ansible
 mkdocs-material-extensions==1.1.1
     # via
@@ -153,17 +158,17 @@
     #   mkdocs-material
 mkdocs-minify-plugin==0.6.4
     # via mkdocs-ansible
-mkdocs-monorepo-plugin==1.0.4
+mkdocs-monorepo-plugin==1.0.5
     # via mkdocs-ansible
-mkdocstrings==0.21.2
+mkdocstrings==0.22.0
     # via
     #   mkdocs-ansible
     #   mkdocstrings-python
-mkdocstrings-python==0.9.0
+mkdocstrings-python==1.1.0
     # via mkdocs-ansible
 mypy-extensions==1.0.0
     # via black
-packaging==23.0
+packaging==23.1
     # via
     #   ansible-compat (pyproject.toml)
     #   ansible-core
@@ -172,31 +177,31 @@
     #   mkdocs
     #   mkdocs-ansible
     #   pytest
-pathspec==0.11.1
+pathspec==0.11.2
     # via black
 pillow==9.5.0
     # via
     #   cairosvg
     #   mkdocs-ansible
-pip==23.1.2
+pip==23.2.1
     # via pip-tools
-pip-tools==6.13.0
+pip-tools==7.3.0
     # via ansible-compat (pyproject.toml)
-pipdeptree==2.7.0
+pipdeptree==2.7.1
     # via mkdocs-ansible
-platformdirs==3.5.1
+platformdirs==3.10.0
     # via black
-pluggy==1.0.0
+pluggy==1.2.0
     # via pytest
 pycparser==2.21
     # via
     #   cffi
     #   mkdocs-ansible
-pygments==2.14.0
+pygments==2.15.1
     # via
     #   mkdocs-ansible
     #   mkdocs-material
-pymdown-extensions==9.10
+pymdown-extensions==10.0.1
     # via
     #   markdown-exec
     #   mkdocs-ansible
@@ -204,14 +209,12 @@
     #   mkdocstrings
 pyproject-hooks==1.0.0
     # via build
-pyrsistent==0.19.3
-    # via jsonschema
-pytest==7.3.1
+pytest==7.4.0
     # via
     #   ansible-compat (pyproject.toml)
     #   pytest-mock
     #   pytest-plus
-pytest-mock==3.10.0
+pytest-mock==3.11.1
     # via ansible-compat (pyproject.toml)
 pytest-plus==0.4.0
     # via ansible-compat (pyproject.toml)
@@ -235,22 +238,30 @@
     # via
     #   mkdocs
     #   mkdocs-ansible
-regex==2023.3.23
+referencing==0.30.2
+    # via
+    #   jsonschema
+    #   jsonschema-specifications
+regex==2023.5.5
     # via
     #   mkdocs-ansible
     #   mkdocs-material
-requests==2.28.2
+requests==2.31.0
     # via
     #   mkdocs-ansible
     #   mkdocs-htmlproofer-plugin
     #   mkdocs-material
-setuptools==67.7.2
+rpds-py==0.9.2
+    # via
+    #   jsonschema
+    #   referencing
+setuptools==68.1.2
     # via pip-tools
 six==1.16.0
     # via
     #   mkdocs-ansible
     #   python-dateutil
-soupsieve==2.4
+soupsieve==2.4.1
     # via
     #   beautifulsoup4
     #   mkdocs-ansible
@@ -265,21 +276,21 @@
     #   cairosvg
     #   cssselect2
     #   mkdocs-ansible
-toml==0.10.2
-    # via argparse-manpage
 tomli==2.0.1
     # via
+    #   argparse-manpage
     #   black
     #   build
+    #   pip-tools
     #   pyproject-hooks
     #   pytest
-typing-extensions==4.5.0 ; python_version < "3.10"
+typing-extensions==4.6.2 ; python_version < "3.10"
     # via
     #   ansible-compat (pyproject.toml)
     #   black
     #   mkdocs-ansible
     #   mkdocstrings
-urllib3==1.26.15
+urllib3==2.0.2
     # via
     #   mkdocs-ansible
     #   requests
@@ -292,7 +303,7 @@
     #   cssselect2
     #   mkdocs-ansible
     #   tinycss2
-wheel==0.40.0
+wheel==0.41.1
     # via pip-tools
 zipp==3.15.0
     # via
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/src/ansible_compat/_version.py 
new/ansible-compat-4.1.10/src/ansible_compat/_version.py
--- old/ansible-compat-4.1.5/src/ansible_compat/_version.py     2023-07-21 
12:58:43.000000000 +0200
+++ new/ansible-compat-4.1.10/src/ansible_compat/_version.py    2023-09-06 
14:56:06.000000000 +0200
@@ -1,4 +1,4 @@
 # file generated by setuptools_scm
 # don't change, don't track in version control
-__version__ = version = '4.1.5'
-__version_tuple__ = version_tuple = (4, 1, 5)
+__version__ = version = '4.1.10'
+__version_tuple__ = version_tuple = (4, 1, 10)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/src/ansible_compat/config.py 
new/ansible-compat-4.1.10/src/ansible_compat/config.py
--- old/ansible-compat-4.1.5/src/ansible_compat/config.py       2023-07-21 
12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/src/ansible_compat/config.py      2023-09-06 
14:55:41.000000000 +0200
@@ -1,11 +1,13 @@
 """Store configuration options as a singleton."""
+from __future__ import annotations
+
 import ast
 import copy
 import os
 import re
 import subprocess
 from collections import UserDict
-from typing import Literal, Optional, Union
+from typing import Literal
 
 from packaging.version import Version
 
@@ -79,9 +81,9 @@
     action_warnings: bool = True
     agnostic_become_prompt: bool = True
     allow_world_readable_tmpfiles: bool = False
-    ansible_connection_path: Optional[str] = None
+    ansible_connection_path: str | None = None
     ansible_cow_acceptlist: list[str]
-    ansible_cow_path: Optional[str] = None
+    ansible_cow_path: str | None = None
     ansible_cow_selection: str = "default"
     ansible_force_color: bool = False
     ansible_nocolor: bool = False
@@ -94,16 +96,12 @@
         "/usr/share/ansible/plugins/become",
     ]
     cache_plugin: str = "memory"
-    cache_plugin_connection: Optional[str] = None
+    cache_plugin_connection: str | None = None
     cache_plugin_prefix: str = "ansible_facts"
     cache_plugin_timeout: int = 86400
     callable_accept_list: list[str] = []
     callbacks_enabled: list[str] = []
-    collections_on_ansible_version_mismatch: Union[
-        Literal["warning"],
-        Literal["warning"],
-        Literal["ignore"],
-    ] = "warning"
+    collections_on_ansible_version_mismatch: Literal["warning", "ignore"] = "warning"
     collections_paths: list[str] = [
         "~/.ansible/collections",
         "/usr/share/ansible/collections",
@@ -127,7 +125,7 @@
     conditional_bare_vars: bool = False
     connection_facts_modules: dict[str, str]
     controller_python_warning: bool = True
-    coverage_remote_output: Optional[str]
+    coverage_remote_output: str | None
     coverage_remote_paths: list[str]
     default_action_plugin_path: list[str] = [
         "~/.ansible/plugins/action",
@@ -138,7 +136,7 @@
     default_ask_vault_pass: bool = False
     default_become: bool = False
     default_become_ask_pass: bool = False
-    default_become_exe: Optional[str] = None
+    default_become_exe: str | None = None
     default_become_flags: str
     default_become_method: str = "sudo"
     default_become_user: str = "root"
@@ -160,18 +158,14 @@
     ]
     default_debug: bool = False
     default_executable: str = "/bin/sh"
-    default_fact_path: Optional[str] = None
+    default_fact_path: str | None = None
     default_filter_plugin_path: list[str] = [
         "~/.ansible/plugins/filter",
         "/usr/share/ansible/plugins/filter",
     ]
     default_force_handlers: bool = False
     default_forks: int = 5
-    default_gathering: Union[
-        Literal["smart"],
-        Literal["explicit"],
-        Literal["implicit"],
-    ] = "smart"
+    default_gathering: Literal["smart", "explicit", "implicit"] = "smart"
     default_gather_subset: list[str] = ["all"]
     default_gather_timeout: int = 10
     default_handler_includes_static: bool = False
@@ -193,7 +187,7 @@
     default_load_callback_plugins: bool = False
     default_local_tmp: str = "~/.ansible/tmp"
     default_log_filter: list[str] = []
-    default_log_path: Optional[str] = None
+    default_log_path: str | None = None
     default_lookup_lugin_path: list[str] = [
         "~/.ansible/plugins/lookup",
         "/usr/share/ansible/plugins/lookup",
@@ -216,12 +210,12 @@
     ]
     default_no_log: bool = False
     default_no_target_syslog: bool = False
-    default_null_representation: Optional[str] = None
+    default_null_representation: str | None = None
     default_poll_interval: int = 15
-    default_private_key_file: Optional[str] = None
+    default_private_key_file: str | None = None
     default_private_role_vars: bool = False
-    default_remote_port: Optional[str] = None
-    default_remote_user: Optional[str] = None
+    default_remote_port: str | None = None
+    default_remote_user: str | None = None
     default_roles_path: list[str] = [
         "~/.ansible/roles",
         "/usr/share/ansible/roles",
@@ -259,11 +253,11 @@
         "~/.ansible/plugins/vars",
         "/usr/share/ansible/plugins/vars",
     ]
-    default_vault_encrypt_identity: Optional[str] = None
+    default_vault_encrypt_identity: str | None = None
     default_vault_identity: str = "default"
     default_vault_identity_list: list[str] = []
     default_vault_id_match: bool = False
-    default_vault_password_file: Optional[str] = None
+    default_vault_password_file: str | None = None
     default_verbosity: int = 0
     deprecation_warnings: bool = False
     devel_warning: bool = True
@@ -276,28 +270,20 @@
         "~/.ansible/plugins/doc_fragments",
         "/usr/share/ansible/plugins/doc_fragments",
     ]
-    duplicate_yaml_dict_key: Union[
-        Literal["warn"],
-        Literal["error"],
-        Literal["ignore"],
-    ] = "warn"
+    duplicate_yaml_dict_key: Literal["warn", "error", "ignore"] = "warn"
     enable_task_debugger: bool = False
     error_on_missing_handler: bool = True
     facts_modules: list[str] = ["smart"]
     galaxy_cache_dir: str = "~/.ansible/galaxy_cache"
-    galaxy_display_progress: Optional[str] = None
+    galaxy_display_progress: str | None = None
     galaxy_ignore_certs: bool = False
-    galaxy_role_skeleton: Optional[str] = None
+    galaxy_role_skeleton: str | None = None
     galaxy_role_skeleton_ignore: list[str] = ["^.git$", "^.*/.git_keep$"]
     galaxy_server: str = "https://galaxy.ansible.com"
-    galaxy_server_list: Optional[str] = None
+    galaxy_server_list: str | None = None
     galaxy_token_path: str = "~/.ansible/galaxy_token"
     host_key_checking: bool = True
-    host_pattern_mismatch: Union[
-        Literal["warning"],
-        Literal["error"],
-        Literal["ignore"],
-    ] = "warning"
+    host_pattern_mismatch: Literal["warning", "error", "ignore"] = "warning"
     inject_facts_as_vars: bool = True
     interpreter_python: str = "auto_legacy"
     interpreter_python_distro_map: dict[str, str]
@@ -305,8 +291,8 @@
     invalid_task_attribute_failed: bool = True
     inventory_any_unparsed_is_failed: bool = False
     inventory_cache_enabled: bool = False
-    inventory_cache_plugin: Optional[str] = None
-    inventory_cache_plugin_connection: Optional[str] = None
+    inventory_cache_plugin: str | None = None
+    inventory_cache_plugin_connection: str | None = None
     inventory_cache_plugin_prefix: str = "ansible_facts"
     inventory_cache_timeout: int = 3600
     inventory_enabled: list[str] = [
@@ -324,7 +310,7 @@
     localhost_warning: bool = True
     max_file_size_for_diff: int = 104448
     module_ignore_exts: str
-    netconf_ssh_config: Optional[str] = None
+    netconf_ssh_config: str | None = None
     network_group_modules: list[str] = [
         "eos",
         "nxos",
@@ -356,19 +342,15 @@
     persistent_connect_retry_timeout: int = 15
     persistent_connect_timeout: int = 30
     persistent_control_path_dir: str = "~/.ansible/pc"
-    playbook_dir: Optional[str]
-    playbook_vars_root: Union[Literal["top"], Literal["bottom"], Literal["all"]] = "top"
-    plugin_filters_cfg: Optional[str] = None
+    playbook_dir: str | None
+    playbook_vars_root: Literal["top", "bottom", "all"] = "top"
+    plugin_filters_cfg: str | None = None
     python_module_rlimit_nofile: int = 0
     retry_files_enabled: bool = False
-    retry_files_save_path: Optional[str] = None
+    retry_files_save_path: str | None = None
     run_vars_plugins: str = "demand"
     show_custom_stats: bool = False
-    string_conversion_action: Union[
-        Literal["warn"],
-        Literal["error"],
-        Literal["ignore"],
-    ] = "warn"
+    string_conversion_action: Literal["warn", "error", "ignore"] = "warn"
     string_type_filters: list[str] = [
         "string",
         "to_json",
@@ -383,11 +365,11 @@
     tags_skip: list[str] = []
     task_debugger_ignore_errors: bool = True
     task_timeout: int = 0
-    transform_invalid_group_chars: Union[
-        Literal["always"],
-        Literal["never"],
-        Literal["ignore"],
-        Literal["silently"],
+    transform_invalid_group_chars: Literal[
+        "always",
+        "never",
+        "ignore",
+        "silently",
     ] = "never"
     use_persistent_connections: bool = False
     variable_plugins_enabled: list[str] = ["host_group_vars"]
@@ -407,8 +389,8 @@
 
     def __init__(
         self,
-        config_dump: Optional[str] = None,
-        data: Optional[dict[str, object]] = None,
+        config_dump: str | None = None,
+        data: dict[str, object] | None = None,
     ) -> None:
         """Load config dictionary."""
         super().__init__()
@@ -461,11 +443,11 @@
         """Allow access to config options using indexing."""
         return super().__getitem__(name.upper())
 
-    def __copy__(self) -> "AnsibleConfig":
+    def __copy__(self) -> AnsibleConfig:
         """Allow users to run copy on Config."""
         return AnsibleConfig(data=self.data)
 
-    def __deepcopy__(self, memo: object) -> "AnsibleConfig":
+    def __deepcopy__(self, memo: object) -> AnsibleConfig:
         """Allow users to run deeepcopy on Config."""
         return AnsibleConfig(data=self.data)
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/src/ansible_compat/constants.py 
new/ansible-compat-4.1.10/src/ansible_compat/constants.py
--- old/ansible-compat-4.1.5/src/ansible_compat/constants.py    2023-07-21 
12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/src/ansible_compat/constants.py   2023-09-06 
14:55:41.000000000 +0200
@@ -1,6 +1,8 @@
 """Constants used by ansible_compat."""
 
+from pathlib import Path
 
+META_MAIN = (Path("meta") / Path("main.yml"), Path("meta") / Path("main.yaml"))
 REQUIREMENT_LOCATIONS = [
     "requirements.yml",
     "roles/requirements.yml",
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/src/ansible_compat/errors.py 
new/ansible-compat-4.1.10/src/ansible_compat/errors.py
--- old/ansible-compat-4.1.5/src/ansible_compat/errors.py       2023-07-21 
12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/src/ansible_compat/errors.py      2023-09-06 
14:55:41.000000000 +0200
@@ -1,5 +1,7 @@
 """Module to deal with errors."""
-from typing import TYPE_CHECKING, Any, Optional
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Any
 
 from ansible_compat.constants import ANSIBLE_MISSING_RC, INVALID_PREREQUISITES_RC
 
@@ -14,8 +16,8 @@
 
     def __init__(
         self,
-        message: Optional[str] = None,
-        proc: Optional[Any] = None,
+        message: str | None = None,
+        proc: CompletedProcess[Any] | None = None,
     ) -> None:
         """Construct generic library exception."""
         super().__init__(message)
@@ -25,7 +27,7 @@
 class AnsibleCommandError(RuntimeError):
     """Exception running an Ansible command."""
 
-    def __init__(self, proc: "CompletedProcess[Any]") -> None:
+    def __init__(self, proc: CompletedProcess[Any]) -> None:
         """Construct an exception given a completed process."""
         message = (
             f"Got {proc.returncode} exit code while running: {' 
'.join(proc.args)}"
@@ -41,8 +43,8 @@
 
     def __init__(
         self,
-        message: Optional[str] = "Unable to find a working copy of ansible 
executable.",
-        proc: Optional[Any] = None,
+        message: str | None = "Unable to find a working copy of ansible 
executable.",
+        proc: CompletedProcess[Any] | None = None,
     ) -> None:
         """."""
         super().__init__(message)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/src/ansible_compat/runtime.py 
new/ansible-compat-4.1.10/src/ansible_compat/runtime.py
--- old/ansible-compat-4.1.5/src/ansible_compat/runtime.py      2023-07-21 
12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/src/ansible_compat/runtime.py     2023-09-06 
14:55:41.000000000 +0200
@@ -2,7 +2,6 @@
 from __future__ import annotations
 
 import contextlib
-import fnmatch
 import importlib
 import json
 import logging
@@ -10,6 +9,7 @@
 import re
 import shutil
 import subprocess
+import sys
 import tempfile
 import warnings
 from collections import OrderedDict
@@ -27,6 +27,7 @@
     parse_ansible_version,
 )
 from ansible_compat.constants import (
+    META_MAIN,
     MSG_INVALID_FQRL,
     RC_ANSIBLE_OPTIONS_ERROR,
     REQUIREMENT_LOCATIONS,
@@ -47,9 +48,11 @@
 else:
     CompletedProcess = subprocess.CompletedProcess
 
+
 _logger = logging.getLogger(__name__)
 # regex to extract the first version from a collection range specifier
 version_re = re.compile(":[>=<]*([^,]*)")
+namespace_re = re.compile("^[a-z][a-z0-9_]+$")
 
 
 class AnsibleWarning(Warning):
@@ -203,6 +206,9 @@
             self.cache_dir = get_cache_dir(self.project_dir)
         self.config = AnsibleConfig()
 
+        # Add the sys.path to the collection paths if not isolated
+        self._add_sys_path_to_collection_paths()
+
         if not self.version_in_range(lower=min_required_version):
             msg = f"Found incompatible version of ansible runtime 
{self.version}, instead of {min_required_version} or newer."
             raise RuntimeError(msg)
@@ -230,6 +236,18 @@
         # Monkey patch ansible warning in order to use warnings module.
         Display.warning = warning
 
+    def _add_sys_path_to_collection_paths(self) -> None:
+        """Add the sys.path to the collection paths."""
+        if not self.isolated and self.config.collections_scan_sys_path:
+            for path in sys.path:
+                if (
+                    path not in self.config.collections_paths
+                    and (Path(path) / "ansible_collections").is_dir()
+                ):
+                    self.config.collections_paths.append(  # pylint: disable=E1101
+                        path,
+                    )
+
     def load_collections(self) -> None:
         """Load collection data."""
         self.collections = OrderedDict()
@@ -402,7 +420,9 @@
     ) -> None:
         """Install an Ansible collection.
 
-        Can accept version constraints like 'foo.bar:>=1.2.3'
+        Can accept arguments like:
+            'foo.bar:>=1.2.3'
+            'git+https://github.com/ansible-collections/ansible.posix.git,main'
         """
         cmd = [
             "ansible-galaxy",
@@ -413,11 +433,18 @@
         if force:
             cmd.append("--force")
 
+        if isinstance(collection, Path):
+            collection = str(collection)
         # As ansible-galaxy install is not able to automatically determine
         # if the range requires a pre-release, we need to manuall add the --pre
         # flag when needed.
-        matches = version_re.search(str(collection))
-        if matches and CollectionVersion(matches[1]).is_prerelease:
+        matches = version_re.search(collection)
+
+        if (
+            not is_url(collection)
+            and matches
+            and CollectionVersion(matches[1]).is_prerelease
+        ):
             cmd.append("--pre")
 
         cpaths: list[str] = self.config.collections_paths
@@ -521,13 +548,20 @@
                     raise AnsibleCommandError(result)
 
         # Run galaxy collection install works on v2 requirements.yml
-        if "collections" in reqs_yaml:
+        if "collections" in reqs_yaml and reqs_yaml["collections"] is not None:
             cmd = [
                 "ansible-galaxy",
                 "collection",
                 "install",
                 "-v",
             ]
+            for collection in reqs_yaml["collections"]:
+                if isinstance(collection, dict) and collection.get("type", "") == "git":
+                    _logger.info(
+                        "Adding '--pre' to ansible-galaxy collection install 
because we detected one collection being sourced from git.",
+                    )
+                    cmd.append("--pre")
+                    break
             if offline:
                 _logger.warning(
                     "Skipped installing collection dependencies due to running 
in offline mode.",
@@ -553,19 +587,6 @@
                     _logger.error(result.stderr)
                     raise AnsibleCommandError(result)
 
-    def search_galaxy_paths(self, search_dir: Path, depth: int = 0) -> list[str]:
-        """Search for galaxy paths (only one level deep)."""
-        galaxy_paths: list[str] = []
-        for file in os.listdir(search_dir):
-            file_path = Path(file)
-            if file_path.is_dir() and depth < 1:
-                galaxy_paths.extend(self.search_galaxy_paths(file_path, 1))
-            elif fnmatch.fnmatch(file, "galaxy.yml"):
-                galaxy_paths.append(str(search_dir / file))
-        if depth == 0 and not galaxy_paths:
-            return ["galaxy.yml"]
-        return galaxy_paths
-
     def prepare_environment(  # noqa: C901
         self,
         required_collections: dict[str, str] | None = None,
@@ -587,7 +608,13 @@
         for req_file in REQUIREMENT_LOCATIONS:
             self.install_requirements(Path(req_file), retry=retry, offline=offline)
 
-        for gpath in self.search_galaxy_paths(self.project_dir):
+        self._prepare_ansible_paths()
+
+        if not install_local:
+            return
+
+        for gpath in search_galaxy_paths(self.project_dir):
+            # processing all found galaxy.yml files
             galaxy_path = Path(gpath)
             if galaxy_path.exists():
                 data = yaml_from_file(galaxy_path)
@@ -599,63 +626,58 @@
                             required_version,
                         )
                         self.install_collection(
-                            f"{name}:{required_version}",
+                            f"{name}{',' if is_url(name) else 
':'}{required_version}",
                             destination=destination,
                         )
 
-            if self.cache_dir:
-                destination = self.cache_dir / "collections"
-            for name, min_version in required_collections.items():
-                self.install_collection(
-                    f"{name}:>={min_version}",
-                    destination=destination,
-                )
-
-            self._prepare_ansible_paths()
-
-            if not install_local:
-                return
+        if self.cache_dir:
+            destination = self.cache_dir / "collections"
+        for name, min_version in required_collections.items():
+            self.install_collection(
+                f"{name}:>={min_version}",
+                destination=destination,
+            )
 
-            if galaxy_path.exists():
-                if destination:
-                    # while function can return None, that would not break the logic
-                    colpath = Path(
-                        f"{destination}/ansible_collections/{colpath_from_path(Path.cwd())}",
-                    )
-                    if colpath.is_symlink():
-                        if os.path.realpath(colpath) == Path.cwd():
-                            _logger.warning(
-                                "Found symlinked collection, skipping its 
installation.",
-                            )
-                            return
+        if Path("galaxy.yml").exists():
+            if destination:
+                # while function can return None, that would not break the logic
+                colpath = Path(
+                    f"{destination}/ansible_collections/{colpath_from_path(Path.cwd())}",
+                )
+                if colpath.is_symlink():
+                    if os.path.realpath(colpath) == Path.cwd():
                         _logger.warning(
-                            "Collection is symlinked, but not pointing to %s 
directory, so we will remove it.",
-                            Path.cwd(),
+                            "Found symlinked collection, skipping its 
installation.",
                         )
-                        colpath.unlink()
+                        return
+                    _logger.warning(
+                        "Collection is symlinked, but not pointing to %s 
directory, so we will remove it.",
+                        Path.cwd(),
+                    )
+                    colpath.unlink()
 
-                # molecule scenario within a collection
-                self.install_collection_from_disk(
-                    galaxy_path.parent,
-                    destination=destination,
-                )
-            elif (
-                Path().resolve().parent.name == "roles"
-                and Path("../../galaxy.yml").exists()
-            ):
-                # molecule scenario located within roles/<role-name>/molecule inside
-                # a collection
-                self.install_collection_from_disk(
-                    Path("../.."),
-                    destination=destination,
-                )
-            else:
-                # no collection, try to recognize and install a standalone role
-                self._install_galaxy_role(
-                    self.project_dir,
-                    role_name_check=role_name_check,
-                    ignore_errors=True,
-                )
+            # molecule scenario within a collection
+            self.install_collection_from_disk(
+                galaxy_path.parent,
+                destination=destination,
+            )
+        elif (
+            Path().resolve().parent.name == "roles"
+            and Path("../../galaxy.yml").exists()
+        ):
+            # molecule scenario located within roles/<role-name>/molecule inside
+            # a collection
+            self.install_collection_from_disk(
+                Path("../.."),
+                destination=destination,
+            )
+        else:
+            # no collection, try to recognize and install a standalone role
+            self._install_galaxy_role(
+                self.project_dir,
+                role_name_check=role_name_check,
+                ignore_errors=True,
+            )
         # reload collections
         self.load_collections()
 
@@ -665,11 +687,16 @@
         version: str | None = None,
         *,
         install: bool = True,
-    ) -> None:
+    ) -> tuple[CollectionVersion, Path]:
         """Check if a minimal collection version is present or exits.
 
         In the future this method may attempt to install a missing or outdated
         collection before failing.
+
+        :param name: collection name
+        :param version: minimal version required
+        :param install: if True, attempt to install a missing collection
+        :returns: tuple of (found_version, collection_path)
         """
         try:
             ns, coll = name.split(".", 1)
@@ -714,15 +741,19 @@
                             msg = f"Found {name} collection {found_version} 
but {version} or newer is required."
                             _logger.fatal(msg)
                             raise InvalidPrerequisiteError(msg)
+                    return found_version, collpath.resolve()
                 break
         else:
             if install:
                 self.install_collection(f"{name}:>={version}" if version else 
name)
-                self.require_collection(name=name, version=version, 
install=False)
-            else:
-                msg = f"Collection '{name}' not found in '{paths}'"
-                _logger.fatal(msg)
-                raise InvalidPrerequisiteError(msg)
+                return self.require_collection(
+                    name=name,
+                    version=version,
+                    install=False,
+                )
+            msg = f"Collection '{name}' not found in '{paths}'"
+            _logger.fatal(msg)
+            raise InvalidPrerequisiteError(msg)
 
     def _prepare_ansible_paths(self) -> None:
         """Configure Ansible environment variables."""
@@ -802,13 +833,17 @@
         """
         yaml = None
         galaxy_info = {}
-        meta_filename = Path(project_dir) / "meta" / "main.yml"
 
-        if not meta_filename.exists():
+        for meta_main in META_MAIN:
+            meta_filename = Path(project_dir) / meta_main
+
+            if meta_filename.exists():
+                break
+        else:
             if ignore_errors:
                 return
-        else:
-            yaml = yaml_from_file(meta_filename)
+
+        yaml = yaml_from_file(meta_filename)
 
         if yaml and "galaxy_info" in yaml:
             galaxy_info = yaml["galaxy_info"]
@@ -900,3 +935,22 @@
 def _get_galaxy_role_name(galaxy_infos: dict[str, Any]) -> str:
     """Compute role name from meta/main.yml."""
     return galaxy_infos.get("role_name", "")
+
+
+def search_galaxy_paths(search_dir: Path) -> list[str]:
+    """Search for galaxy paths (only one level deep)."""
+    galaxy_paths: list[str] = []
+    for file in [".", *os.listdir(search_dir)]:
+        # We ignore any folders that are not valid namespaces, just like
+        # ansible galaxy does at this moment.
+        if file != "." and not namespace_re.match(file):
+            continue
+        file_path = search_dir / file / "galaxy.yml"
+        if file_path.is_file():
+            galaxy_paths.append(str(file_path))
+    return galaxy_paths
+
+
+def is_url(name: str) -> bool:
+    """Return True if a dependency name looks like an URL."""
+    return bool(re.match("^git[+@]", name))
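
Taken together, the runtime.py hunks above change the visible behaviour in
three ways: install_collection() accepts git sources (and Path objects),
require_collection() returns the found version and path instead of None, and a
non-isolated Runtime also scans sys.path entries that contain an
ansible_collections directory. A usage sketch based only on what the diff
shows (argument values are illustrative):

  from pathlib import Path

  from ansible_compat.runtime import Runtime, is_url, search_galaxy_paths

  # Non-isolated: sys.path entries holding ansible_collections/ are appended to
  # the collection search path, provided collection sys.path scanning is enabled
  # (ANSIBLE_COLLECTIONS_SCAN_SYS_PATH); see _add_sys_path_to_collection_paths above.
  runtime = Runtime(isolated=False)

  # is_url() matches names starting with "git+" or "git@"; for such sources the
  # version-range "--pre" heuristic in install_collection() is skipped.
  assert is_url("git+https://github.com/ansible-collections/ansible.posix,main")
  runtime.install_collection(
      "git+https://github.com/ansible-collections/ansible.posix,main",
  )

  # require_collection() now returns a (found_version, collection_path) tuple,
  # or raises InvalidPrerequisiteError when install=False and nothing matches.
  found_version, collection_path = runtime.require_collection(
      name="community.molecule",
      version="0.1.0",
      install=False,
  )

  # search_galaxy_paths() is now a module-level helper that looks one level deep
  # and skips directories that are not valid galaxy namespaces.
  print(search_galaxy_paths(Path(".")))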
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/src/ansible_compat/schema.py 
new/ansible-compat-4.1.10/src/ansible_compat/schema.py
--- old/ansible-compat-4.1.5/src/ansible_compat/schema.py       2023-07-21 
12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/src/ansible_compat/schema.py      2023-09-06 
14:55:41.000000000 +0200
@@ -1,16 +1,19 @@
 """Utils for JSON Schema validation."""
+from __future__ import annotations
+
 import json
 from collections.abc import Mapping, Sequence
 from dataclasses import dataclass
-from typing import Union
+from typing import TYPE_CHECKING
 
 import jsonschema
 from jsonschema.validators import validator_for
 
-from ansible_compat.types import JSON
+if TYPE_CHECKING:
+    from ansible_compat.types import JSON
 
 
-def to_path(schema_path: Sequence[Union[str, int]]) -> str:
+def to_path(schema_path: Sequence[str | int]) -> str:
     """Flatten a path to a dot delimited string.
 
     :param schema_path: The schema path
@@ -19,7 +22,7 @@
     return ".".join(str(index) for index in schema_path)
 
 
-def json_path(absolute_path: Sequence[Union[str, int]]) -> str:
+def json_path(absolute_path: Sequence[str | int]) -> str:
     """Flatten a data path to a dot delimited string.
 
     :param absolute_path: The path
@@ -44,7 +47,7 @@
     data_path: str
     json_path: str
     message: str
-    expected: Union[bool, int, str]
+    expected: bool | int | str
     relative_schema: str
     validator: str
     found: str
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/ansible-compat-4.1.5/src/ansible_compat.egg-info/PKG-INFO 
new/ansible-compat-4.1.10/src/ansible_compat.egg-info/PKG-INFO
--- old/ansible-compat-4.1.5/src/ansible_compat.egg-info/PKG-INFO       
2023-07-21 12:58:43.000000000 +0200
+++ new/ansible-compat-4.1.10/src/ansible_compat.egg-info/PKG-INFO      
2023-09-06 14:56:06.000000000 +0200
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: ansible-compat
-Version: 4.1.5
+Version: 4.1.10
 Summary: Ansible compatibility goodies
 Author-email: Sorin Sbarnea <ssbar...@redhat.com>
 Maintainer-email: Sorin Sbarnea <ssbar...@redhat.com>
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/ansible-compat-4.1.5/src/ansible_compat.egg-info/SOURCES.txt 
new/ansible-compat-4.1.10/src/ansible_compat.egg-info/SOURCES.txt
--- old/ansible-compat-4.1.5/src/ansible_compat.egg-info/SOURCES.txt    
2023-07-21 12:58:43.000000000 +0200
+++ new/ansible-compat-4.1.10/src/ansible_compat.egg-info/SOURCES.txt   
2023-09-06 14:56:06.000000000 +0200
@@ -61,12 +61,15 @@
 test/test_prerun.py
 test/test_runtime.py
 test/test_runtime_example.py
+test/test_runtime_scan_path.py
 test/test_schema.py
 test/assets/requirements-invalid-collection.yml
 test/assets/requirements-invalid-role.yml
 test/assets/validate0_data.json
 test/assets/validate0_expected.json
 test/assets/validate0_schema.json
+test/assets/galaxy_paths/.bar/galaxy.yml
+test/assets/galaxy_paths/foo/galaxy.yml
 test/collections/acme.broken/galaxy.yml
 test/collections/acme.goodies/galaxy.yml
 test/collections/acme.goodies/molecule/default/converge.yml
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/ansible-compat-4.1.5/test/collections/acme.goodies/galaxy.yml 
new/ansible-compat-4.1.10/test/collections/acme.goodies/galaxy.yml
--- old/ansible-compat-4.1.5/test/collections/acme.goodies/galaxy.yml   
2023-07-21 12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/test/collections/acme.goodies/galaxy.yml  
2023-09-06 14:55:41.000000000 +0200
@@ -8,6 +8,7 @@
 dependencies:
   community.molecule: ">=0.1.0" # used to also test '=>' condition
   ansible.utils: "*" # used to also test '*'
+  git+https://github.com/ansible-collections/community.crypto.git: main # tests ability to install from git
 build_ignore:
   - "*.egg-info"
   - .DS_Store
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/test/conftest.py 
new/ansible-compat-4.1.10/test/conftest.py
--- old/ansible-compat-4.1.5/test/conftest.py   2023-07-21 12:58:17.000000000 
+0200
+++ new/ansible-compat-4.1.10/test/conftest.py  2023-09-06 14:55:41.000000000 
+0200
@@ -1,6 +1,12 @@
 """Pytest fixtures."""
+import importlib.metadata
+import json
 import pathlib
+import subprocess
+import sys
 from collections.abc import Generator
+from pathlib import Path
+from typing import Callable
 
 import pytest
 
@@ -26,3 +32,96 @@
     instance = Runtime(project_dir=tmp_path, isolated=True)
     yield instance
     instance.clean()
+
+
+def query_pkg_version(pkg: str) -> str:
+    """Get the version of a current installed package.
+
+    :param pkg: Package name
+    :return: Package version
+    """
+    return importlib.metadata.version(pkg)
+
+
+@pytest.fixture()
+def pkg_version() -> Callable[[str], str]:
+    """Get the version of a current installed package.
+
+    :return: Callable function to get package version
+    """
+    return query_pkg_version
+
+
+class VirtualEnvironment:
+    """Virtualenv wrapper."""
+
+    def __init__(self, path: Path) -> None:
+        """Initialize.
+
+        :param path: Path to virtualenv
+        """
+        self.project = path
+        self.venv_path = self.project / "venv"
+        self.venv_bin_path = self.venv_path / "bin"
+        self.venv_python_path = self.venv_bin_path / "python"
+
+    def create(self) -> None:
+        """Create virtualenv."""
+        cmd = [str(sys.executable), "-m", "venv", str(self.venv_path)]
+        subprocess.check_call(args=cmd)
+        # Install this package into the virtual environment
+        self.install(f"{__file__}/../..")
+
+    def install(self, *packages: str) -> None:
+        """Install packages in virtualenv.
+
+        :param packages: Packages to install
+        """
+        cmd = [str(self.venv_python_path), "-m", "pip", "install", *packages]
+        subprocess.check_call(args=cmd)
+
+    def python_script_run(self, script: str) -> subprocess.CompletedProcess[str]:
+        """Run command in project dir using venv.
+
+        :param args: Command to run
+        """
+        proc = subprocess.run(
+            args=[self.venv_python_path, "-c", script],
+            capture_output=True,
+            cwd=self.project,
+            check=False,
+            text=True,
+        )
+        return proc
+
+    def site_package_dirs(self) -> list[Path]:
+        """Get site packages.
+
+        :return: List of site packages dirs
+        """
+        script = "import json, site; print(json.dumps(site.getsitepackages()))"
+        proc = subprocess.run(
+            args=[self.venv_python_path, "-c", script],
+            capture_output=True,
+            check=False,
+            text=True,
+        )
+        dirs = json.loads(proc.stdout)
+        if not isinstance(dirs, list):
+            msg = "Expected list of site packages"
+            raise TypeError(msg)
+        sanitized = list({Path(d).resolve() for d in dirs})
+        return sanitized
+
+
+@pytest.fixture(scope="module")
+def venv_module(tmp_path_factory: pytest.TempPathFactory) -> VirtualEnvironment:
+    """Create a virtualenv in a temporary directory.
+
+    :param tmp_path: pytest fixture for temp path
+    :return: VirtualEnvironment instance
+    """
+    test_project = tmp_path_factory.mktemp(basename="test_project-", numbered=True)
+    _venv = VirtualEnvironment(test_project)
+    _venv.create()
+    return _venv
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/test/test_prerun.py 
new/ansible-compat-4.1.10/test/test_prerun.py
--- old/ansible-compat-4.1.5/test/test_prerun.py        2023-07-21 
12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/test/test_prerun.py       2023-09-06 
14:55:41.000000000 +0200
@@ -6,6 +6,6 @@
 
 def test_get_cache_dir_relative() -> None:
     """Test behaviors of get_cache_dir."""
-    relative_path = Path(".")
+    relative_path = Path()
     abs_path = relative_path.resolve()
     assert get_cache_dir(relative_path) == get_cache_dir(abs_path)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/test/test_runtime.py 
new/ansible-compat-4.1.10/test/test_runtime.py
--- old/ansible-compat-4.1.5/test/test_runtime.py       2023-07-21 
12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/test/test_runtime.py      2023-09-06 
14:55:41.000000000 +0200
@@ -1,19 +1,18 @@
 """Tests for Runtime class."""
 # pylint: disable=protected-access
+from __future__ import annotations
+
 import logging
 import os
 import pathlib
 import subprocess
-from collections.abc import Iterator
 from contextlib import contextmanager
 from pathlib import Path
 from shutil import rmtree
-from typing import Any, Union
+from typing import TYPE_CHECKING, Any
 
 import pytest
-from _pytest.monkeypatch import MonkeyPatch
 from packaging.version import Version
-from pytest_mock import MockerFixture
 
 from ansible_compat.config import ansible_version
 from ansible_compat.constants import INVALID_PREREQUISITES_RC
@@ -22,7 +21,18 @@
     AnsibleCompatError,
     InvalidPrerequisiteError,
 )
-from ansible_compat.runtime import CompletedProcess, Runtime
+from ansible_compat.runtime import (
+    CompletedProcess,
+    Runtime,
+    is_url,
+    search_galaxy_paths,
+)
+
+if TYPE_CHECKING:
+    from collections.abc import Iterator
+
+    from _pytest.monkeypatch import MonkeyPatch
+    from pytest_mock import MockerFixture
 
 
 def test_runtime_version(runtime: Runtime) -> None:
@@ -466,16 +476,27 @@
     runtime.install_collection("examples/reqs_v2/community-molecule-0.1.0.tar.gz")
 
 
+def test_install_collection_git(runtime: Runtime) -> None:
+    """Check that valid collection installs do not fail."""
+    runtime.install_collection(
+        "git+https://github.com/ansible-collections/ansible.posix,main";,
+    )
+
+
 def test_install_collection_dest(runtime: Runtime, tmp_path: pathlib.Path) -> None:
     """Check that valid collection to custom destination passes."""
+    # Since Ansible 2.15.3 there is no guarantee that this will install the collection at requested path
+    # as it might decide to not install anything if requirement is already present at another location.
     runtime.install_collection(
         "examples/reqs_v2/community-molecule-0.1.0.tar.gz",
         destination=tmp_path,
     )
-    expected_file = (
-        tmp_path / "ansible_collections" / "community" / "molecule" / "MANIFEST.json"
-    )
-    assert expected_file.is_file()
+    runtime.load_collections()
+    for collection in runtime.collections:
+        if collection == "community.molecule":
+            return
+    msg = "Failed to find collection as installed."
+    raise AssertionError(msg)
 
 
 def test_install_collection_fail(runtime: Runtime) -> None:
@@ -633,8 +654,8 @@
     ids=("1", "2", "3", "4", "5"),
 )
 def test_runtime_version_in_range(
-    lower: Union[str, None],
-    upper: Union[str, None],
+    lower: str | None,
+    upper: str | None,
     expected: bool,
 ) -> None:
     """Validate functioning of version_in_range."""
@@ -652,6 +673,7 @@
                 "ansible.posix",  # from tests/requirements.yml
                 "ansible.utils",  # from galaxy.yml
                 "community.molecule",  # from galaxy.yml
+                "community.crypto",  # from galaxy.yml as a git dependency
             ],
             id="normal",
         ),
@@ -789,3 +811,53 @@
         assert "ansible.builtin.free" in runtime.plugins.strategy
         assert "ansible.builtin.is_abs" in runtime.plugins.test
         assert "ansible.builtin.bool" in runtime.plugins.filter
+
+
+@pytest.mark.parametrize(
+    ("path", "result"),
+    (
+        pytest.param(
+            "test/assets/galaxy_paths",
+            ["test/assets/galaxy_paths/foo/galaxy.yml"],
+            id="1",
+        ),
+        pytest.param(
+            "test/collections",
+            [],  # should find nothing because these folders are not valid namespaces
+            id="2",
+        ),
+        pytest.param(
+            "test/assets/galaxy_paths/foo",
+            ["test/assets/galaxy_paths/foo/galaxy.yml"],
+            id="3",
+        ),
+    ),
+)
+def test_galaxy_path(path: str, result: list[str]) -> None:
+    """Check behavior of galaxy path search."""
+    assert search_galaxy_paths(Path(path)) == result
+
+
+@pytest.mark.parametrize(
+    ("name", "result"),
+    (
+        pytest.param(
+            "foo",
+            False,
+            id="0",
+        ),
+        pytest.param(
+            "git+git",
+            True,
+            id="1",
+        ),
+        pytest.param(
+            "g...@acme.com",
+            True,
+            id="2",
+        ),
+    ),
+)
+def test_is_url(name: str, result: bool) -> None:
+    """Checks functionality of is_url."""
+    assert is_url(name) == result
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/test/test_runtime_scan_path.py 
new/ansible-compat-4.1.10/test/test_runtime_scan_path.py
--- old/ansible-compat-4.1.5/test/test_runtime_scan_path.py     1970-01-01 
01:00:00.000000000 +0100
+++ new/ansible-compat-4.1.10/test/test_runtime_scan_path.py    2023-09-06 
14:55:41.000000000 +0200
@@ -0,0 +1,105 @@
+"""Test the scan path functionality of the runtime."""
+
+import json
+import textwrap
+from dataclasses import dataclass, fields
+from pathlib import Path
+
+import pytest
+from _pytest.monkeypatch import MonkeyPatch
+
+from ansible_compat.runtime import Runtime
+
+from .conftest import VirtualEnvironment
+
+V2_COLLECTION_TARBALL = Path("examples/reqs_v2/community-molecule-0.1.0.tar.gz")
+V2_COLLECTION_NAMESPACE = "community"
+V2_COLLECTION_NAME = "molecule"
+V2_COLLECTION_VERSION = "0.1.0"
+V2_COLLECTION_FULL_NAME = f"{V2_COLLECTION_NAMESPACE}.{V2_COLLECTION_NAME}"
+
+
+@dataclass
+class ScanSysPath:
+    """Parameters for scan tests."""
+
+    isolated: bool
+    scan: bool
+    raises_not_found: bool
+
+    def __str__(self) -> str:
+        """Return a string representation of the object."""
+        parts = [
+            f"{field.name}{str(getattr(self, field.name))[0]}" for field in 
fields(self)
+        ]
+        return "-".join(parts)
+
+
+@pytest.mark.parametrize(
+    ("param"),
+    (
+        ScanSysPath(isolated=True, scan=True, raises_not_found=True),
+        ScanSysPath(isolated=True, scan=False, raises_not_found=True),
+        ScanSysPath(isolated=False, scan=True, raises_not_found=False),
+        ScanSysPath(isolated=False, scan=False, raises_not_found=True),
+    ),
+    ids=str,
+)
+def test_scan_sys_path(
+    venv_module: VirtualEnvironment,
+    monkeypatch: MonkeyPatch,
+    runtime_tmp: Runtime,
+    tmp_path: Path,
+    param: ScanSysPath,
+) -> None:
+    """Confirm sys path is scanned for collections.
+
+    :param venv_module: Fixture for a virtual environment
+    :param monkeypatch: Fixture for monkeypatching
+    :param runtime_tmp: Fixture for a Runtime object
+    :param tmp_dir: Fixture for a temporary directory
+    :param param: The parameters for the test
+    """
+    first_site_package_dir = venv_module.site_package_dirs()[0]
+
+    installed_to = (
+        first_site_package_dir
+        / "ansible_collections"
+        / V2_COLLECTION_NAMESPACE
+        / V2_COLLECTION_NAME
+    )
+    if not installed_to.exists():
+        # Install the collection into the venv site packages directory, force
+        # as of yet this test is not isolated from the rest of the system
+        runtime_tmp.install_collection(
+            collection=V2_COLLECTION_TARBALL,
+            destination=first_site_package_dir,
+            force=True,
+        )
+    # Confirm the collection is installed
+    assert installed_to.exists()
+    # Set the sys scan path environment variable
+    monkeypatch.setenv("ANSIBLE_COLLECTIONS_SCAN_SYS_PATH", str(param.scan))
+    # Set the ansible collections paths to avoid bleed from other tests
+    monkeypatch.setenv("ANSIBLE_COLLECTIONS_PATH", str(tmp_path))
+
+    script = textwrap.dedent(
+        f"""
+    import json;
+    from ansible_compat.runtime import Runtime;
+    r = Runtime(isolated={param.isolated});
+    fv, cp = r.require_collection(name="{V2_COLLECTION_FULL_NAME}", version="{V2_COLLECTION_VERSION}", install=False);
+    print(json.dumps({{"found_version": str(fv), "collection_path": str(cp)}}));
+    """,
+    )
+
+    proc = venv_module.python_script_run(script)
+    if param.raises_not_found:
+        assert proc.returncode != 0, (proc.stdout, proc.stderr)
+        assert "InvalidPrerequisiteError" in proc.stderr
+        assert "'community.molecule' not found" in proc.stderr
+    else:
+        assert proc.returncode == 0, (proc.stdout, proc.stderr)
+        result = json.loads(proc.stdout)
+        assert result["found_version"] == V2_COLLECTION_VERSION
+        assert result["collection_path"] == str(installed_to)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/ansible-compat-4.1.5/tox.ini 
new/ansible-compat-4.1.10/tox.ini
--- old/ansible-compat-4.1.5/tox.ini    2023-07-21 12:58:17.000000000 +0200
+++ new/ansible-compat-4.1.10/tox.ini   2023-09-06 14:55:41.000000000 +0200
@@ -71,7 +71,7 @@
   PIP_DISABLE_PIP_VERSION_CHECK = 1
   PIP_CONSTRAINT = {toxinidir}/requirements.txt
   PRE_COMMIT_COLOR = always
-  PYTEST_REQPASS = 82
+  PYTEST_REQPASS = 93
   FORCE_COLOR = 1
 allowlist_externals =
   ansible
@@ -154,3 +154,22 @@
   mkdocs {posargs:build} --strict
 extras = docs
 passenv = *
+
+[testenv:smoke]
+description = Run ansible-lint own testing with current code from compat library
+commands_pre =
+  ansible localhost -m ansible.builtin.git -a 'repo=https://github.com/ansible/ansible-lint dest={envdir}/tmp/ansible-lint'
+  pip install -e "{envdir}/tmp/ansible-lint[test]"
+commands =
+  bash -c "pip freeze|grep ansible"
+  pytest -k role
+deps =
+  ansible-core
+setenv =
+  {[testenv]setenv}
+  PIP_CONSTRAINT = /dev/null
+  PYTEST_REQPASS = 0
+changedir = {envdir}/tmp/ansible-lint
+allowlist_externals =
+  pwd
+  bash
