Hi,
On 18 Aug 2023 at 13:16:15, Paul Boddie wrote:

> On Friday, 18 August 2023 09:51:29 CEST Carles Pina i Estany wrote:
> >
> > I'm not a Debian developer but I have some experience with Salsa CI,
> > so I thought that I might be able to help... but then I was confused
> > by a specific part of the message:
> >
> > On 17 Aug 2023 at 17:10:08, Paul Boddie wrote:
> >
> > [...]
> >
> > > For another package I have been working on, the Salsa CI facility
> > > has proven to be usable, configured using files in debian/test,
> > > particularly as it allows test-related dependencies to be specified
> > > independently. However, this other package has no dependencies that
> > > are currently unpackaged in Debian. Meanwhile, the testing of this
> > > new Moin package depends on brand new packages somehow being made
> > > available.
> >
> > If these dependencies are available in the "build" step: could they
> > be made available in the autopkgtest? I didn't quite understand why
> > this is not possible. I've found autopkgtest quite flexible (since
> > the tests are scripts that can prepare the environment).
>
> The package has dependencies on installation but these dependencies
> are not strictly necessary when building. However, if I wanted to run
> the test suite when building, I would indeed need to pull in these
> dependencies as build dependencies so that the software being tested
> can run without import errors.

I have the same situation on this side. Two solutions for two places.
Debusine uses the https://salsa.debian.org/salsa-ci-team/pipeline
approach (extending it).

In our repo I have two solutions (or hacks, or workarounds) for this
situation.

AUTOPKGTEST TIME (after build)
==============================

In the autopkgtest job (from salsa-ci/pipeline), autopkgtest runs a
shell script (integration-tests-generic, below).
Some of them set things up:

----------
Tests: integration-tests-generic
Depends: debusine-client, debusine-server, debusine-worker, postgresql,
 redis-server, sudo, nginx
Restrictions: allow-stderr, needs-root
----------

It could pull things from the internet (PyPI, git...) if it also has
"needs-internet" in Restrictions.

Alternatively (and perhaps better, to keep things separated?), you
could use the Salsa CI variable SALSA_CI_AUTOPKGTEST_ARGS with this
format:

SALSA_CI_AUTOPKGTEST_ARGS: '--setup-commands=ci/pin-django-from-backports.sh'

Instead of pinning, you could install packages. I don't know if that
needs "needs-internet" in Restrictions.

Note that the debusine-* packages are made available automatically
thanks to the Salsa CI pipeline, and that tests-autopkgtest has:

------
needs:
  - job: build
    artifacts: true
------

So I don't need to set up any repository. The build job just creates
the *.deb packages (in a specific directory?), they are saved as
artifacts and made available to the "autopkgtest" job.

UNIT TESTS (before build)
=========================

We have a job before build doing (I'm simplifying):

-----
unit-tests:
  stage: upstream-tests
  dependencies: []
  script:
    - NON_INTERACTIVE=1 bin/quick-setup.sh install_packages
    - make coverage  # to run the tests with coverage
-----

"bin/quick-setup.sh" installs the required packages (some from Debian,
some from PyPI if need be).

Here we execute the tests before packaging (because we are upstream as
well).

> I have to add that the other package I refer to has a test suite that
> takes a long time to run, so that is another reason why I chose Salsa
> CI for that package instead of letting autopkgtest do its work:

Note that when I say "autopkgtest" above I always mean autopkgtest run
by Salsa CI.

> One can imagine having a common storage area holding these newly

Until now I see "the whole internet" as the "common storage area" (I
know, not ideal! It could be closer to the process, more robust,
faster, etc.)
> introduced packages that the CI scripts could access in preference to
> the usual archives. In fact, this would be something that might also
> affect existing packages. Consider the situation where fixes to a
> dependency are required to fix functionality in a particular package.
> One would have to wait for the fixed dependency to become integrated
> into the unstable archive before the principal package's tests would
> start to work again.

Semi-offtopic: Mmmm... Debusine is the project that I mentioned before,
where I implemented different things in .gitlab-ci.yml. What you
mention here is something that debusine might help with in the future
(not implemented yet). But other people can talk better about this than
me... and it's not available right now.

> > I also created a repo and hosted it on a Salsa CI page for internal
> > testing but this is a bit of a workaround. This is a new job that
> > just downloads the artifacts (via a Salsa CI dependency), runs
> > dpkg-scanpackages and copies the files to the right place.
>
> This sounds like something related to what might be required. In
> effect, you seem to be doing what I am doing when I actually install
> my built packages in a chroot. I run apt-ftparchive (which runs
> dpkg-scanpackages) to generate Packages, Sources and Release files
> that tell apt about the new packages when added as another package
> source.

Making the packages available on GitLab Pages is a workaround to make
them available to end users via a repository. I'm happy to discuss it,
but it should not be needed for the tests (or I didn't need it for the
tests).
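As a rough sketch of that dpkg-scanpackages approach (all paths, and the
"build-output" directory, are illustrative assumptions, not taken from
any of our pipelines):

```shell
#!/bin/sh
# Sketch: turn a directory of freshly built .debs into a local apt source.
# "build-output" and all paths are illustrative; in a real job you would
# write the sources entry to /etc/apt/sources.list.d/ as root and then
# run "apt-get update".
set -e

REPO_DIR="${REPO_DIR:-$PWD/local-repo}"
mkdir -p "$REPO_DIR"

# 1. Collect the build artifacts (the path depends on the CI configuration):
cp build-output/*.deb "$REPO_DIR"/ 2>/dev/null || true

# 2. Generate the Packages index that apt reads (needs dpkg-dev):
if command -v dpkg-scanpackages >/dev/null 2>&1; then
    (cd "$REPO_DIR" && dpkg-scanpackages --multiversion . > Packages) || true
else
    : > "$REPO_DIR/Packages"   # placeholder when dpkg-dev is not installed
fi

# 3. Write an apt source entry pointing at the local directory:
echo "deb [trusted=yes] file:$REPO_DIR ./" > local-repo.list
```

After installing local-repo.list into /etc/apt/sources.list.d/ and
running "apt-get update", the freshly built packages become installable
like anything from the archive.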
For the jobs, it is happening via
https://salsa.debian.org/salsa-ci-team/pipeline/#using-automatically-built-apt-repository

> In the Salsa CI environment, I would need to have the built packages
> (found in the artefacts for each package's build job) copied somewhere
> that can then be found by the Moin package's pipeline jobs and the
> scripts creating a special repository of new packages.

Archiving the artifacts should happen automatically in the "build" step
of Salsa CI (salsa-ci/pipeline). If I understand correctly what you
wanted...

Cheers,

-- 
Carles Pina i Estany
https://carles.pina.cat || Wiktionary translations: https://kamus.pina.cat