Bruno Haible <br...@clisp.org> writes:

> By now, I've completed a "many platforms" CI for several GNU packages:
>
> https://github.com/gnu-gnulib/ci-testdir-check/actions
> https://github.com/gnu-diffutils/ci-check/actions
> https://github.com/gnu-gettext/ci-check/actions
> https://github.com/gnu-libsigsegv/ci-check/actions
> https://github.com/gnu-libunistring/ci-check/actions
> https://github.com/gnu-m4/ci-check/actions
> https://github.com/gnu-sed/ci-check/actions
>
> And while so far I've been satisfied with writing these .yml files manually,
> it would be better if they all were generated by some script. The reason is:
>
>   * Adding a new platform, or making some other change, implies a lot of
>     copying.
>
>   * Applying the template to a new package requires looking at existing
>     CI actions. For example, C++ is used in 2-3 packages. If a new package
>     needs C++, one needs to find one of the packages that already uses C++
>     and then copy the relevant pieces. Which is more tedious than just
>     writing "use-c++: yes" in some place.
>
> I'm not aware of a template system at this level. I therefore plan to go
> with a (Python?) script that generates the .yml file from some specification.

I'm also annoyed by all this repetition, and so far I've likewise settled
for copying code between the files and keeping them in sync manually.

I haven't found a way to easily abstract things in the GitLab CI/CD
YAML language to make an 'include' approach workable.  I'm fairly
certain that it is possible, though.  That would 1) be a pipeline-native
approach, avoiding the dependency on a Python script to generate the
GitLab CI/CD scripts, and, maybe more importantly, 2) simplify
committing updates of the generated scripts whenever the generation
framework changes.
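
For the record, what I was imagining with 'include' is roughly the
following (untested sketch; the 'gnu/ci-templates' project path and the
template file name are made up):

    # .gitlab-ci.yml in each package, pulling shared job definitions
    # from a hypothetical common repository.
    include:
      - project: 'gnu/ci-templates'        # made-up project path
        ref: master
        file: '/templates/autotools.yml'

    # Package-specific settings can still extend the shared (hidden) jobs.
    check-debian:
      extends: .check-autotools
      image: debian:stable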

How do you intend to make the output of your Python script available
for consumption by GitLab pipelines?  A simple approach is to run it
locally and commit all the generated files to git.  Is that what you had
in mind?

Another idea is to run the script during the pipeline and (at least for
GitLab) use a sub-pipeline that takes the generated CI configuration and
runs it as a child pipeline.  This avoids committing generated files.
It adds a bit of complexity and debugging effort, but can work fine.
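
In GitLab terms, that could be a dynamic child pipeline, roughly like
this (untested sketch; 'gen-ci.py' stands in for whatever generator
script Bruno ends up writing):

    # One job generates the YAML; a trigger job runs it as a child
    # pipeline, so nothing generated needs to be committed to git.
    generate-pipeline:
      stage: build
      image: python:3
      script:
        - ./gen-ci.py > generated-pipeline.yml    # hypothetical generator
      artifacts:
        paths:
          - generated-pipeline.yml

    run-pipeline:
      stage: test
      trigger:
        include:
          - artifact: generated-pipeline.yml
            job: generate-pipeline
        strategy: depend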

Maybe there are other approaches too?

I'm not that familiar with GitHub Actions, but I assume most of the
concepts are the same...

Eric Blake <ebl...@redhat.com> writes:

> Have you looked at libvirt-ci[1]?  Several projects (among others:
> libvirt, qemu, libnbd) use this for templating CI that works across
> both gitlab and github (preferring to run CI under gitlab as it is
> less invasive of freedoms, but taking advantage of github's access to
> MacOS testing).
>
> [1] https://gitlab.com/libvirt/libvirt-ci/
>
> Among other things, the lcitool program inside that package makes it
> possible to write a fairly stream-lined description of general package
> names that are required, then looks up in a database the correct
> spellings of how to pull in those packages across various distros
> present in CI systems, so that I end up modifying only one .yml file
> instead of one per platform when I need to tweak the CI setup to track
> a change in various distro bleeding edge.

Great pointer, thank you!  That's one of the challenges I had.  This
database of package name mappings seems like a generally re-usable
concept.  Is anyone aware of any effort to address this as a general
problem, rather than as a small component of a larger work?
Extending Bruno's example to include macOS Homebrew too, it would look
like this:

    readline:
      apk: readline-dev
      deb: libreadline-dev
      rpm: readline-devel
      homebrew: readline

Maybe it should not assume that the consumer always wants developer
packages: sometimes you want to install what an end user would have on
their system, for testing purposes.  So it could be:

    readline:
      apk: readline
      deb: libreadline8
      rpm: readline
      homebrew: readline

    devel-readline:
      apk: readline-dev
      deb: libreadline-dev
      rpm: readline-devel
      homebrew: readline

I think you'll also quickly discover that different releases use
different package names, so something like this may be useful:

    libmailutils:
      deb: libmailutils9
      deb_stretch: libmailutils6

And the tools could use "deb_$(lsb_release -s -c)" before "deb".
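In a CI job that could amount to something like the following untested
sketch (assuming the mapping lives in a packages.yml file, and that a
jq-style 'yq' tool and lsb_release are available in the image):

    install-mailutils:
      image: debian:stable
      script:
        # Prefer a release-specific key such as "deb_stretch", falling
        # back to the generic "deb" entry (names here are hypothetical).
        - codename=$(lsb_release -s -c)
        - pkg=$(yq ".libmailutils.deb_${codename} // .libmailutils.deb" packages.yml)
        - apt-get update && apt-get install -y "$pkg"
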
Thoughts?

Btw, GitLab has had macOS shared runners for quite some time now, and
I've found them stable.  The Windows runners have been less stable, but
for the past few months they have been working for me.

/Simon
