Simon Josefsson via Bug reports for the GNU Internet utilities
<[email protected]> writes:

> It is usually done during the release process.  Maybe we should prepare
> for a November/December release of inetutils?  There is nothing urgent
> in git, I think, but a steady pace of releases seems to encourage review
> and contributions.

That sounds good to me. Even if there is nothing in Inetutils git,
Gnulib fixes could be nice to include in a release.

> We use www.gnu.org/s/inetutils/manual/ for distribution of manuals, but
> for about a month or so www.gnu.org is completely unreachable for me (my
> IP has probably been banned) and there are two things I would want to
> have improved:
>
> 1) use git instead of CVS

I don't think that will happen anytime soon on gnu.org, but not having
to figure out 'cvs' every time I use it (once a year) would be nice.

> 2) find a mechanism to provide versioned manual so that
>
> www.whatever/inetutils/manual/2.4/
> www.whatever/inetutils/manual/2.5/
> www.whatever/inetutils/manual/latest/
>
> all work and point to the corresponding version.  It sometimes helps to
> be able to reference documentation from earlier versions.

Yep, that can be helpful. glibc does something like that [1].
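The latest/ alias is often just a symlink to the newest version's
directory.  A minimal sketch -- the paths and version numbers here are
made up for illustration, not the actual layout:

```shell
# Sketch: versioned manual tree where latest/ is a symlink.
# Directory names and versions are placeholders.
root=$(mktemp -d)
mkdir -p "$root/inetutils/manual/2.4" "$root/inetutils/manual/2.5"
# Relative symlink, so the whole tree can be mirrored or moved wholesale.
ln -sfn 2.5 "$root/inetutils/manual/latest"
readlink "$root/inetutils/manual/latest"   # -> 2.5
```

Bumping latest/ on each release is then a single ln -sfn, with no URL
changes for the older versions.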

> I'm mixed between two ways of making that happen:
>
> A) Store generated versioned outputs in git in a sub-directory for each
> release, and then normally never touch those files.
>
> Pro:
> - Stable links - unless changed in git, everything stays the same.
> - Low CPU consumption
>
> Con:
> - Harder to achieve output format consistency between versions
> - The output does not gain fixes from newer toolchains (texinfo etc)
> - Bitrot - will we ever be able to re-generate the same outputs?
>
> B) Write scripts that check out all releases and re-generate the
> documentation tree.
>
> Pro:
> - Toolchain fixes end up improving the outputs
> - We defend against bitrot in the generation process
> - Easier to implement cross-version consistency
>
> Con:
> - Risk that URLs stop working if the toolchain breaks URL stability
> guarantees
> - Waste of CPU re-generating the outputs all the time
> - Risk of having to spend time debugging the scripts that generate the
> documentation
>
> Of course, it would actually be possible to do both (regularly
> re-generate the documentation but store the outputs statically in a git
> repo), which may mitigate some of the downsides of each approach.
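
Option B could be close to a loop over release tags.  Here is a rough,
self-contained sketch: it builds a throwaway repo with two tagged
"releases" so it can run anywhere, and uses a trivial cp as a stand-in
generator -- a real script would clone inetutils and run makeinfo on
the Texinfo sources instead.  All names and paths are assumptions:

```shell
# Sketch of option B: regenerate a versioned manual tree from tags.
set -e
work=$(mktemp -d)
repo=$work/repo
out=$work/manual

# --- throwaway stand-in for the inetutils repository ---
git init -q "$repo"
git -C "$repo" config user.email demo@example.org
git -C "$repo" config user.name demo
for ver in 2.4 2.5; do
    echo "inetutils $ver manual source" > "$repo/manual.txt"
    git -C "$repo" add manual.txt
    git -C "$repo" commit -qm "release $ver"
    git -C "$repo" tag "v$ver"
done

# --- the actual regeneration loop ---
for tag in $(git -C "$repo" tag --list 'v*'); do
    ver=${tag#v}
    git -C "$repo" checkout -q "$tag"
    mkdir -p "$out/$ver"
    # Real script: makeinfo --html --output="$out/$ver" doc/inetutils.texi
    cp "$repo/manual.txt" "$out/$ver/index.txt"
done

# Point latest/ at the newest generated version.
ln -sfn 2.5 "$out/latest"
```

Running this from cron (or CI) and committing $out to a git repo would
give the combined approach: regular regeneration with static storage.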
>
> We could set this up to generate static manuals on codeberg, which would
> support the above more easily, and then keep uploading the latest
> version using CVS to www.gnu.org.  Or just an HTTP redirect in CVS to
> codeberg pages to avoid having to support both publication methods.  I
> think some of the guile/guix stuff recently started to use this.
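
The redirect route might be as little as one directive kept in CVS --
assuming www.gnu.org's Apache setup honors Redirect in per-directory
files here (I have not verified that), and with a placeholder codeberg
pages URL:

```apache
# Hypothetical .htaccess: send the gnu.org manual path to codeberg pages.
# The target URL is a placeholder, not a real deployment.
Redirect permanent /software/inetutils/manual/ https://inetutils.codeberg.page/manual/
```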

I purchased the inetutils.org domain for safekeeping, in case we need
one for hosting the documentation elsewhere.

Collin

[1] https://sourceware.org/glibc/manual/
