On 8/6/23 00:25, Carles Pina i Estany wrote:
When I say "long time" (and data transmission) in my case it's 9
minutes:
-----
carles@pinux:[master]~/git/wget2$ time git submodule update --init
Cloning into '/home/carles/git/wget2/gnulib'...
Submodule path 'gnulib': checked out '2ae6faf9c78384dc6a2674b62dd56ff153cd51f6'

real    9m1,135s
user    6m24,309s
sys         0m5,020s
-----

Not answering your question, but this may be helpful:

If you regularly build wget2 from git, you only have to download gnulib once. When the project updates the gnulib submodule (which happens from time to time), git downloads only the missing objects, which should be fast even on slow network connections.
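As a sketch, the routine after the initial clone is just this (the path is only an example):

```shell
cd ~/git/wget2
git pull                 # picks up the new submodule commit reference
git submodule update     # fetches only the missing gnulib objects and checks them out
```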

Another option is to git clone gnulib into a separate directory outside the project directory and set the environment variable GNULIB_REFDIR to that directory (e.g. "export GNULIB_REFDIR=/home/carles/git/gnulib"). Update it now and then with "git pull" from inside that gnulib directory.
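The setup for this could look like the following (the paths are only examples; the Savannah URL is gnulib's usual upstream):

```shell
# One-time: clone gnulib outside the project tree
git clone https://git.savannah.gnu.org/git/gnulib.git ~/git/gnulib

# Tell wget2's ./bootstrap where to find the reference clone
export GNULIB_REFDIR=~/git/gnulib

# Now and then, refresh the reference clone
git -C ~/git/gnulib pull
```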

The `./bootstrap` script in the wget2 project then fetches the needed gnulib commits from $GNULIB_REFDIR.

To speed things up in container CI environments:
If containers are only used once, git clone gnulib at image creation time and run "rmdir gnulib && mv /gnulib . && git submodule update gnulib" in the container. This is still experimental; I only started using it yesterday, but haven't seen any downsides so far.
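Spelled out, that sequence could look like this (assuming gnulib was cloned to /gnulib at image build time; --init is added in case the submodule has not been initialized in the fresh checkout yet):

```shell
# Image build time (one-off), e.g. in the Dockerfile:
#   RUN git clone https://git.savannah.gnu.org/git/gnulib.git /gnulib

# Container run time, inside the fresh wget2 checkout:
rmdir gnulib                        # the submodule dir is empty before init
mv /gnulib gnulib                   # reuse the pre-cloned copy
git submodule update --init gnulib  # check out the commit wget2 expects
```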

Regards, Tim
