Hi,

On Wed, May 27, 2020 at 10:58:55AM +0200, Mans Zigher wrote:
> This is maybe more related to bitbake but I will start by posting it here.
> I am for the first time trying to make use of a distributed sstate
> cache but I am getting some unexpected results and wanted to hear if
> my expectations are wrong. Everything works as expected when a build
> node is using an sstate cache from itself, so I do a clean build and
> upload the sstate cache from that build to our mirror. If I then do a
> complete build using the mirror I get a 99% hit rate, which is what I
> would expect. If I then start a build on a different node using the
> same cache I am only getting a 16% hit rate. I am running the builds
> inside docker so the environment should be identical. We have several
> build nodes in our CI; they were actually cloned and all of them
> have the same HW. They are all running the builds in docker, but it
> looks like they can share the sstate cache and still get a 99% hit
> rate. This suggests to me that the hit rate for the sstate cache is
> node-dependent, so a cache cannot actually be shared between different
> nodes, which is not what I expected. I have not been able to find any
> information about this limitation. Any clarification regarding what to
> expect from the sstate cache would be appreciated.

We do something similar, except we rsync an sstate mirror to build
nodes from the latest release before a build (and topics from gerrit
are merged to the latest release too, to avoid the sstate cache and build
tree getting too far out of sync).
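As a sketch of that setup (the host name and paths here are hypothetical,
adjust to your mirror), the pre-build sync plus the matching local.conf
entry look roughly like this:

```shell
# Hypothetical mirror host and local path: pull the shared sstate cache
# onto the build node before starting bitbake.
rsync -a --delete rsync://sstate-host/sstate-cache/ /srv/sstate-cache/

# Then in conf/local.conf, point bitbake at the local copy.
# PATH is a literal placeholder that bitbake substitutes itself:
#   SSTATE_MIRRORS = "file://.* file:///srv/sstate-cache/PATH"
```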

bitbake-diffsigs can tell you why things get rebuilt. The answers
should be there.
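For example (the recipe name here is just an illustration), you can ask
bitbake-diffsigs to compare the latest two signatures of a task, or diff
two siginfo files from the two nodes directly:

```shell
# Compare the last two signatures recorded for a task, e.g. openssl's
# do_configure, to see which variable or dependency changed:
bitbake-diffsigs -t openssl do_configure

# Or diff two specific siginfo files (paths are hypothetical), e.g. one
# from each build node's sstate/stamps area:
bitbake-diffsigs node-a.do_configure.siginfo node-b.do_configure.siginfo
```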

Also note that docker images are not reproducible by default
and might end up having different patch versions of openssl etc.
depending on who built them and when. One way to work around this
is to use e.g. snapshot.debian.org repos for Debian containers,
with a timestamped state of the full package repo used to generate
the container. I've done something similar but manually, on top of
debootstrap, to create a build rootfs tarball for lxc.
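A minimal sketch of such pinning (the timestamp below is only an
example; pick the snapshot date you want to freeze on):

```shell
# Replace the container's apt sources with a fixed snapshot.debian.org
# state so every node installs byte-identical package versions.
cat > /etc/apt/sources.list <<'EOF'
deb http://snapshot.debian.org/archive/debian/20200527T000000Z/ buster main
deb http://snapshot.debian.org/archive/debian-security/20200527T000000Z/ buster/updates main
EOF

# Snapshot metadata is old by definition, so disable the validity check:
apt-get -o Acquire::Check-Valid-Until=false update
```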

Hope this helps,

-Mikko