wor...@alum.mit.edu (Dale R. Worley) writes:

> builtin/gc.c:
> static int too_many_packs(void)
> {
>       struct packed_git *p;
>       int cnt;
>
>       if (gc_auto_pack_limit <= 0)
>               return 0;
>
>       prepare_packed_git();
>       for (cnt = 0, p = packed_git; p; p = p->next) {
>               if (!p->pack_local)
>                       continue;
>               if (p->pack_keep)
>                       continue;
>               /*
>                * Perhaps check the size of the pack and count only
>                * very small ones here?
>                */
>               cnt++;
>       }
>       return gc_auto_pack_limit <= cnt;
> }
>
> Yes, perhaps you *should* check the size of the pack!
>
> What is a good strategy for making this function behave as we want it to?

Whoever decides the details of "as we want it to" gets to decide
;-).

I think what we want is a mode where we repack only loose objects
and "small" packs by concatenating them into a single "large" one
(with deduping of base objects, the total becomes smaller than the
sum), while leaving existing "large" ones alone.  Daily repacking
would then just coalesce new objects into a "current" pack that
grows gradually; at some point it stops growing and joins the
longer-term "large" ones, until a full gc is done to optimize the
overall history traversal, or something like that.

But if your definition of the boundary between "small" and "large"
is unreasonably low (and/or your definition of "too many" is
unreasonably small), you will always have the problem you found.
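To make the idea concrete, here is a minimal standalone sketch of
counting only packs below a size cutoff toward the auto-gc limit.
The struct, the `pack_size` field, and `small_pack_threshold` are
illustrative stand-ins of my own, not the real packed_git machinery:

```c
#include <stddef.h>

/* Hypothetical stand-in for git's struct packed_git; pack_size and
 * small_pack_threshold are assumptions for this sketch. */
struct pack {
	struct pack *next;
	int pack_local;
	int pack_keep;
	long pack_size;	/* on-disk size in bytes */
};

static long small_pack_threshold = 64L * 1024 * 1024;	/* assumed cutoff */
static int gc_auto_pack_limit = 2;

static int too_many_small_packs(struct pack *packs)
{
	struct pack *p;
	int cnt = 0;

	if (gc_auto_pack_limit <= 0)
		return 0;

	for (p = packs; p; p = p->next) {
		if (!p->pack_local)
			continue;
		if (p->pack_keep)
			continue;
		if (p->pack_size >= small_pack_threshold)
			continue;	/* leave existing "large" packs alone */
		cnt++;
	}
	return gc_auto_pack_limit <= cnt;
}
```

With something like this, accumulating small packs would still
trigger an auto repack, but a repository already consolidated into a
few large packs would be left alone; the open question remains where
the threshold should come from (fixed, configured, or derived from
the repository).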
