On Fri, May 04 2018, Jakub Narebski wrote:

> With early parts of commit-graph feature (ds/commit-graph and
> ds/lazy-load-trees) close to being merged into "master", see
> https://public-inbox.org/git/xmqq4ljtz87g....@gitster-ct.c.googlers.com/
> I think it would be a good idea to think about what other data could
> be added there to make Git even faster.

Thanks for the write-up.

> 3. Third, it needs to be reasonably fast to create, and fast to update.
> This means the time to create the chunk should be proportional to the
> number of commits, or to the sum of the number of commits and edges
> (which for the commit graph and other sparse graphs is proportional to
> the number of nodes / commits anyway).  In my opinion time proportional
> to n*log(n), where 'n' is the number of commits, is also acceptable.
> Times proportional to n^2 or n^3 are not acceptable.

I don't think this requirement makes sense...

>   DS> If we add commit-graph file downloads to the protocol, then the
>   DS> server could do this computation and send the data to all
>   DS> clients. But that would be "secondary" information that maybe
>   DS> clients want to verify, which is as difficult as computing it
>   DS> themselves.

... when combined with this. If, hypothetically, there were some data
that significantly sped up Git but was ridiculously expensive to
generate, that would be fine when combined with the ability to fetch it
pre-generated from the server. There could always be an option where
you trust the server and optionally don't verify the data, or where
it's much cheaper to verify than to compute.
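
To make the "cheaper to verify" case concrete, here is a toy sketch
(not git code; the structs and names below are made up purely for
illustration). Generation numbers in the commit-graph only need to
satisfy gen(commit) > gen(parent) for the reachability cutoff to be
valid, and checking that is a local, order-independent pass over the
stored data, whereas computing the numbers from scratch needs the
commits visited in parent-before-child order:

    /* Toy commit DAG: commit i has parents listed in parents[i],
     * terminated by -1.  Layout is made up for this illustration. */
    #include <stdio.h>

    #define NCOMMITS 5
    #define MAX_PARENTS 3

    static const int parents[NCOMMITS][MAX_PARENTS] = {
    	{ -1 },		/* 0: root commit */
    	{ 0, -1 },	/* 1 */
    	{ 0, -1 },	/* 2 */
    	{ 1, 2, -1 },	/* 3: merge of 1 and 2 */
    	{ 3, -1 },	/* 4 */
    };

    /* Computing generation numbers from scratch requires parents to
     * be processed before children (here the array happens to be in
     * such an order; real code must establish it by traversal). */
    static void compute_generations(int *gen)
    {
    	for (int i = 0; i < NCOMMITS; i++) {
    		int max_parent = 0;
    		for (int j = 0; parents[i][j] != -1; j++)
    			if (gen[parents[i][j]] > max_parent)
    				max_parent = gen[parents[i][j]];
    		gen[i] = max_parent + 1;
    	}
    }

    /* Verifying server-provided numbers is a purely local check per
     * commit: gen(c) must exceed gen(p) for every parent p.  No
     * traversal order or extra state is needed. */
    static int verify_generations(const int *gen)
    {
    	for (int i = 0; i < NCOMMITS; i++)
    		for (int j = 0; parents[i][j] != -1; j++)
    			if (gen[i] <= gen[parents[i][j]])
    				return 0; /* inconsistent data */
    	return 1;
    }

    int main(void)
    {
    	int gen[NCOMMITS];

    	compute_generations(gen);
    	printf("precomputed data verifies: %s\n",
    	       verify_generations(gen) ? "yes" : "no");
    	return 0;
    }

So even when the asymptotic cost is similar, accepting
server-generated data and running only the cheap consistency check can
be a reasonable middle ground between full trust and full
recomputation.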
