We recently added support to git-p4 to limit the number of changes it
would try to import at a time. That was to help clients who were running
into the server's "maxscanrows" limit. It used the "-m maxchanges"
argument to "p4 changes" to limit the number of results returned to
git-p4.
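
For illustration, the old approach was shaped roughly like this (a
simplified sketch with illustrative names, not the actual git-p4.py
code, which uses its own helpers and marshalled output rather than
subprocess):

    import subprocess

    def get_changes_old(depot_path, start_change, batch_size):
        # depot_path is assumed to include the trailing "...",
        # e.g. "//depot/project/...".
        rev_range = "%s@%d,#head" % (depot_path, start_change)
        # Rely on "-m" to cap the number of changes returned.
        cmd = ["p4", "changes", "-m", str(batch_size), rev_range]
        return subprocess.check_output(cmd).splitlines()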

Unfortunately it turns out that in practice the server enforces its
limits *before* the "-m maxchanges" argument is considered, so even
supplying "-m 1" doesn't help.

This affects both the "maxscanrows" and "maxresults" group options.

This set of patches updates the t9818 git-p4 tests to demonstrate the
problem, and then adds a fix which still iterates over the changes in
batches (as at present), but uses a revision range to limit each batch
rather than "-m $BATCHSIZE".
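
In outline, the new loop looks something like this (again a simplified,
hypothetical sketch rather than the patch itself; the real
implementation is in git-p4.py):

    import subprocess

    def get_changes_new(depot_path, start_change, end_change, block_size):
        # Walk the change numbers in fixed-size windows, using a
        # revision range instead of "-m" to bound the work the server
        # has to do for each request.  depot_path is assumed to include
        # the trailing "...".
        changes = []
        block_start = start_change
        while block_start <= end_change:
            block_end = min(block_start + block_size - 1, end_change)
            rev_range = "%s@%d,@%d" % (depot_path, block_start, block_end)
            out = subprocess.check_output(["p4", "changes", rev_range])
            changes.extend(out.splitlines())
            block_start = block_end + 1
        return changes

Note that the windows are over raw change numbers rather than returned
results, so some of them may contain few or no changes for the given
path.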

That means it will usually require more round trips to the server than
before, but in most cases the overhead should be small.

Along the way I also found that "p4 print" can fail if a file has too
many changes, but unfortunately there's no way to work around this. It's
fairly unlikely to ever happen in practice.

I think I've covered everything in this fix, but it's possible that there
are still bugs to be uncovered; I find the way that these limits interact
somewhat tricky to understand.

Thanks,
Luke

Luke Diamand (3):
  git-p4: additional testing of --changes-block-size
  git-p4: test with limited p4 server results
  git-p4: fixing --changes-block-size handling

 git-p4.py               | 48 +++++++++++++++++++++++---------
 t/t9818-git-p4-block.sh | 73 +++++++++++++++++++++++++++++++++++++++++++------
 2 files changed, 99 insertions(+), 22 deletions(-)

-- 
2.3.4.48.g223ab37
