On 12/14/2014 12:05 PM, Tom Lane wrote:
> Craig Ringer <cr...@2ndquadrant.com> writes:
>> On 12/14/2014 10:35 PM, Mark Cave-Ayland wrote:
>>> Compare this to say, for example, huge patches such as RLS.
>> I specifically objected to that being flattened into a single monster
>> patch when I saw that'd been done. If you look at my part in the work on
>> the row security patch, while I was ultimately unsuccessful in getting
>> the patch mergeable I spent quite a bit of time splitting it up into a
>> logical patch-series for sane review and development. I am quite annoyed
>> that it was simply flattened back into an unreviewable, hard-to-follow
>> blob and committed in that form.
> TBH, I'm not really on board with this line of argument.  I don't find
> broken-down patches to be particularly useful for review purposes.  An
> example I was just fooling with this week is the GROUPING SETS patch,
> which was broken into three sections for no good reason at all.  (The
> fourth and fifth subpatches, being alternative solutions to one problem,
> are in a different category of course.)  Too often, decisions made in
> one subpatch don't make any sense until you see the larger picture.
>
> Also, speaking of the larger picture: the current Postgres revision
> history amounts to 37578 commits (as of sometime yesterday) --- and that's
> just in the HEAD branch.  If we'd made an effort to break feature patches
> into bite-size chunks like you're recommending here, we'd probably have
> easily half a million commits in the mainline history.  That would not be
> convenient to work with, and I really doubt that it would be more useful
> for "git bisect" purposes, and I'll bet a large amount of money that most
> of them would not have had commit messages composed with any care at all.

I have tried to stay away from this thread, but ...

I'm also quite dubious about this suggested workflow, partly for the reasons Tom gives, and partly because it would constrain the way I work. I tend to commit with little notes to myself in the commit logs, notes that are never intended to become part of the public project history. I should be quite sad to lose that.

As for using git bisect, usually when I do this each iteration is quite expensive. Multiplying the number of commits by a factor between 10 and 100, which is what I think this would involve, would just make git bisect have to do between 3 and 7 more iterations, ISTM. That's not a win.
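(For anyone checking that arithmetic: it assumes each bisect step roughly halves the remaining commit range, so the extra work grows with log2 of the blowup factor. A purely illustrative Python sketch, using Tom's 37578-commit figure:

    import math

    base = 37578                                             # commits in HEAD, per Tom's figure
    print(f"baseline: ~{math.log2(base):.0f} bisect steps")  # ~15 steps today
    for factor in (10, 100):                                 # the 10x-100x blowup I have in mind
        print(f"{factor}x history: ~{math.log2(factor):.1f} extra steps")

    # baseline: ~15 bisect steps
    # 10x history: ~3.3 extra steps
    # 100x history: ~6.6 extra steps

Each of those extra iterations is a full build-and-test cycle, which is where the cost really bites.)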

On the larger issue, let me just note that I don't believe we have what is fundamentally a technological problem, and while technological changes can of course sometimes make things easier, they can also blind us to the more basic problems we are facing.

cheers

andrew


