On 15.10.2019 18:16, Johan Ouwerkerk wrote:
On Tue, Oct 15, 2019 at 9:17 AM Frederik Schwarzer <schwar...@kde.org> wrote:
Now I will fix my latest revision and merge to master. Still: 19
commits no longer compile.

Or am I missing something here?

How would we deal with that? Is "short-lived branches" (as you stated
below) enough to reduce the risk?


To answer in reverse order:

Yes. On the one hand, short-lived branches reduce this risk
considerably: this scenario applies to breaking changes in master
that fundamentally alter the way the application works. It takes
something like a core library upgrade or, in this case, a major UX
rewrite: something fairly fundamental to the application changing in
master. It is unlikely that such a change lands overnight without any
kind of prior review.

Still, this can and does happen, and it will happen to you some day if
you contribute enough :) However, this gets back to the git rebase bit.

So yes, you rebase your feature on master as per normal and fix
transient merge conflicts; and then what? Well, then you still have to
compile & test. At that point you notice the breakage. How do you
recover? As you normally would: you begin the porting effort. Either
you port the changes from master over to your feature's way of doing
things (in case *you* are the one doing the UX rewrite/major
refactoring), or vice versa, you apply the new world order from master
to your feature.

What I like to do during this process is to avoid committing these
fixes just yet. I want to get a feel for the total diff, in particular
the total git diff --stat that I accumulate. Then I can identify, on a
file-by-file basis, using something like git log -3 path/to/file,
which commit should likely have been amended. Sometimes you notice
that the diff for a file should be spread over multiple commits
according to your prior log; in that case you use git add
interactively, like this: git add -p path/to/file. You select only the
hunks for which you have identified a particular commit, you commit
those added hunks, and here I like to leave a note in the first line
of the commit message to the effect of "fixup <hash>", "squash <hash>"
or "<delete hash>".
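A minimal, runnable sketch of this bookkeeping. The repo and file
names are invented for the demo; note also that git's built-in
--fixup flag can write the "fixup! ..." note for you:

```shell
set -e
cd "$(mktemp -d)"
git init -q repo && cd repo
git config user.email you@example.com && git config user.name You

# Two illustrative commits that will need amending later:
printf 'one\n' > a.txt && git add . && git commit -qm 'add a.txt'
printf 'two\n' > b.txt && git add . && git commit -qm 'add b.txt'

# An accumulated, uncommitted fix:
printf 'one, fixed\n' > a.txt

# 1. Get a feel for the total diff:
git diff --stat

# 2. Identify the likely target commit per file:
git log --oneline -3 -- a.txt

# 3. Stage just the hunks that belong together (interactively you
#    would use: git add -p a.txt):
git add a.txt

# 4. Record the target in the commit itself; --fixup writes a
#    "fixup! <subject>" first line, the built-in form of the note:
target=$(git log --format=%H -1 -- a.txt)
git commit -q --fixup="$target"
```

The "fixup!" subject convention pays off in the next step, because
interactive rebase understands it.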

In this way you build up a bunch of commits which cover your fixes.
Next up, you turn to git rebase again, using e.g. git rebase -i
master. Now you can interactively fold the commits into the history as
"it ought to be" and this is where I use my notes to help me decide
how to proceed. Note that you don't have to get everything right in
one pass, and that the rebasing itself may introduce transient merge
conflicts you need to fix; so if the diff stat was large, it makes
sense to split this up into multiple git rebase -i runs, just to give
yourself a break in between. Finally, perhaps you rebase once more to
touch up a few commit messages, and if this whole process took a
considerable amount of time, you want to verify that upstream master
has not moved on again in the meantime.
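The folding step can be sketched like this (again on a made-up
throwaway repo). If you used the "fixup! ..." subject convention,
--autosquash pre-arranges the todo list for you; GIT_SEQUENCE_EDITOR=true
merely accepts that generated list so the demo runs unattended, whereas
interactively you would review and edit it:

```shell
set -e
cd "$(mktemp -d)"
git init -q repo && cd repo
git config user.email you@example.com && git config user.name You

# History with one fix commit marked via the "fixup! ..." convention:
git commit -q --allow-empty -m 'base'
printf 'one\n' > a.txt && git add . && git commit -qm 'add a.txt'
printf 'one, fixed\n' > a.txt && git commit -qam 'fixup! add a.txt'

# Fold the fixup back into "add a.txt". Interactively you would run
# git rebase -i --autosquash <base> and edit the todo list yourself:
GIT_SEQUENCE_EDITOR=true git rebase -i --autosquash HEAD~2

# The fixup commit is gone; its change now lives inside "add a.txt".
git log --oneline
```

This is the mechanical core; deciding which commits to fold where is
exactly what the notes from the previous step are for.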

So in this more complex case you can adopt a correspondingly more
complex git workflow and use rebase to produce clean commits.
Sometimes, though, you decide this is all too much work and too much
bother. What you can do in such a scenario instead is to create a
fresh branch from master and effectively re-create the commits there.
In those cases git cherry-pick and git checkout -p <branch> --
path/to/file come in handy.
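A sketch of this start-over approach, once more on an invented
throwaway repo with illustrative branch and file names:

```shell
set -e
cd "$(mktemp -d)"
git init -q repo && cd repo
git config user.email you@example.com && git config user.name You
trunk=$(git symbolic-ref --short HEAD)   # "master" or "main", per local config
echo base > base.txt && git add . && git commit -qm 'initial'

# The messy branch you are abandoning:
git checkout -qb old-feature
echo good > keep.txt && git add . && git commit -qm 'good: add keep.txt'
echo messy > notes.txt && git add . && git commit -qm 'messy intermediate work'

# Re-create the history on a fresh branch from the trunk:
git checkout -q "$trunk"
git checkout -qb new-feature

# Replay a whole commit you still like, as-is:
git cherry-pick old-feature~1

# Or carry over just one file's final state from the old branch
# (interactively you could take only some hunks with
# git checkout -p old-feature -- notes.txt):
git checkout old-feature -- notes.txt
git commit -qm 'redo: carry over notes.txt cleanly'
```

The result is a clean linear branch containing only the commits you
chose to keep, without any interactive-rebase archaeology.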

Ultimately, whether a scrupulously clean commit log is worth the
effort, or whether you might decide to simplify things a little and
accept a few broken commits in between, mostly depends on the needs of
your project and on how many people work on it.

Thanks for the explanation. :)

Just to clarify: I am not opposing the idea of enabling fast-forward merges; it seems to be a widely-used feature after all. I just wanted to throw in my concerns.

Cheers
Frederik
