Roughly a fortnight ago, I wrote a C++ procedure to *convert Fitch-style 
natural deduction proofs* of propositional theorems *to Hilbert-style 
proofs*: 
<https://github.com/xamidi/pmGenerator/commit/5ff38720918bb0c7624e015dad2d53f8bbae3e28>
I released it today as part of pmGenerator 1.2.2 
<https://github.com/xamidi/pmGenerator/releases/tag/1.2.2>.

Since Metamath concerns Hilbert systems, I'd like to share a few insights I 
learned along the way, apart from presenting the new feature. In 
particular, I noticed that Mario Carneiro gave a talk 
<https://us.metamath.org/other.html#natded> on "*Natural Deduction in the 
Metamath Proof Language*", but it doesn't seem to have gained much 
traction; at least I do not see many theorems in Metamath that would be 
crucial for applying proof design based on natural deduction.

In my opinion, the option to design such proofs is a great advantage, 
since they are very easy for humans to find (at least at the propositional 
level), whereas Hilbert-style proofs are rather complicated (even at the 
propositional level).

For demonstration, here's a proof constructor to try out natural deduction 
proof design: https://mrieppel.github.io/FitchFX/
My converter uses the same syntax (with "Deduction Rules for TFL 
<https://mrieppel.github.io/FitchFX/samples/quick-ref-rules.pdf>" only). 
Some exemplary proofs are: m_ffx.txt 
<https://xamidi.github.io/pmGenerator/data/m_ffx.txt>, w1_ffx.txt 
<https://xamidi.github.io/pmGenerator/data/w1_ffx.txt>, …, w6_ffx.txt 
<https://xamidi.github.io/pmGenerator/data/w6_ffx.txt> — proofs of the 
seven minimal single axioms of classical propositional logic with operators 
{→,¬}. These files can also be imported via copy/paste into the FitchFX 
tool under the "Export" tab.
*Key Concepts* 

[*Note*: In the following, exponents ⁿ (or ^n) mean n-fold concatenation of 
sequences, and D stands for (2-ary) condensed detachment in prefix 
notation, i.e. most general application of modus ponens, taking a proof of 
the conditional as first and a proof of the antecedent as second argument.] 

   - Most rules can be enabled by using a corresponding theorem. For 
   example, p→(q→(p∧q)) (pm3.2 <https://us.metamath.org/mpeuni/pm3.2.html>) 
   can be used — in combination with two modus ponens applications — to apply 
*conjunction 
   introduction*, i.e. ∧I: Γ∪{p,q}⊢(p∧q). There may be multiple 
   rule-enabling theorems, for example p→(q→(q∧p)) (pm3.21 
   <https://us.metamath.org/mpeuni/pm3.21.html>) can accomplish the same 
   thing by changing the order of arguments. I provided a table of 
   rule-enabling theorems at nd/NdConverter.h 
   <https://github.com/xamidi/pmGenerator/blob/1.2.2/nd/NdConverter.h?ts=4>, 
   many of which have no set.mm correspondents: of the six ∨E-enabling 
   theorems, only (jao <https://us.metamath.org/mpeuni/jao.html>) is 
   present, and I found none of (p→⊥)→¬p, ¬p→(p→⊥), p→(¬p→⊥), (¬p→⊥)→p, 
   i.e. no enabling theorems for any of ¬I, ¬E, IP.
   - *However*, in natural deduction proofs, there are blocks at certain 
   depths, each starting with an assumption.
   For example, ∧I: Γ∪{p,q}⊢(p∧q) at depth 3 is actually 
   Γ∪{a→(b→(c→p)),a→(b→(c→q))}⊢a→(b→(c→(p∧q))). Fortunately, such variants 
   can easily be constructed from the zero-depth rule-enabling theorems:
   For symbols 1 := (ax-1 <https://us.metamath.org/mpeuni/ax-1.html>) and 2 
   := (ax-2 <https://us.metamath.org/mpeuni/ax-2.html>), the proof σ_mpd(d) 
   for σ_mpd(0) := D and σ_mpd(n+1) := (σ_mpd(n))²(D1)ⁿ2 can be used to 
   apply modus ponens at depth d. For example, σ_mpd(0) is (ax-mp 
   <https://us.metamath.org/mpeuni/ax-mp.html>), σ_mpd(1) is (mpd 
   <https://us.metamath.org/mpeuni/mpd.html>), and σ_mpd(2) is (mpdd 
   <https://us.metamath.org/mpeuni/mpdd.html>). But the proofs σ_mpd(d) for 
   d ≥ 3 seem to still be missing from Metamath.
   Every theorem can be shifted one deeper by adding an antecedent via 
   preceding its proof with D1, i.e. with a single application of (a1i 
   <https://us.metamath.org/mpeuni/a1i.html>).
   In combination with σ_mpd, rule-enabling theorems can thereby be applied 
   at any depth. I gave detailed constructions of all supported rules at 
   nd/NdConverter.cpp#L538-L769 
   
<https://github.com/xamidi/pmGenerator/blob/1.2.2/nd/NdConverter.cpp?ts=4#L538-L769>
   .
   - We cannot simply use some rule-enabling theorem to translate *conditional 
   introduction*, i.e. →I: from Γ∪{p}⊢q infer Γ⊢(p→q), since this rule 
   handles the elimination of blocks and depth, which is necessary because 
   Hilbert-style proofs operate on a global scope everywhere. Other rules 
   just call it in order to eliminate a block and then operate on the 
   resulting conditional.
   To eliminate an assumption p for a caller at depth d, we can replace it 
   with an appropriate proof a1_a1i(n, m) with d = n+m+1 of either 
   a₁→(…→(aₘ→(p→p))…) for n = 0, or a₁→(…→(aₘ→(p→(q₀→(q₁→(…→(qₙ→p)…)))))…) 
   for n > 0, when the assumption is used from a position n deeper than the 
   assumption's depth m+1.
   We can construct a1_a1i(n, m) based on 1 := (ax-1) and 2 := (ax-2) via 
   a1_a1i(0, m) := (D1)^mDD211, and a1_a1i(n, m) := (D1)^m(DD2D11)ⁿ1 for n 
   > 0. Note that DD211 and D2D11 are just proofs of p→p and (p→q)→(p→(r→q)), 
   respectively. Oddly, the second proof is not part of set.mm, but for its 
   theorem I found two different proofs (mercolem3 
   <https://us.metamath.org/mpeuni/mercolem3.html>) and (merco1lem15 
   <https://us.metamath.org/mpeuni/merco1lem15.html>) for which new usage 
   is discouraged. In combination with modus ponens, the theorem can be used 
   with conditionals to slip in additional antecedents.
   - In general, we can use (p→q)→(p→(r→q)) in combination with (a1i) to 
   construct proofs slip(n, m, σ) from proofs σ to slip in m new antecedents 
   after n known antecedents for a known conclusion. This makes the 
   implementation much simpler, in particular due to the possible use of 
   reiteration steps: regardless of the depth from which *and with how many 
   common assumptions* a line is called, the appropriate number of 
   antecedents can be slipped in where needed in order to rebuild σ's 
   theorem to match the caller's context. 
   - Since the final line in the Fitch-style proof makes the initial call 
   and (for correct proofs without premises) must be in the global scope, all 
   lines can be fully decontextualized, i.e. transformed into theorems, in 
   this manner.
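
The recursive constructions of σ_mpd(d) and a1_a1i(n, m) above can be 
written down directly as operations on D-proof strings. Below is a minimal 
sketch of how I understand them (the function names sigma_mpd and a1_a1i 
are mine for illustration, not pmGenerator's actual API), assuming the 
symbols 1 and 2 denote proofs of (ax-1) and (ax-2):

```cpp
#include <string>

// σ_mpd(0) := D,  σ_mpd(n+1) := (σ_mpd(n))²(D1)ⁿ2
// — a proof pattern that applies modus ponens at depth d.
std::string sigma_mpd(unsigned d) {
	std::string s = "D";
	for (unsigned n = 0; n < d; ++n) {
		std::string next = s + s; // (σ_mpd(n))²
		for (unsigned i = 0; i < n; ++i)
			next += "D1";         // (D1)ⁿ
		next += "2";
		s = next;
	}
	return s;
}

// a1_a1i(0, m) := (D1)^m DD211,  a1_a1i(n, m) := (D1)^m (DD2D11)ⁿ 1 for n > 0
// — eliminates an assumption for a caller at depth d = n+m+1; the (D1)^m
//   prefix shifts the theorem one deeper per application (a single a1i step),
//   DD211 proves p→p, and D2D11 proves (p→q)→(p→(r→q)).
std::string a1_a1i(unsigned n, unsigned m) {
	std::string s;
	for (unsigned i = 0; i < m; ++i)
		s += "D1";
	if (n == 0)
		s += "DD211";
	else {
		for (unsigned i = 0; i < n; ++i)
			s += "DD2D11";
		s += "1";
	}
	return s;
}
```

Unfolding the recursion reproduces the examples mentioned above: 
sigma_mpd(1) yields "DD2" (mpd), and sigma_mpd(2) yields "DD2DD2D12" 
(mpdd).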

The core of the translation algorithm can be found at 
nd/NdConverter.cpp#L815-L947 
<https://github.com/xamidi/pmGenerator/blob/1.2.2/nd/NdConverter.cpp?ts=4#L815-L947> 
(definition and call of the recursive lambda function translateNdProof).
*Usage* 

My converter (pmGenerator --ndconvert) uses aliases by default (as 
mentioned in the header file) rather than treating connectives other than 
{→,¬} as real symbols and operators, with the same aliases that are used by 
pmproofs.txt <https://us.metamath.org/mmsolitaire/pmproofs.txt>. This is 
unlike Metamath. However, the option -h enables a heterogeneous language 
(i.e. one with extra axioms to define additional operators), in which case 
the user must also provide rule-enabling theorems in order to enable their 
corresponding rules for translation.

My procedure can translate into all kinds of propositional Hilbert systems, 
as long as the user provides proofs of (ax-1) and (ax-2) together with 
sufficient information for the used rules. When using the {→,¬}-pure 
language, providing a proof for (ax-3 
<https://us.metamath.org/mpeuni/ax-3.html>) in addition to (ax-1) and 
(ax-2) is already sufficient to enable all rules.

For example, m.txt <https://xamidi.github.io/pmGenerator/data/m.txt> (which 
is data/m.txt in the release files) can be used via
pmGenerator --ndconvert input.txt -n -b data/m.txt -o result.txt 

to generate a proof based on (meredith 
<https://us.metamath.org/mpeuni/meredith.html>) as a sole axiom, for 
whichever theorem has a FitchFX proof in input.txt. All rules are 
supported since m.txt contains proofs for (ax-1), (ax-2), and (ax-3). Since 
it also contains a proof for p→p that is shorter than building one based on 
DD211, resulting proofs use the corresponding shortcut. 

Results can then be transformed via
pmGenerator --transform result.txt -f -n […options…] -o 
transformedResult.txt 

and optionally be compressed with -z or -x to potentially find 
fundamentally shorter proofs. When exploring new systems, the hardest part 
can be to find the first proofs of sufficient theorems (or figure out they 
don't exist).
*Question* 

Regarding Metamath's databases, shouldn't there be more elements to support 
proof design based on natural deduction? One cannot prepare all the 
elements, of course, since translations of Fitch-style proofs of unlimited 
depth require infinitely many derivations — at least one per depth per 
rule-enabling theorem.

The lack of more intuitive tools for proof design in set.mm is something 
that bothered me when I discovered Metamath years ago, and back then I 
couldn't quite make use of the system. I hope we can shed some light on 
the matter.

— Samiro Discher
