On Sat, Feb 22, 2020 at 7:22 PM Norman Megill <[email protected]> wrote:

> I recall that vvs mentioned this on Feb. 10:
> https://groups.google.com/d/msg/metamath/Fgn0qZEzCko/h9PRflr9FgAJ
>
>
> https://xenaproject.wordpress.com/2020/02/09/where-is-the-fashionable-mathematics/:
> "What if an undergraduate wants to try formalising the Hilbert basis
> theorem?  What do they do?"  Well, possibly they might look to
> http://us.metamath.org/mpeuni/hbt.html for some hints.  :)  BTW one
> purpose of the "Quantum Logic Explorer" (ql.mm) was to verify some proofs
> for a couple of papers, because the logic can be somewhat non-intuitive,
> and I don't think I'd have peace of mind not knowing if there was a flaw in
> one of my published claims.  On several occasions I wrote the short,
> informal published proof using the formal proof as a guide.  The formal
> proof, in turn, was based on scribbles on paper that sometimes I'd discover
> were wrong.
>
> https://xenaproject.wordpress.com/2018/10/07/what-is-the-xena-project/:
> "It will probably cost hundreds of millions of pounds worth of person-hours
> to digitise any one of the known proofs of Fermat’s Last Theorem, and once
> digitised, these proofs will need maintaining, like any software, which
> will cost more money."
> A curious estimate - I wonder why he thinks it would be that much.
> Although if the formalization language is stable, it shouldn't need
> "maintaining", unlike ordinary software in which there are almost always
> unknown bugs remaining.  The point of a formal proof (verified with a small
> trusted kernel) is to eliminate the possibility of a mistake or bug; once a
> formal proof is completed, it is finished for all posterity, aside from any
> optional tweaks such as proof shortening that people may wish to make.
>

I agree that Buzzard overestimates the cost here. At this point I've become
so accustomed to formalization that nothing actually appears out of reach
any more. Buzzard, Commelin, and Massot recently formalized the definition
of a perfectoid space in Lean. They had geared up for a multi-year project
but got it done in only six months, and most of that time went into
learning how to use the program and work with the system, writing things so
that Lean will help rather than get in the way. If I could somehow
transplant Kevin Buzzard's brain into my head (or that of someone else who
understands the proof of FLT inside out), I am reasonably confident I could
do it alone within a year or two. It's much more a problem of getting the
relevant knowledge of mathematics and computer proof architecture into one
place, and the Lean Zulip chat has been tremendously helpful in that
regard.

Regarding the cost of maintenance, it can mostly be rolled up with
maintenance of the overall ecosystem, but it is nonzero, at least for Lean.
Metamath is actually an outlier in the space of formal provers for a lot of
reasons, but one of them is that there is a spec that was written almost
30 years ago and has remained essentially unchanged since then. Most
computer languages evolve faster than that. You can't take a proof from
Isabelle 2019 and run it on Isabelle 2020, much less an Isabelle 2000
proof. Things change in myriad small ways, and it is important that the
proofs get updated to follow along. In reality, while Metamath is stable,
set.mm certainly is not, and we have a similar system of centralized
maintenance to keep mathboxes updated when theorems in the main part
change.
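
For concreteness, here is a toy database in the .mm format (my own
illustration, not an excerpt from set.mm, though the axiom echoes set.mm's
ax-1). Because the spec is frozen, a conforming verifier from any era
should accept a file like this unchanged:

    $( Toy database, for illustration only. $)
    $c ( ) -> wff |- $.                $( declare the constant symbols $)
    $v p q $.                          $( declare two variables $)
    wp $f wff p $.                     $( p ranges over wffs $)
    wq $f wff q $.                     $( q ranges over wffs $)
    wi $a wff ( p -> q ) $.            $( implication is wff-forming $)
    ax-1 $a |- ( p -> ( q -> p ) ) $.  $( an axiom scheme $)

The stability guarantee applies to files like this one, not to the evolving
content of set.mm, which is where the maintenance actually happens.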

Of course, it's not like the software or proofs literally rot without
tending; they are still just as true as ever, but they may come to depend
on "outdated theorems" and become difficult to reconcile with more modern
developments based on different versions of core library theorems. You end
up with a state of affairs that is easily recognizable to anyone who has
dealt with package version management in large and diverse software
ecosystems.


> On https://xenaproject.wordpress.com/2019/12/07/rigorous-mathematics/ he
> has a link to Mario's Metamath proof of the prime number theorem.
>

Oh nice, I hadn't noticed that. Really, there are better links to give for
the formalization of PNT at this point - Jeremy Avigad / Isabelle was
first, John Harrison / HOL Light had the complex analysis version, then I
did it in Metamath, and more recently Manuel Eberl / Isabelle proved most
of a number theory textbook with some significant generalizations of the
PNT. But I'm flattered nonetheless. :)

Mario
