I didn't get a chance to include Metamath in this (see the Discussion section), but I
thought you'd enjoy it and might like to provide feedback.

EvoGPT-f: on arXiv <https://arxiv.org/abs/2402.16878>
Abstract
Formal mathematics is the discipline of translating mathematics into a 
programming language in which any statement can be unequivocally checked by 
a computer. Mathematicians and computer scientists have spent decades of 
painstaking formalization efforts developing languages such as Coq, HOL, 
and Lean. Machine learning research has converged on these formal math 
corpora and given rise to an assortment of methodologies to aid in 
interactive and automated theorem proving. However, prior work has 
primarily focused on one method, for one proof task, in one language. This 
paper introduces EvoGPT-f: a novel evolutionary framework for the first 
systematic quantitative analysis of the differential machine learnability 
of five formal math corpora (Lean 3, Lean 4, Coq, HOL 4, HOL Light) using 
four tokenization methods (character-level, word-level, Byte Pair Encoding, and 
the StarCoder tokenizer). This paper does not put to rest the question of the 
"best" or "easiest" language to learn. Rather, this framework and 
preliminary findings begin to illuminate the differential machine 
learnability of these languages, offering a foundation to forge more 
systematic quantitative and qualitative comparative research across 
communities.
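
To make the tokenization comparison concrete, here is a minimal sketch (not the 
paper's code) of how the four schemes split the same toy Lean snippet. It assumes 
the Hugging Face tokenizers and transformers libraries and the 
bigcode/starcoderbase checkpoint on the Hub; the snippet and all names are purely 
illustrative.

from tokenizers import Tokenizer, models, pre_tokenizers, trainers
from transformers import AutoTokenizer

snippet = "theorem t (a b : Nat) : a + b = b + a := Nat.add_comm a b"

# Character-level: every character is its own token.
char_tokens = list(snippet)

# Word-level: split on whitespace (one common convention).
word_tokens = snippet.split()

# Byte Pair Encoding: train a small vocabulary on the corpus itself
# (here the single snippet stands in for a real corpus).
bpe = Tokenizer(models.BPE(unk_token="[UNK]"))
bpe.pre_tokenizer = pre_tokenizers.Whitespace()
bpe.train_from_iterator(
    [snippet],
    trainers.BpeTrainer(vocab_size=300, special_tokens=["[UNK]"]),
)
bpe_tokens = bpe.encode(snippet).tokens

# StarCoder tokenizer: reuse a pretrained code tokenizer unchanged.
starcoder = AutoTokenizer.from_pretrained("bigcode/starcoderbase")
starcoder_tokens = starcoder.tokenize(snippet)

for name, toks in [("char", char_tokens), ("word", word_tokens),
                   ("BPE", bpe_tokens), ("StarCoder", starcoder_tokens)]:
    print(f"{name:>9}: {len(toks)} tokens")

On a real corpus, the very different sequence lengths and vocabulary sizes these 
schemes produce are part of what makes the cross-tokenizer comparison interesting.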
