(7975.89.70) Equivalence and reduction between naturally formalized proofs
#proof-theory #philosophy-of-math
In the last post, I suggested that Furstenberg's, Mercer's, and Euclid's proofs of the infinitude of primes are essentially the same proof, insofar as there is a natural notion of reduction between them, in the direction

$$\text{Furstenberg} \longrightarrow \text{Mercer} \longrightarrow \text{Euclid},$$

where these arrows are furnished by implications between corresponding auxiliary lemmas in the respective proofs. Sieg's natural formalization schema [1] is particularly well-suited for this application. It uses ZF as its deductive framework (though it could equally well use Martin-Löf type theory, higher-order logic, etc.) and Gentzen's natural deduction as its inferential mechanism, with two key additions:
- lemmas as rules: we can "reason with gaps" by applying lemmas directly as inferential rules, with each instance generating a new subgoal (the proof of that lemma). Of course, these proofs of lemmas can rely on lemmas of their own, and so on. This structure can be conveniently captured in Fitch diagram notation.
- intercalation: this allows us to "close the gaps" created by applying lemmas as rules, by reasoning both "forward" and "backward" in the natural way (as is implemented, e.g., in automated proof solvers). That is, we can apply a lemma $A \Rightarrow B$ to a goal $G$ in one of two ways (see the sketch below):
  - forward: if $A$ matches an available premise, the goal becomes $G$ with $B$ available as an additional premise;
  - backward: if $B$ matches the goal $G$, the goal becomes $A$.
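To make these two mechanisms concrete, here is a minimal sketch in Lean 4 rather than in Sieg's ZF/natural-deduction setting; `A`, `B`, and `G` are placeholder propositions, `lemmaAB` is a hypothetical lemma, and `sorry` plays the role of the open gap.

```lean
-- Placeholder propositions standing in for a lemma's premise (A),
-- its conclusion (B), and a goal (G); the names are purely illustrative.
axiom A : Prop
axiom B : Prop
axiom G : Prop

-- "Lemmas as rules": we may use `lemmaAB` before proving it; the `sorry`
-- marks the new subgoal it generates, namely the proof of the lemma itself.
theorem lemmaAB : A → B := sorry

-- Backward use: the conclusion B matches the goal, so the goal becomes A.
example (hA : A) : B := by
  apply lemmaAB
  exact hA

-- Forward use: A matches an available premise, so B becomes available
-- and the original goal G is pursued with this extra premise.
example (hA : A) (hBG : B → G) : G := by
  have hB : B := lemmaAB hA
  exact hBG hB
```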
To better understand this notion of "reduction of informal proofs" in terms of implications between corresponding auxiliary lemmas, consider the following renderings [2] of the proofs by Mercer and Euclid, presented in the previous note.

As discussed in the first note, we have an implication from the auxiliary lemma in Mercer's proof to the corresponding auxiliary lemma in Euclid's proof. To exhibit the reduction from Mercer's proof to Euclid's, this implication is applied as a rule/lemma in the sense above, and the subgoal it generates is closed by intercalation.
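To illustrate the "lemmas as rules" reading of Euclid's proof itself, here is a hedged Lean 4 sketch (using Mathlib). The statement of `euclid_lemma` is just one plausible rendering of Euclid's auxiliary lemma, introduced for illustration rather than taken from the renderings above; its proof is left as a gap, and the remaining reasoning mixes forward and backward steps in the sense described earlier.

```lean
import Mathlib

-- One plausible rendering of Euclid's auxiliary lemma, used here as a rule;
-- its proof is left as a gap (`sorry`), to be closed later by intercalation.
theorem euclid_lemma (n : ℕ) (hn : n ≠ 1) : ∃ p, Nat.Prime p ∧ p ∣ n := sorry

-- The main goal: for every n there is a prime strictly greater than n.
theorem euclid_infinitude_of_primes (n : ℕ) : ∃ p, Nat.Prime p ∧ n < p := by
  -- forward use of the lemma as a rule, applied to N = n! + 1 (which is ≠ 1)
  have hN : n.factorial + 1 ≠ 1 := by
    have := Nat.factorial_pos n
    omega
  obtain ⟨p, hp, hdvd⟩ := euclid_lemma (n.factorial + 1) hN
  refine ⟨p, hp, ?_⟩
  -- remaining subgoal: n < p.  If p ≤ n, then p ∣ n!, hence p ∣ 1, contradiction.
  by_contra hle
  push_neg at hle
  have h1 : p ∣ n.factorial := Nat.dvd_factorial hp.pos hle
  have h2 : p ∣ 1 := (Nat.dvd_add_right h1).mp hdvd
  exact hp.not_dvd_one h2
```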
Future directions
Though Euclid's proof gives a nice prototype, it remains to find a more general method that will not become inordinately difficult when evaluating more complex mathematical proofs. Promising directions thus far are:
- finding a corresponding graph-theoretic or combinatorial reduction, in analogy with Hughes and Strassburger's first-order combinatorial proofs. Finding such a combinatorial criterion for proof equivalence and reduction would both avoid syntactic complications and make the analysis computationally tractable.
- introducing a category-theoretic framework for understanding proofs, in which certain "minimal" proofs (Euclid's, in this case) can be viewed as terminal objects or fixed points in appropriately conceived categories, with proofs of implications between corresponding auxiliary lemmas furnishing the morphisms, for example. This approach will likely build on Hughes' "Deep inference proof theory equals categorical proof theory minus coherence" [3] framework, as the deep inference mechanism seems very promising for this application.
- In the example above, the implications between corresponding auxiliary lemmas form the commutative diagram below. It remains to understand the proof-theoretic significance of criteria like this (i.e., diagrams between lemmas commuting or failing to commute). This is of course related to the category-theoretic approach above.
```latex
\documentclass{standalone}
\usepackage{tikz-cd}
\begin{document}
\begin{tikzcd}
A &&& C \\
\\
B &&& D
\arrow[from=1-1, to=1-4]
\arrow[from=1-1, to=3-1]
\arrow[from=1-4, to=3-4]
\arrow[from=3-1, to=3-4]
\end{tikzcd}
\end{document}
```
- We can also talk about mathematical purity in this sense. In particular, can we give a precise proof-theoretic characterization of mathematical purity based on this notion of reduction? And, say, if a proof $P$ is pure and $P$ reduces to $Q$, is $Q$ pure as well? (A schematic statement of this question is sketched after this list.) This is motivated by the fact that mathematical purity was one of the primary considerations behind this notion of proof reduction; in particular, methodologically distinct proofs of the infinitude of primes (say, Gauss's or Euler's) don't fit into the "Furstenberg → Mercer → Euclid" framework; different methodologies inhabit different connected components in the class of all proofs of this proposition. Some notion of analyzing these connected components topologically or graph-theoretically may also be possible, way down the line.
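As a purely schematic placeholder, the purity question might eventually be stated along the following lines; `InformalProof`, `Pure`, and `ReducesTo` are hypothetical names for notions these notes have not yet made precise.

```lean
-- Hypothetical, undefined primitives: a type of (naturally formalized) proofs,
-- a purity predicate, and the reduction relation discussed in these notes.
axiom InformalProof : Type
axiom Pure : InformalProof → Prop
axiom ReducesTo : InformalProof → InformalProof → Prop

-- The question from the text, stated schematically:
-- is purity preserved along reductions?
def PurityPreservedAlongReduction : Prop :=
  ∀ P Q : InformalProof, Pure P → ReducesTo P Q → Pure Q
```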
[1] See https://philpapers.org/rec/SIENFD-2 for Sieg's natural formalization of the Cantor-Bernstein theorem.

[2] These predicate renderings are quite informal; it remains to completely reduce them according to (an appropriately modified version of) Sieg's schema.