I happened to come across a 1993 opinion piece, Theorems for a price: Tomorrow's semi-rigorous mathematical culture by Doron Zeilberger. I think it's a rather fascinating document as it questions the future of mathematical proof. Its basic thesis is that some time in the future of mathematics, the expectation of proof will move to a "semi-rigorous" state where mathematical statements will be given probabilities of being true.

It helps to clarify this with an example even simpler than the one in Zeilberger's paper. Take the arithmetic-geometric mean inequality for two variables $a,b\geq 0$. It says that

$$\frac{a + b}{2} \geq \sqrt{ab}.$$ This simple inequality is just a rearrangement of $(a - b)^2 \geq 0$. For simplicity, let's say that $a,b\in [0,1]$. Instead of actually proving the inequality, we could generate uniform random numbers in $[0,1]$ and check whether it holds for them. So if I test the inequality on 1000 random pairs, of course it will hold all 1000 times.
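A minimal sketch of that test in Python (the function name and trial count here are my own choices, not anything from Zeilberger's paper):

```python
import random

def amgm_holds(a, b):
    """Check the arithmetic-geometric mean inequality for one pair."""
    return (a + b) / 2 >= (a * b) ** 0.5

# Test the inequality on 1000 uniform random pairs in [0, 1].
trials = 1000
passes = sum(amgm_holds(random.random(), random.random()) for _ in range(trials))
print(f"{passes}/{trials} trials satisfied the inequality")
```

Since the inequality is actually true, every trial passes, but of course the test by itself does not distinguish "true" from "almost never false."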

Of course, to make sense of this testing procedure as a substitute for proof, you need to estimate the probability that a given inequality passes the test even though it is false. For example, the inequality

$$\frac{a + b}{2} \geq \sqrt{ab + \epsilon}$$ is false for every $\epsilon > 0$. If $\epsilon = 10^{-5}$, for instance, then this false inequality has a good chance of passing several dozen iterations. In fact, the reader may find it interesting to work out the probability that it holds for a randomly chosen pair $(a,b)$ as a function of $\epsilon$.
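For a purely empirical estimate (leaving the exact probability as the exercise above), one can sample the false inequality at, say, $\epsilon = 10^{-5}$; the names and trial count below are illustrative:

```python
import random

EPS = 1e-5  # the perturbation that makes the inequality false

def false_ineq_holds(a, b, eps=EPS):
    """Check the (false) inequality (a + b)/2 >= sqrt(a*b + eps)."""
    return (a + b) / 2 >= (a * b + eps) ** 0.5

# Estimate how often the false inequality still passes a random test.
trials = 100_000
passes = sum(false_ineq_holds(random.random(), random.random())
             for _ in range(trials))
print(f"pass rate at eps={EPS}: {passes / trials:.4f}")
```

Even at this $\epsilon$ the inequality passes the vast majority of random trials, which is exactly why a handful of successful tests gives little evidence on its own.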

Zeilberger's paper describes a more sophisticated example with practical implications: hypergeometric sum identities. Certain sums of this kind can be proved algorithmically, and the point with regard to proof is that the same algorithmic verifications can be replaced with nearly-certain checks that assign a probability that an identity is correct.
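As a toy illustration of probabilistic identity checking (using Vandermonde's identity, a classic hypergeometric sum, as a stand-in rather than an example from the paper):

```python
import random
from math import comb

def vandermonde_lhs(m, n, p):
    """Left side of Vandermonde's identity: sum_k C(m,k) * C(n,p-k)."""
    return sum(comb(m, k) * comb(n, p - k) for k in range(p + 1))

# "Verify" the identity sum_k C(m,k)*C(n,p-k) = C(m+n,p) by checking it
# at randomly chosen parameters, rather than proving it once and for all.
trials = 200
ok = all(
    vandermonde_lhs(m, n, p) == comb(m + n, p)
    for m, n, p in (
        (random.randint(0, 30), random.randint(0, 30), random.randint(0, 60))
        for _ in range(trials)
    )
)
print("all random checks passed:", ok)
```

The identity here happens to be true, so every check passes; Zeilberger's observation is that for identities of this algorithmic type, enough random checks can make the probability of a false identity surviving vanishingly small.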

Zeilberger believes that ever more mathematics will be subsumed into the realm of probabilistic verification, by translating mathematical statements into algorithmic ones that can in turn be assigned a probability of being correct via machine experimentation.

## Conclusions

What can we draw from Zeilberger's paper? First, it is an important attempt to bring acceptance to a different way of mathematical exploration. His idea is not new in the sense that mathematicians have always tested their ideas with experimentation and heuristic reasoning; people even publish papers assuming conjectures they believe to be true. However, by focusing on a formal framework of probabilities that statements are true, backed by computer verification, we get a more rigid structure for experimentally-verified mathematics.

I am skeptical that Zeilberger's prediction will unfold, however. Much of pure mathematics seems very far removed from verifying equations, whereas hypergeometric sums (or indeed any kind of sums involving subsets of the real numbers) seem ideally amenable to it. While there are active attempts to formalize mathematics through proof systems, in the decades since their introduction we have not seen a proof system that is actually easy for the average mathematician to use. And then there is the question of whether semi-rigorous mathematics could really be implemented in a proof system in the first place.

Finally, I think there is a limitation to the idea of reasonably-sure proofs or semi-rigorous mathematics: much of the time, there may not be much purpose in knowing some statements only to a reasonably sure level. Part of mathematical practice is to understand, and if we have a highly complex system that can give us reasonably sure results without explanation, how would that aid our further understanding of mathematics? It is with this closing thought that I believe rigorous proof will always have a place in mathematics, even though computer-aided mathematics will inevitably become more present in our future explorations.