My 15-year-old son insisted to me the other day that anything true must be certain and knowable. Like father, like son: he believes in truth. Unlike his father, though, he hasn’t yet learned about the crushing defeat that mathematics, and therefore all of logic and computation, suffered at the hands of Kurt Gödel in the 1930s, when he proved his incompleteness theorem, ironically showing that not every true thing can be proved.
This fact is related to the nature of infinity. In essence, there are uncountably many true things but only countably many ways to prove them in the language of mathematics. It is therefore possible to construct a proposition that cannot necessarily be proved, in much the same way you can construct a real number, such as pi, that has no rational representation.
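To make the counting intuition concrete, here is a sketch, in LaTeX, of the cardinality mismatch this paragraph gestures at. Note that this is a heuristic motivation only, not Gödel’s actual argument, which constructs a specific unprovable sentence by diagonalizing over provability.

```latex
% Every proof is a finite string over a countable alphabet \Sigma, so the
% set of all proofs is a countable union of countable sets, hence countable:
\[
  \mathrm{Proofs} \;\subseteq\; \bigcup_{n=1}^{\infty} \Sigma^{n},
  \qquad
  \Bigl|\,\bigcup_{n=1}^{\infty} \Sigma^{n}\,\Bigr| = \aleph_0 .
\]
% By Cantor's diagonal argument, the real numbers are strictly larger:
\[
  |\mathbb{R}| = 2^{\aleph_0} > \aleph_0 ,
\]
% so a countable supply of proofs cannot exhaust an uncountable space of
% truths, which is the mismatch the paragraph above appeals to.
```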
Still, Gödel rested on relatively firm philosophical ground in that he assumed that every true thing is, well, true. But how certain are we about the truth of truth?
Gödel’s contemporary Wittgenstein was not so sure. He believed that truth was largely a figment because it rested on human language and the philosophy of meaning. Thus, whether something is true or not depends on what you think words mean.
You may think that this is an easily solved problem, because you just have to be careful and precise in your definitions, as mathematicians usually are. Yet even with the most precise definitions, a proof isn’t a natural phenomenon. It is an artificial construction. This is partly why many modern mathematicians use the word “demonstration” rather than “proof.”
Unlike a proof, which points to some kind of ideal, Platonic form, a demonstration is simply a way of showing to someone else, another mathematician for example, that something is true. It is a convincing argument.
Yet why make this distinction at all? If all mathematicians agree that an argument, such as infinite descent, shows that the square root of 2 is an irrational number, is it possible for it not to be certainly true? And if it cannot fail to be true, why not call it a proof rather than a demonstration?
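For readers who haven’t seen it, here is the infinite-descent demonstration in question, sketched in LaTeX. It is the standard textbook argument, included only for illustration.

```latex
% Suppose, for contradiction, that \sqrt{2} = p/q for positive integers p, q.
\[
  \sqrt{2} = \frac{p}{q} \;\Longrightarrow\; p^{2} = 2q^{2},
\]
% so p^2 is even, and hence p is even (an odd square is odd): write p = 2r.
\[
  (2r)^{2} = 2q^{2} \;\Longrightarrow\; q^{2} = 2r^{2},
\]
% so q is even as well: write q = 2s. Then \sqrt{2} = r/s with r < p and
% s < q, and the same argument repeats forever, yielding an impossible
% infinite descent of positive integers. Hence \sqrt{2} is irrational.
```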
The problem isn’t so much that it isn’t true but rather that it is only true within the context in which all mathematicians exist. That is the context of literate human culture.
The concept of two as a number that exists apart from any set of objects, two loaves of bread, two coins, two trees, two lengths of a measuring rod, or two people, is a product of the invention of writing, which took place only about 6000 years ago. Writing allowed people to hold on to words long enough to dissect their language into abstract components. A number could now become a noun and not merely an adjective. Likewise, colors, flavors, and other describing words could take on their own abstract identity.
The meaning of the number two thus took on an existence that, while connected to the objects it previously described, could now be considered in the general case and applied back to any set of two objects.
Now you may say that this abstraction was merely a discovery, not an invention. Just because we needed writing to discover it doesn’t mean it hasn’t existed since the beginning of time. But not everyone thinks that’s true. Indeed, one theory goes that numbers, far from being biologically given, hijack our innate sense of quantity. This is why preliterate cultures do not have words for large numbers. Numbers themselves are not useful to them, only quantities and the ratios between numbers of objects. It took literate culture to come up with a way to distinguish two similar numbers like 1001 and 1002; preliterate cultures would treat these as essentially just two large amounts.
So, suppose some catastrophe eliminated all literate human beings from the planet: would the square root of two still be irrational? The fact is that the statement itself, in the context of the surviving preliterate cultures, would be without meaning. Although those cultures may understand what two is in reference to some quantity, the idea of a root, and of the ratios of whole numbers needed to define what irrational means, would have no basis in their language. It would be nonsense, as Wittgenstein would say.
You might say that you would know it is true but since you don’t exist in that world it becomes like the tree falling in the forest. The square root of two no longer has meaning unless you place yourself there.
So from whom does that meaning come?
It must come from us, the mathematicians. It is only because of arrogance that we might believe that our way is the true way and that all other cultures are merely lacking.
Thus, the very concept of a theorem as a thing that exists apart from literate culture is much like a quantum particle existing apart from an observer. It is not something you can nail down counterfactually, because it is smeared over the space of potential meanings. Truth happens because we observe it within the context of literate culture, and at that point it appears to be absolute. Whether anything is certainly true apart from any culture is ultimately a theological question.
Both perspectives are important and interesting, but I can’t take a deep dive here on my phone. I hope to get back to the topic.
"Truth happens because we observe it within the context of literate culture and appears to be absolute at that point. Whether anything is certainly true apart from any culture is ultimately a theological question."
This article ignores a much richer way to come to the same conclusion, one that points to the science of how animal brains compute. "Literate culture" and "context" in that case do not lead to a "theological question," because there is a truth to how human brains compute ("literately" here meaning using our completely unique natural language, which also subsumes "culture"): they are neural-network predicating machines, in which any two word concepts can predicate, that is, contextualize, one another. That is how we think, and it is the only way we can ever know the world or reality. It is fundamentally statistical, not logical. That is not theological at all, as Descartes argued quite clearly, and correctly. You are kind of correct, but you are not being a good scientist about it.
With Descartes, contextualization is the brain computing truth for itself. It does so fundamentally as a statistical predication engine, one that includes ways to recognize more certain truths within a predication context (viz., an inductive context). Gödel was noting much the same thing by staying within deductive logic (where more context can always reverse truths).
The only real truth is that "you are" ("I think, therefore I am"). That must be real. But because of how we "think", that is, because of how our brains and all animal nervous systems compute, we can then reach agreement on empirical facts and logical conclusions, but only as statistical predication engines.
So, as a rather uneducated human, you are right that truth is always relative to the people who agree to it, with empirical agreement on context. And because our brains compute using the mechanisms we now know they use (based on empirical work that you are evidently unaware of), we can also understand, nearly perfectly, the limits of our knowledge of reality.
Here is the truth of how our brains compute, and proof of it as well, so far as we can possibly know it. This is the ultimate but simple algorithmic filter on truth.
https://medium.com/liecatcher/how-your-brain-computes-41ebe7428ff9
To repeat my comment: you are right, but you missed the science of why, and so you decided it was something mystical, which it isn't. Math and physics have blinded you, as they have blinded many people, to asking the right questions about truth. Descartes had it right. Within your narrow viewpoint truth is, I guess, theological. I just call it statistical.
Here is a better way to think about basic science in general, from physics to the fundamentals of language neuroscience.
https://medium.com/liecatcher/fusing-fusion-and-unfusing-fission-204aaff62de8
So God does play dice, as Einstein feared. But we are more sure of it for truth than physics is sure of it for reality, by Descartes's reasoning.
https://medium.com/liecatcher/god-doesnt-play-dice-once-but-twice-d59045fcec13