There are several positions on the objective reality of math: whether it exists "out there" in the universe, separate from human experience. The tension is this: math seems "real" in that it works and is useful, and yet it is very unlike other things in the world. Is it "real" or a human creation? It is commonly held that there are three positions one can take on the reality of math:
- Platonism (or realism) – mathematical objects are real but abstract and outside of time and space, and our ability to apprehend them in the manifest world reflects this.
- Nominalism – math is empirically derived from our experience of the world. (This account is agnostic on the question of whether abstract mathematical objects exist.)
- Fictionalism (or formalism) – math is a useful trick, employed by entities with specifically-constructed nervous systems; it is ultimately a language game that we use to make sense of the world. Some fictionalists would say this means that math is false, while others say this means it is meaningless, and/or that the property of truth or falsehood does not apply to math.
While these accounts typically address the reality of mathematical objects in general, here I'm interested in only one kind of object: integers. Indeed, if fictionalism were true for all of math, then mathematics would be, in Wigner's words, unreasonably effective. I hope this is part of a continuing discussion about the reality of mathematical objects in general, and what it means for them to be a different kind of entity, which I address at the end of the post.
- Numbers are derived from the logic of Peano arithmetic. Peano arithmetic assumes the existence of the unit; that is, by positing the successor function (the step of "one more"), it assumes that the unit exists. The idea of the "fiction of the unit" is at the core of this claim about integers being invented - even in the formal construction of numbers, the unit is simply assumed rather than derived (see the Peano sketch after this list).
- While not all numbers are integers, we do use a system of integers to represent the digits of all numbers, including transcendental ones. In fact, most numbers are not just irrational but transcendental, meaning they cannot be constructed algebraically. This means that most numbers cannot even in principle be exactly specified using integers (non-transcendental irrational numbers can be constructed algebraically from integers; transcendental numbers cannot). At least some of the numbers that describe nature, like pi and e, are transcendental. It's worth pointing out that the inability to represent transcendental numbers with integers is not merely a trivial artifact of their infinite length - despite our being unable to write pi out, the Kolmogorov complexity of pi is very much finite (see the spigot sketch after this list).
- Historically, each time there has been an innovation in numbers, it was first rejected as an absurd fiction, then accepted as a useful tool. This is well illustrated by the history of negative numbers and then irrational numbers. This is exactly the pattern we should expect if numbers are in fact a useful fiction - except for integers, since our nervous systems had this quirk built in: we never had to produce them effortfully in a rule-based system, and so we never had to get past the stage of disbelief.
- Around the turn of the nineteenth to twentieth centuries, set theory and number theory encountered difficulties, culminating most famously in Gödel's incompleteness theorems: "one cannot prove consistency within any consistent axiomatic system rich enough to include classical arithmetic." Which is to say, any consistent system rich enough to ground the integers cannot establish its own consistency. When asked what consequence we should expect in the real world if integers are a convenient fiction, this is the answer - for millennia we used units without trouble, but as soon as we attempted to ground them in logic, we ran into difficulties.
- You might object by appealing to my initial claim that if our nervous systems apprehend a thing, it must be real. But that claim is incorrect, or rather it is not clear enough about what "real" means in this context. No, it's not that numbers are a hallucination - even if you hallucinate a dog, this doesn't disprove the existence of real dogs; if you hallucinate an abstract color pattern, this doesn't disprove the existence of colors. Rather, numbers are a kind of thing that exists only as subjective experience. We experience pleasure, pain, and emotions. They are subjectively real, and they correspond to events in the world outside our nervous systems, but they cannot exist apart from a nervous system. Integers are the same category of entity.
- Expanding on instinctive human knowledge of whole numbers - it is a universal that for millennia humans have had numbering systems with units (often just one, two, and many) without any formal grounding. Some animals (including non-primate ones, e.g. crows) can count at least this high. Note that biology having evolved representational tissues (nervous systems) capable of representing mathematical objects is certainly not an argument against the reality of those entities - indeed, if we live in a universe where abstract entities unidirectionally cause things in the physical universe, you would expect some of those rules to appear in the brains of replicators evolving in that universe, which will be imbued with some innate, instrumental math sense for survival. The way we do this is with neurons, which convert the input of continuous reality into a discrete fire/don't-fire output (see the threshold sketch after this list). This is why the world seems to have discrete, countable objects when it is actually composed entirely of fuzzy gradients. Animals therefore do not even need language to delude themselves into believing in integers - this is where the "conceit" of integers is rooted, not in some philosophical mistake our ancestors made, but in the basic mechanism of how we perceive the world. (To extend integers indefinitely, we do need language.)
- If the claim that integers are an invention of the human brain is correct, then one prediction is that intelligent aliens would have a mathematics that does not use integers. Of course there are no aliens to talk to about this, but we have the next best thing: increasingly sophisticated computer systems. In at least two cases, computers have developed mathematics without integers as a primitive concept. In the first case I'm aware of, Wolfram Alpha "rediscovered" math without creating integers. Later, computer scientists showed how large language models use trigonometry to do addition (see the rotation sketch after this list). There are intriguing hints that even in our nervous systems, integers are not primitive but rather an adaptation of spatial reasoning. Gerstmann syndrome results from damage (typically a stroke) to the left parietal lobe; symptoms include left-right confusion and loss of arithmetic abilities. In OCD, there can be a sensitivity to both orientation (lines and angles) and certain numbers. Otherwise-fluent second-language speakers often must still fall back on their native language for both numbers and directions.
- I've deliberately left this point for the end of the list, as it is the most metaphysically eyebrow-raising. If there is no such thing as a unit, only gradients, then there are no boundaries in the universe, and everything is the same object. This argument has even been applied to subatomic particles (see the one-electron theory). If everything influences everything else, if there are only gradients, and if in reality there is only one unity (the universe as a whole), then the idea of a unit other than the universe itself is logically inconsistent. The unit of the universe would not actually be a unit either - if the only binary is existence and non-existence, then there is only one thing, and nothing that is not that thing. With no boundaries, it is meaningless to talk about a "unit".
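A note on the Peano point above: here is a minimal sketch, in Python (my own choice of language and encoding, not anything from the formal literature), of the Peano-style construction of the naturals. Everything is defined by recursion on two things handed in from outside: a base element and a successor step. The unit is never derived; it is assumed.

```python
# A minimal sketch of Peano-style natural numbers.
# The "unit" is assumed twice over: once as the base element Zero,
# and once as the successor step Succ(n), which is exactly "one more unit".

class Nat:
    pass

class Zero(Nat):
    def __repr__(self):
        return "0"

class Succ(Nat):
    def __init__(self, pred: Nat):
        self.pred = pred
    def __repr__(self):
        return f"S({self.pred})"

def add(a: Nat, b: Nat) -> Nat:
    """Addition defined purely by recursion on the successor."""
    if isinstance(a, Zero):
        return b
    return Succ(add(a.pred, b))

def to_int(n: Nat) -> int:
    """Count the successor applications (for display only)."""
    count = 0
    while isinstance(n, Succ):
        n, count = n.pred, count + 1
    return count

two = Succ(Succ(Zero()))
three = Succ(two)
print(add(two, three), "=", to_int(add(two, three)))  # S(S(S(S(S(0))))) = 5
```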
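On the claim that pi's Kolmogorov complexity is finite even though its digit string is not: a short, fixed-length program can emit the decimal digits of pi one after another indefinitely. The spigot sketch below uses Gibbons' unbounded spigot algorithm; the choice of algorithm is mine, purely for illustration.

```python
def pi_digits():
    """Gibbons' unbounded spigot algorithm: yields decimal digits of pi forever."""
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n
            q, r, t, k, n, l = (10 * q, 10 * (r - n * t), t, k,
                                (10 * (3 * q + r)) // t - 10 * n, l)
        else:
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)

gen = pi_digits()
print([next(gen) for _ in range(15)])  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9]
```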
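The fire/don't-fire point is, at bottom, thresholding: a neuron accumulates a continuous input and emits an all-or-nothing event when that input crosses a boundary. The threshold sketch below is a toy leaky-accumulator illustration with made-up parameter values, not a model of any particular biological circuit; it only shows how a smooth gradient comes out the other side as a countable train of identical events.

```python
import math

def spikes_from_gradient(signal, threshold=1.0, leak=0.9):
    """Turn a continuous signal into discrete all-or-nothing events:
    accumulate it with a leak, fire when a threshold is crossed, then reset."""
    v, out = 0.0, []
    for x in signal:
        v = leak * v + x          # integrate continuous input
        if v >= threshold:        # discrete decision: fire / don't fire
            out.append(1)
            v = 0.0               # reset after a spike
        else:
            out.append(0)
    return out

# A smooth, boundary-free input signal...
signal = [0.3 + 0.3 * math.sin(i / 3.0) for i in range(30)]
spikes = spikes_from_gradient(signal)
# ...becomes a sequence of discrete, countable events.
print(spikes)
print("count:", sum(spikes))
```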
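Finally, the "trigonometry to do addition" finding can be illustrated with a toy rotation sketch: represent each number as an angle on a circle, compose the two rotations, and read the sum back off the resulting angle. This is my own simplification of the general idea, in a modular-addition setting; it is not the probing method or circuit reported in the paper.

```python
import cmath

N = 100  # work modulo N; numbers live on a circle of N positions

def encode(a: int) -> complex:
    """Represent a number as a point on the unit circle (an angle)."""
    return cmath.exp(2j * cmath.pi * a / N)

def add_by_rotation(a: int, b: int) -> int:
    """Add two numbers by composing their rotations and reading off the angle."""
    z = encode(a) * encode(b)                      # rotate by a, then by b
    angle = cmath.phase(z) % (2 * cmath.pi)        # total rotation
    return round(angle * N / (2 * cmath.pi)) % N   # angle -> number

print(add_by_rotation(27, 45))  # 72
print(add_by_rotation(60, 70))  # 30, i.e. 130 mod 100
```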
Initially I set out to show that math itself was a provincial tool of human cognition, and I shrank my argument to just integers. And I do think, based on the observations here, that there is good reason to believe that integers are not a coherent mathematical object in the same sense as other abstract mathematical and logical concepts. But I'm not making a claim about other objects, not even transcendental numbers (which are most numbers); I have not attacked abstract objects in general, and I have no suspicion that they are invented in the way I think integers are. It seems to me that if those of us who would call ourselves materialists in the philosophical sense believe in both the material world and the reality of at least some mathematical and logical objects - objects which appear eternal and unchanging, outside of space and time as it were - then we are clearly dualists. Obviously this shouldn't sit right with us, which is what motivated my interest. And here I am, left in the surprising position of agreeing with the Platonists, just quibbling over which types of mathematical objects have meaning outside of a human nervous system.
The possible solutions both seem incredible:
- True Platonic dualism. These objects exist outside space and time, and they are real because they affect us, but the relationship is causally asymmetric. It is this causal asymmetry that is key to their fundamentally different nature. They are a true uncaused cause, a true prime mover: mathematical objects affect the material world, but the material world cannot affect them.
- Platonic monism. These objects are real but not causally asymmetric. They can change, and there is some kind of feedback from the universe to math and logic. It is worth recognizing that the question of whether physical constants change over time and space remains an open one, with some evidence in its favor (see, for instance, Murphy et al. 2003 for evidence of change in the fine-structure constant over time, evidence that has stood up to scrutiny for over two decades now). But this is more fundamental: how can the nature of a triangle change? Or the nature of identity? The spatial dimensions? One interesting version of Platonic monism is the one advanced by Tegmark, who makes the "hyper-Platonist" argument that there are only mathematical objects. (Notably, Tegmark is the senior author on the LLM paper I linked to above.)