# Identity of indiscernibles

The identity of indiscernibles is an ontological principle that states that there cannot be separate objects or entities that have all their properties in common. That is, entities x and y are identical if every predicate possessed by x is also possessed by y and vice versa; to suppose two things indiscernible is to suppose the same thing under two names. It states that no two distinct things (such as snowflakes) can be exactly alike, but this is intended as a metaphysical principle rather than one of natural science. A related principle is the indiscernibility of identicals, discussed below.

A form of the principle is attributed to the German philosopher Gottfried Wilhelm Leibniz. It is one of his two great metaphysical principles, the other being the principle of sufficient reason. Both are famously used in his arguments with Newton and Clarke in the Leibniz–Clarke correspondence. Because of its association with Leibniz, the principle is sometimes known as Leibniz's law. (However, the term "Leibniz's Law" is also commonly used for the converse of the principle, the indiscernibility of identicals [described below], which is logically distinct and not to be confused with the identity of indiscernibles.)

Some philosophers have decided, however, that it is important to exclude certain predicates (or purported predicates) from the principle in order to avoid either triviality or contradiction. An example (detailed below) is the predicate expressing whether an object is identical to x (often considered a valid predicate). As a consequence, there are a few different versions of the principle in the philosophical literature, of varying logical strength, and some of them are termed "the strong principle" or "the weak principle" by particular authors in order to distinguish between them.[1]

Willard Van Orman Quine thought that the failure of substitutivity in intensional contexts (e.g., "Sally believes that p" or "It is necessarily the case that q") shows that modal logic is an impossible project.[2] Saul Kripke holds that this failure may be the result of the use of the disquotational principle implicit in these proofs, and not a failure of substitutivity as such.[3]

The identity of indiscernibles has been used to motivate notions of noncontextuality within quantum mechanics.

Associated with this principle is also the question as to whether it is a logical principle, or merely an empirical principle.

## Identity and indiscernibility

There are two principles here that must be distinguished (equivalent versions of each are given in the language of the predicate calculus).[1] Note that both are second-order expressions; neither principle can be expressed in first-order logic.

1. The indiscernibility of identicals
• For any x and y, if x is identical to y, then x and y have all the same properties.
${\displaystyle \forall x\,\forall y\,[x=y\rightarrow \forall P(Px\leftrightarrow Py)]}$
2. The identity of indiscernibles
• For any x and y, if x and y have all the same properties, then x is identical to y.
${\displaystyle \forall x\,\forall y\,[\forall P(Px\leftrightarrow Py)\rightarrow x=y]}$

Principle 1 does not entail reflexivity of = (or any other relation R substituted for it), but the two properties together entail symmetry and transitivity. Therefore, Principle 1 plus reflexivity is sometimes used as a (second-order) axiomatization of the equality relation.
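The entailment can be sketched by instantiating the quantified predicate P; the following derivation assumes only Principle 1 and the reflexivity of =:

```latex
% Symmetry: suppose a = b. Instantiate Principle 1 with P(z) := (z = a):
\text{from } a = b:\quad (a = a) \leftrightarrow (b = a),
\ \text{and } a = a \text{ holds by reflexivity, so } b = a.

% Transitivity: suppose a = b and b = c. Instantiate Principle 1
% (applied to a = b) with P(z) := (z = c):
\text{from } a = b:\quad (a = c) \leftrightarrow (b = c),
\ \text{and } b = c \text{ holds by hypothesis, so } a = c.
```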

Principle 1 is taken to be a logical truth and (for the most part) uncontroversial.[1] Principle 2, on the other hand, is controversial; Max Black famously argued against it, but the argument is fatally flawed from the standpoint of finite mathematics and its established experimental support in modern physics (see Critique, below).

The above formulations are not satisfactory, however: the second principle should be read as having an implicit side-condition excluding any predicates that are equivalent (in some sense) to any of the following:[citation needed]

1. "is identical to x"
2. "is identical to y"
3. "is not identical to x"
4. "is not identical to y"

If all such predicates are included, then the second principle as formulated above can be trivially and uncontroversially shown to be a logical tautology: if x is non-identical to y, then there will always be a putative "property" that distinguishes them, namely "being identical to x".

On the other hand, it is incorrect to exclude all predicates that are materially equivalent (i.e., contingently equivalent) to one or more of the four given above. If this is done, the principle says that in a universe consisting of two non-identical objects, because all distinguishing predicates are materially equivalent to at least one of the four given above (in fact, they are each materially equivalent to two of them), the two non-identical objects are identical—which is a contradiction.
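The contradiction can be checked mechanically. In the sketch below, predicates are modeled extensionally as subsets of a hypothetical two-object domain {a, b} (the object names are illustrative only); every predicate that distinguishes the two objects turns out to have the same extension as one of the excluded identity-involving predicates:

```python
from itertools import combinations

# A hypothetical two-object universe; predicates are modeled
# extensionally, as subsets of the domain.
domain = {"a", "b"}

def all_predicates(dom):
    """Enumerate every possible predicate extension over the domain."""
    elems = sorted(dom)
    for r in range(len(elems) + 1):
        for subset in combinations(elems, r):
            yield frozenset(subset)

# Extensions of the four excluded identity-involving predicates.
# Note that extensionally they collapse into two sets:
excluded = {
    frozenset({"a"}),  # "is identical to a" (same extension as "is not identical to b")
    frozenset({"b"}),  # "is identical to b" (same extension as "is not identical to a")
}

# Every predicate that distinguishes a from b is materially equivalent
# (has the same extension) to an excluded predicate; in fact, as the
# text notes, to two of the original four.
distinguishing = [ext for ext in all_predicates(domain)
                  if ("a" in ext) != ("b" in ext)]
assert all(ext in excluded for ext in distinguishing)
```

Excluding all such materially equivalent predicates thus leaves no predicate at all that could distinguish the two objects, which is the contradiction described above.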

## Critique

### Symmetric universe

Max Black has argued against the identity of indiscernibles by counterexample. To show that the identity of indiscernibles is false, it suffices to provide a model in which there are two distinct (numerically nonidentical) things that have all the same properties. He claimed that in a symmetric universe wherein only two symmetrical spheres exist, the two spheres are two distinct objects even though they have all their properties in common.[4]

Black's argument appears significant because it purports to show that even relational properties (properties specifying distances between objects in space-time) fail to distinguish two such objects in a symmetrical universe. Per his argument, the two objects are, and will remain, equidistant from the universe's plane of symmetry and from each other. Even bringing in an external observer to label the two spheres distinctly does not solve the problem, because doing so violates the symmetry of the universe.

However, Black's argument is actually fallacious, because the model he offers must be logically consistent; otherwise a "counterexample" could be offered even against the principle that a counterexample can prove something false. Specifically, in any universe there must be a finite minimum unit, a quantum, at whose level the concepts of symmetry and asymmetry become meaningless. Above this level, there is always asymmetry by at least that quantum.

Without a quantum level, we are left asking when we have eliminated the last infinitesimal of asymmetry, which corresponds directly to deciding whether infinity is an odd or an even number. By eliminating the quantum from his model universe, Black has removed his argument from the realm of logic.

This point appears to have been well understood by Leibniz, for even were Newton to be credited with originating every other aspect of the calculus as we have it today, Leibniz's introduction of differential notation is undisputed. Its importance, beyond simplifying practice, is that it makes clear that differentiation and integration involve linear-limit generic quanta, not meaningless division-like ${\displaystyle 0/0}$ or multiplication-like ${\displaystyle \infty \times 0}$ operations.

To clarify the implication of requiring a quantum level, consider our own physical universe in all its directly and indirectly perceived reality. Because of the curvature of space-time above the quantum level, Black's spherical objects occupy different locations and so do not have exactly the same properties; in fact, neither can even be a perfect sphere. At the quantum level, his two spheres can indeed be indiscernible, but because of the entanglement implicit in quantum theory, they lose their individual identities into a single atomic object.

To summarize: above the quantum level, Black's spheres are discernible because they are not identical; at the quantum level, they are indeed indiscernible, but they are also identically one. Either way, Black's hypothetical model remains merely imaginary.

To be more specific, using an example given by Steven Weinberg in his Dreams of a Final Theory: were one to place two chambers of volume V side by side, each holding gas at pressure P (say oxygen and chlorine) and then open a window between them, the gases would diffuse into each other's chamber. With the proper mechanism placed in that window, one can actually extract energy from this mutual diffusion, up to ${\displaystyle 2PV\ln 2}$, and one could run a real light bulb on it for the corresponding number of watt-hours. Were both chambers to contain exactly the same gas, say oxygen, we would seem to have a real bonanza: no matter how much the gases diffuse into each other, each chamber retains the original volume, pressure, and purity of the original gas species. This would imply that we have found a limitless energy source. Unfortunately, this is not what happens in reality.

When we (statistically) exchange oxygen molecule A for chlorine molecule B in the first case, we really do make an exchange. In the second case, however, there are no independent oxygen molecules A and B, because their indiscernibility makes them identical: a single object. In the hypothetical "mutual diffusion," nothing has been exchanged, and so no energy has been released.
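Under the standard ideal-gas assumption, the maximum isothermal work from mixing two distinguishable gases follows from the Gibbs entropy of mixing. The function below is an illustrative sketch of that textbook result, not anything drawn from Weinberg's text:

```python
import math

def max_mixing_work(P, V, distinguishable):
    """Maximum isothermal work extractable when two chambers, each of
    volume V at pressure P, are allowed to mix (ideal-gas assumption).

    For two distinguishable gases, each expands from V to 2V, giving
    W = 2 * P * V * ln 2 (temperature times the Gibbs entropy of mixing).
    For one and the same gas, the molecules are indiscernible: nothing
    macroscopic changes, and no work can be extracted.
    """
    if distinguishable:
        return 2.0 * P * V * math.log(2.0)
    return 0.0

# Oxygen vs. chlorine: about 1.39 * P * V of work is available.
work_mixed = max_mixing_work(101325.0, 1.0, True)   # 1 m^3 chambers at 1 atm
# Oxygen vs. oxygen: nothing to extract (the "Gibbs paradox" case).
work_same = max_mixing_work(101325.0, 1.0, False)
```

The discontinuity between the two cases, rather than a gradual falloff as the gases become more similar, is exactly what indiscernibility contributes to the resolution of the paradox.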

In fact, it appears that exactly such thinking as Black's led to the various paradoxes of 19th-century thermodynamics that were finally corrected by quantum theory. Further, this appears to be the thinking of Einstein, Podolsky, and Rosen in their 1935 (EPR) paper,[5] which challenged the foundations of quantum theory through a thought experiment. EPR, too, was shown to be incorrect, in this case by a set of analogous, actual experiments carried out in the early 1970s by John Clauser and Stuart Freedman of the University of California,[6] and by many research teams since.

As a final point here, the Aspect et al. experiments generated and measured circularly polarized photon pairs traveling in opposite directions. In keeping with conservation of angular momentum (to which circular polarization corresponds), whenever a component of one photon's polarization was measured at one location several meters from the point of origin and the same component was measured across the laboratory, the two results were opposite. Which component was measured at a given station, however, was random. On a classical picture, only the subset of cases with different component measurements should have looked statistically random; in fact, the full data set, including the matched-component cases, was statistically random at each station. The implication, as predicted by quantum theory, was that a specific value of each photon's circular polarization did not exist prior to measurement; only the pair's sum of zero did, and when one photon was measured, fixing its value, the other photon had to take the opposite value.
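The statistics just described can be illustrated with a toy simulation in which no photon carries a polarization value before measurement and only the pair's zero sum is fixed. The trial count and helper names below are illustrative assumptions, and the sketch does not model the Bell-inequality tests themselves:

```python
import random

def run_trials(n, seed=0):
    """Toy model of the pair statistics: neither photon carries a
    polarization value before measurement; only the pair's zero total
    is fixed.  When both stations happen to measure the same component,
    opposite outcomes are created; otherwise the outcomes are
    independent."""
    rng = random.Random(seed)
    records = []
    for _ in range(n):
        setting_a = rng.choice(("x", "y"))  # component chosen at station A
        setting_b = rng.choice(("x", "y"))  # component chosen at station B
        outcome_a = rng.choice((-1, +1))    # value created only at measurement
        if setting_a == setting_b:
            outcome_b = -outcome_a          # conservation: the pair sums to zero
        else:
            outcome_b = rng.choice((-1, +1))
        records.append((setting_a, setting_b, outcome_a, outcome_b))
    return records

records = run_trials(100_000)
# Each station's data alone looks like fair coin flips...
mean_a = sum(oa for _, _, oa, _ in records) / len(records)
assert abs(mean_a) < 0.02
# ...yet matched-component pairs are perfectly anticorrelated.
assert all(oa == -ob for sa, sb, oa, ob in records if sa == sb)
```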

Since the photons travel in opposite directions, it is as though they were Black's symmetric spheres emerging in opposite directions from the plane of symmetry with no independent property of spinning at all. Neither one spun in one direction or the other (nor did they fail to spin); yet their single whole had the property of "equal spin." However, from this mirror-image standpoint, the instant this "perfect symmetry" was broken by a measurement "observer", both spheres would indeed be independently spinning, one way or the other, but whichever way, both in that same way. So whatever label the observer put on the one sphere, the other sphere already had the same label.

In short, during their "perfect symmetry" the photons were not only indiscernible: since they shared a single spin property while neither individually had any spin property at all, they existed only as a single whole.

### Indiscernibility of identicals

As stated above, the principle of indiscernibility of identicals—that if two objects are in fact one and the same, they have all the same properties—is mostly uncontroversial. However, one famous application of the indiscernibility of identicals was by René Descartes in his Meditations on First Philosophy. Descartes concluded that he could not doubt the existence of himself (the famous cogito argument), but that he could doubt the existence of his body.

This argument is criticized by some modern philosophers on the grounds that it allegedly derives a conclusion about what is true from a premise about what people know. What people know or believe about an entity, they argue, is not really a characteristic of that entity. A response may be that the argument in the Meditations on First Philosophy is that the inability of Descartes to doubt the existence of his mind is part of his mind's essence. One may then argue that identical things should have identical essences.[7]

Numerous counterexamples are given to debunk Descartes' reasoning via reductio ad absurdum, such as the following argument based on a secret identity:

1. Entities x and y are identical if and only if any predicate possessed by x is also possessed by y and vice versa.
2. Clark Kent is Superman's secret identity; that is, they're the same person (identical) but people don't know this fact.
3. Lois Lane thinks that Clark Kent cannot fly.
4. Lois Lane thinks that Superman can fly.
5. Therefore Superman has a property that Clark Kent does not have, namely that Lois Lane thinks that he can fly.
6. Therefore, Superman is not identical to Clark Kent.[8]
7. Since in proposition 6 we come to a contradiction with proposition 2, we conclude that at least one of the premises is wrong. Either:
• Leibniz's law is wrong; or
• A person's knowledge about x is not a predicate of x; or
• The application of Leibniz's law is erroneous; the law is only applicable in cases of monadic, not polyadic, properties; or
• What people think about are not the actual objects themselves; or
• A person is capable of holding conflicting beliefs.
Any of which will undermine Descartes' argument.[3]
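The option that a person's knowledge about x is not a predicate of x can be sketched in code: below, beliefs are keyed by name (an intensional context) rather than by referent, so substituting co-referring names fails. The data structures are purely illustrative:

```python
# Two names, one referent: Clark Kent is Superman (premise 2).
referent = {"Clark Kent": "person_1", "Superman": "person_1"}

# Lois's beliefs attach to names (an intensional context), not to the
# person those names pick out (premises 3 and 4).
lois_believes_can_fly = {"Clark Kent": False, "Superman": True}

def naive_property(name):
    """Treat "Lois believes __ can fly" as if it were a property of the
    referent itself: the move made in step 5 of the argument."""
    return lois_believes_can_fly[name]

# The two names co-refer...
assert referent["Clark Kent"] == referent["Superman"]
# ...yet the "property" differs, reproducing the contradiction of
# step 6: substitution of co-referring names fails inside a belief
# context, so belief ascriptions are not genuine predicates of x.
assert naive_property("Clark Kent") != naive_property("Superman")
```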