2. Bell Inequalities Revisited
In “On the Einstein–Podolsky–Rosen paradox” [16] (Chapter 2), Bell purported to show that local hidden variables, as proposed by Einstein, Podolsky and Rosen [17], could not reproduce certain results predicted by quantum mechanics. In particular, he imagined a pair of spin-$\tfrac{1}{2}$ particles created as an entangled pair and moving in opposite directions to a pair of detectors operated by two independent observers, A and B. Each is free to measure the spin component along a direction of their choice, independent of one another. Thus, observer A measures $\boldsymbol{\sigma}_1 \cdot \mathbf{a}$, while observer B measures $\boldsymbol{\sigma}_2 \cdot \mathbf{b}$. Bell then assumed the existence of a very general form of hidden variable, which he represented as $\lambda$, and which could represent a single variable or multiple variables, functions of one or more variables, deterministic or stochastic. Bell assumed that the measurement obtained by observer A of $\boldsymbol{\sigma}_1 \cdot \mathbf{a}$ is a function of $\mathbf{a}$ and $\lambda$, denoted $A(\mathbf{a}, \lambda)$, and, likewise, for observer B measuring $\boldsymbol{\sigma}_2 \cdot \mathbf{b}$, denoted $B(\mathbf{b}, \lambda)$. Bell further assumed that the result $A$ is independent of $\mathbf{b}$, and that of $B$ is independent of $\mathbf{a}$. Since we are dealing with spin measurements, $A(\mathbf{a}, \lambda) = \pm 1$ and $B(\mathbf{b}, \lambda) = \pm 1$, and since these particles are entangled, it follows that $A(\mathbf{a}, \lambda) = -B(\mathbf{a}, \lambda)$.
Assume that the hidden variable, $\lambda$, is an element of some measure space $(\Lambda, \Sigma, \mu)$, where $\Lambda$ is the set of variables, $\Sigma$ is a set of measurable subsets of $\Lambda$ and $\mu$ is a measure on $\Lambda$. The expectation value of the product of the two measurements,
$$P(\mathbf{a}, \mathbf{b}) = \int_\Lambda A(\mathbf{a}, \lambda)\, B(\mathbf{b}, \lambda) \, d\mu(\lambda),$$
should then take the value
$$P(\mathbf{a}, \mathbf{b}) = -\mathbf{a} \cdot \mathbf{b}$$
according to quantum mechanics.
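As a concrete check, the quantum prediction can be compared numerically with a simple local hidden-variable model. The following sketch is illustrative only; the sign-projection model below is a standard toy model and is not taken from the text.

```python
# Illustrative sketch: comparing the quantum singlet correlation
# P(a, b) = -cos(theta) with a toy local hidden-variable model in which
# lambda is a uniformly random angle and each detector reports the sign
# of the spin projection along its setting (outcomes +/-1, anti-correlated).
import numpy as np

rng = np.random.default_rng(0)

def hv_correlation(theta: float, n: int = 200_000) -> float:
    """Monte Carlo estimate of <A(a)B(b)> for settings theta apart."""
    lam = rng.uniform(0.0, 2 * np.pi, n)   # hidden variable
    A = np.sign(np.cos(lam))               # observer A, setting at angle 0
    B = -np.sign(np.cos(lam - theta))      # observer B, setting at angle theta
    return float(np.mean(A * B))

for theta in (0.0, np.pi / 4, np.pi / 2):
    print(f"theta={theta:.2f}  hidden-variable={hv_correlation(theta):+.3f}"
          f"  quantum={-np.cos(theta):+.3f}")
# The two agree at 0 and pi/2 but differ in between (linear versus cosine
# dependence); that gap is what the Bell inequality makes precise.
```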
However, there is a problem with this presumption of hidden variables, since it is possible to form an inequality (for example, the CHSH inequality) which, it is argued, must be satisfied by any form of hidden variable, but which is violated by quantum mechanics. A standard argument for the CHSH inequality is to consider the function
$$F(a, a', b, b') = ab + ab' + a'b - a'b'$$
of four variables, where the range of values for each variable is $[-1, 1]$. In [18], Shimony presents an argument derived from Mermin showing that this function must take values within $[-2, 2]$. Clearly, this is a multilinear function (linear in each variable separately) defined on the hypercube $[-1, 1]^4$. Being multilinear, it must take its extreme values on the boundary of the hypercube, in particular, at the corners $(\pm 1, \pm 1, \pm 1, \pm 1)$. The formula may be rewritten in the form
$$F = a(b + b') + a'(b - b').$$
On the corners, the value of $b + b'$ must be either $0$ or $\pm 2$, and a simple check shows that the value of $b - b'$ must then be $\pm 2$ or $0$, respectively. Hence the maximum value of $|F|$ must be $2$.
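This corner-wise argument is easy to verify exhaustively; a minimal sketch (added here for illustration):

```python
# Illustrative sketch: exhaustive check that F = ab + ab' + a'b - a'b'
# lies in [-2, 2] on the corners (+/-1)^4; by multilinearity, the same
# bound then holds on the whole hypercube [-1, 1]^4.
from itertools import product

values = [a * b + a * b1 + a1 * b - a1 * b1
          for a, a1, b, b1 in product((-1, 1), repeat=4)]
print(min(values), max(values))  # prints: -2 2
```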
Therefore, if there exists a probability measure, $\mu$, on the hypercube $[-1, 1]^4$, then
$$\left| \int F(a, a', b, b') \, d\mu \right| \le 2.$$
Here, the CHSH formula can be written as
$$\langle ab \rangle + \langle ab' \rangle + \langle a'b \rangle - \langle a'b' \rangle,$$
and it follows immediately that
$$\left| \langle ab \rangle + \langle ab' \rangle + \langle a'b \rangle - \langle a'b' \rangle \right| \le 2.$$
It is well known that in the setting of quantum mechanics, this inequality is, in fact, violated. So, how can this happen? The mathematics is quite clear and straightforward. It is simply the case that if one has a set of variables on the 4-dimensional hypercube $[-1, 1]^4$, and forms the function $F$, and there exists a suitable measure, $\mu$, then the value of the integral will be bound to the interval $[-2, 2]$. Since the form of the function is fixed, the only assumption that can be challenged is the existence of the measure, $\mu$. Note that in the derivation above, there is no mention of causation, contextuality, non-commutativity or locality. The derivation is purely mathematical. The limitation arises because of the form of the function $F$, where the individual terms associate the variables.
If the individual terms are allowed to disassociate, to be independent of one another, then the inequality can be violated. Consider the function
$$F'(a_1, b_1, a_2, b_2, a_3, b_3, a_4, b_4) = a_1 b_1 + a_2 b_2 + a_3 b_3 - a_4 b_4$$
on $[-1, 1]^8$, where the variables are independent of one another. Then, this function clearly takes values in the range $[-4, 4]$. Suppose that we attempt to force $F$ to take the value $4$. This will obviously occur if the first three terms are positive and the fourth term ($a'b'$) is negative. Then, $\operatorname{sign}(a) = \operatorname{sign}(b)$, $\operatorname{sign}(b) = \operatorname{sign}(a')$ and $\operatorname{sign}(a) = \operatorname{sign}(b')$ from the first three terms, which implies that $\operatorname{sign}(a') = \operatorname{sign}(b')$, so that the fourth term cannot be negative. Conversely, the function will take a value of $-4$ if $ab = ab' = a'b = -1$ and $a'b' = 1$. In this case, $\operatorname{sign}(a) = -\operatorname{sign}(b)$, $\operatorname{sign}(b) = -\operatorname{sign}(a')$ and $\operatorname{sign}(a) = -\operatorname{sign}(b')$ from the first three terms, which implies that $\operatorname{sign}(a) = -\operatorname{sign}(b) = \operatorname{sign}(a') = -\operatorname{sign}(b')$, so that the fourth term cannot be positive. Thus, $F$ cannot take these extremal values $\pm 4$. The problem is that each term in $F$ is a product of two variables, effectively correlating or entangling them.
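The contrast between the coupled and disassociated functions can also be checked by brute force; an illustrative sketch, using my own variable names:

```python
# Illustrative sketch: the coupled CHSH function F, with shared variables,
# never exceeds 2 in absolute value, while the disassociated function F',
# with eight independent variables, attains +/-4.
from itertools import product

coupled = max(a * b + a * b1 + a1 * b - a1 * b1
              for a, a1, b, b1 in product((-1, 1), repeat=4))
free = max(a1 * b1 + a2 * b2 + a3 * b3 - a4 * b4
           for a1, b1, a2, b2, a3, b3, a4, b4 in product((-1, 1), repeat=8))
print(coupled, free)  # prints: 2 4
```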
The only way to disentangle them is by introducing additional factors, and since we are interested in the integral of $F$ over the hypercube, one approach is to introduce different measures for different terms. This changes the identity of the random variables associated with the terms, echoing a point made by Dzhafarov [19] that contextuality is about the identity of random variables. The assumption that classical random variables should always possess a joint distribution is just that: an assumption. It is not true in general. It has been known since the time of Kolmogorov that such assumptions are not universally valid [20]. Vorob'ev made this quite explicit in 1962 [21] when he determined the exact conditions that must be met in order for a joint probability measure to exist for a collection of random variables. His arguments are entirely classical and lie within the framework of Kolmogorov probability theory. No appeal to non-Kolmogorovian, quantum mechanical probabilities is necessary. It is simply a fact of both the classical and quantum worlds that joint probabilities need not exist for an arbitrary collection of random variables.
Dzhafarov has extended the usual Kolmogorov framework to take into account, more formally, the effect that context has on random variables. In the contextuality-by-default approach of Dzhafarov and Kujala [22], each random variable is identified by the property, $q$, that it measures and the context, $c$, within which it is measured, and so, each random variable is denoted as $R_q^c$, and a collection of random variables can be organized in the form of a matrix. The CHSH situation can be viewed as a 4-cyclic system, that is, as a system of four random variables that can be arranged in an array having the following form:

| | $q_1 = a$ | $q_2 = b$ | $q_3 = a'$ | $q_4 = b'$ |
|---|---|---|---|---|
| $c^1$ | $R_1^1$ | $R_2^1$ | | |
| $c^2$ | | $R_2^2$ | $R_3^2$ | |
| $c^3$ | | | $R_3^3$ | $R_4^3$ |
| $c^4$ | $R_1^4$ | | | $R_4^4$ |
Dzhafarov calls each row in the matrix a bunch and each column a connection. The random variables in each bunch have the same context and, thus, can be presumed to possess a joint distribution. The random variables in each connection have different contexts, and it cannot be assumed, a priori, that they possess a joint distribution. In most cases, they will not. If they do, they are called consistently connected. If they do not, they are called inconsistently connected. If we assume that the random variables lie in the range $[-1, 1]$, we may determine expectation values within each context, that is,
$$\langle R_i^c R_j^c \rangle = \int R_i^c R_j^c \, d\mu^c,$$
where $i$ and $j$ index the properties measured within context $c$, and $\mu^c$ is the probability measure for the joint probability distribution of the random variables in that context.
Consider the following formula:
$$F_c = \langle R_1^1 R_2^1 \rangle + \langle R_2^2 R_3^2 \rangle - \langle R_3^3 R_4^3 \rangle + \langle R_4^4 R_1^4 \rangle.$$
If we ignore the context, we have
$$F_c = \langle ab \rangle + \langle a'b \rangle - \langle a'b' \rangle + \langle ab' \rangle,$$
which is precisely the CHSH formula, so this formula appears to suitably generalize the CHSH formula to include context. If there is a joint probability measure, $\mu$, for all of the random variables in the formula, which effectively means that context may be ignored, then the analysis for the CHSH formula applies to this formula as well (since they become equivalent, as seen above), and so it follows that
$$|F_c| \le 2.$$
In the event that a joint probability measure such as $\mu$ does not exist, then it is incorrect to integrate over the function $F$ as a whole. Instead, we must integrate each term separately, using the joint probability distribution appropriate for each pair of random variables. In this case, we obtain
$$\left| \langle R_i^c R_j^c \rangle \right| = \left| \int R_i^c R_j^c \, d\mu^c \right| \le 1$$
for each term. This is the same situation as for the function $F'$, since each integral extremizes to $\pm 1$, so that
$$|F_c| \le 4.$$
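To make the difference concrete, here is a small sketch (my own construction, not from the original): with a separate measure per context, each term is extremized independently, while any single joint assignment caps the sum at 2.

```python
# Illustrative sketch: with a separate measure per context, each CHSH term
# can be extremal, giving F_c = 4; under any single joint distribution over
# (a, b, a', b'), the same expression is capped at 2 (deterministic
# assignments suffice to find the cap, by linearity of expectation).
from itertools import product

# Extremal per-context correlations: <ab> = <a'b> = <ab'> = 1, <a'b'> = -1.
F_contextual = 1 + 1 - (-1) + 1
print(F_contextual)  # 4

F_joint = max(a * b + a1 * b - a1 * b1 + a * b1
              for a, b, a1, b1 in product((-1, 1), repeat=4))
print(F_joint)  # 2
```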
Let me emphasize again that there is no requirement for non-local influences or for quantum mechanical probabilities for this to be true. The only requirement is that the joint probabilities be contextual.
In the application of the CHSH inequality to the case of entangled particles, the random variables involved are consistently connected, whereas in most classical settings, they are inconsistently connected. One might assume that consistently connected random variables, having the same probability measures across contexts, should therefore admit joint distributions across those contexts. In the next section, I shall present a simple example in a classical setting in which the random variables are consistently connected, but the system is still contextual, and a CHSH inequality is maximally violated. The case of consistent connectedness appears to violate our intuitions because random variables that appear the same tempt one to assume that the same conditions underlying their generation pertain across contexts, and so that a joint distribution formed of simple products holds across all contexts. This failure of intuition is not due to the presence of non-local influences but to a failure to know the actual mechanism underlying the generation of the random variables. In such a case, we say that the marginal probabilities are degenerate, since they do not specify a unique joint distribution but instead may arise from multiple, distinct joint distributions. If we do not know beforehand the correct joint distribution for the system under study, then we must correct for our ignorance.
In the case of a Bell situation, we have an entangled pair of particles described by a wave function of the form
$$|\Psi\rangle = \frac{1}{\sqrt{2}} \left( |0\rangle_1 |1\rangle_2 + |1\rangle_1 |0\rangle_2 \right).$$
From the vantage point of Observer 1, the marginal probabilities are determined by projecting onto each possible state, for example, for state $|0\rangle_1$,
$$\left\| \langle 0|_1 \Psi \rangle \right\|^2 = \frac{1}{2},$$
yielding a probability of 1/2, and likewise for the $|1\rangle_1$ state. Thus, the probability distribution for Observer 1 will be (1/2, 1/2), and similarly for Observer 2. The joint probability is given by

| | $|0\rangle_2$ | $|1\rangle_2$ |
|---|---|---|
| $|0\rangle_1$ | 0 | 1/2 |
| $|1\rangle_1$ | 1/2 | 0 |

These marginal probabilities are the same as for the situation of two free particles, the wave function of which is
$$|\Phi\rangle = \frac{1}{\sqrt{2}} \left( |0\rangle_1 + |1\rangle_1 \right) \otimes \frac{1}{\sqrt{2}} \left( |0\rangle_2 + |1\rangle_2 \right).$$
However, the joint probability in this case is

| | $|0\rangle_2$ | $|1\rangle_2$ |
|---|---|---|
| $|0\rangle_1$ | 1/4 | 1/4 |
| $|1\rangle_1$ | 1/4 | 1/4 |

so that in the case of two entangled particles, the marginal probabilities associated with each particle, for each observer, are degenerate: they simply do not convey enough information about the mechanism underlying their generation.
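The marginal degeneracy is easy to exhibit numerically; an illustrative sketch, with the basis ordering and normalization as given above:

```python
# Illustrative sketch: the entangled state (|01> + |10>)/sqrt(2) and the
# product state [(|0> + |1>)/sqrt(2)] (x) [(|0> + |1>)/sqrt(2)] have the
# same (1/2, 1/2) marginals for each observer but different joints.
import numpy as np

states = {
    "entangled": np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2),  # |00>,|01>,|10>,|11>
    "product": np.array([1.0, 1.0, 1.0, 1.0]) / 2.0,
}
for name, psi in states.items():
    joint = (np.abs(psi) ** 2).reshape(2, 2)  # rows: observer 1; cols: observer 2
    print(name, "joint:", joint.tolist())
    print("  marginal 1:", joint.sum(axis=1).tolist(),
          " marginal 2:", joint.sum(axis=0).tolist())
```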
When arbitrary joint distributions do not exist, we can say that the system exhibits Type I contextuality or what Dzhafarov has termed contextuality by default. Type II contextuality (what Dzhafarov has termed "true contextuality") can be detected using a generalization of the CHSH inequality. The CHSH inequality itself suffices when the random variables are consistently connected. When the random variables are inconsistently connected, this Type I contextuality must be compensated for. Dzhafarov, Zhang and Kujala [22] argued for a more general inequality, given here for cyclic systems of order $n$. This inequality is
$$s_{\text{odd}}\left( \langle R_1^1 R_2^1 \rangle, \langle R_2^2 R_3^2 \rangle, \ldots, \langle R_n^n R_1^n \rangle \right) - (n - 2) - \sum_{i=1}^{n} \left| \langle R_i^{i-1} \rangle - \langle R_i^{i} \rangle \right| \le 0,$$
with indices taken cyclically (so that $R_1^0 = R_1^n$), where $s_{\text{odd}}$ means the maximum taken over all combinations of $\pm$ signs on the terms, such that the number of minus signs is always odd. The first term is the CHSH-type term, the second compensates for the degree of cyclicity, while the third term compensates for inconsistent connectedness.
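A direct implementation of this criterion is straightforward; a sketch follows (the function names are mine, not from the literature):

```python
# Illustrative sketch of the cyclic-system criterion: a system of rank n is
# contextual when s_odd(correlations) exceeds (n - 2) plus the inconsistent-
# connectedness term (the sum of differences of each property's expectations
# across its two contexts).
from itertools import product

def s_odd(values):
    """Maximum of +/- combinations of values with an odd number of minuses."""
    return max(sum(s * v for s, v in zip(signs, values))
               for signs in product((-1, 1), repeat=len(values))
               if signs.count(-1) % 2 == 1)

def is_contextual(pair_corrs, connection_expectations):
    """pair_corrs: the n within-context correlations <R_i R_{i+1}>.
    connection_expectations: each property's expectations in its two contexts."""
    n = len(pair_corrs)
    delta = sum(abs(x - y) for x, y in connection_expectations)
    return s_odd(pair_corrs) > (n - 2) + delta

# CHSH-type case: extremal correlations, consistently connected (delta = 0).
print(is_contextual([1, 1, 1, -1], [(0.0, 0.0)] * 4))  # True
```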
It is often asserted that Type II or true contextuality is unique to quantum mechanics. However, Dzhafarov and colleagues have demonstrated the existence of true contextuality in two experiments [23,24,25], as have other authors [26,27,28]. In addition, a simple thought experiment involving ice cream preferences shows the violation of even the Tsirel'son bound under ideal conditions [6]. In none of these cases are superluminal influences involved; the violations result from contextual effects. Several authors, most notably Khrennikov [12,13] and Dzhafarov [14,19,29], have argued that the issue is not the presence of non-locality, but rather the presence of contextuality. As mentioned above, Dzhafarov has argued that the problem lies with the identity of random variables and their tendency to change in the presence of different contexts [19]. Unfortunately, their arguments seem to be lost on mainstream physicists. The argument above is presented in the hope that its transparency might put an end to the debate about non-locality and show that the issue is one of the presence or absence of contextuality, which is not unique to quantum mechanics, but can occur classically as well. It is about the dynamics of the system, not its scale or some other arbitrary classical–quantum distinction [7].
3. The Problem of Worldviews
In [5], I argued that the failure to account for the presence of (implicit) biases derived from shared worldviews led to assertions concerning the non-existence of time. The main arguments in support of this position all suffer from one of three logical failures: begging the question, the fallacy of misplaced concreteness or the fallacy of misplaced omniscience. The tension in interpreting quantum mechanics arises, in part, from the conflict between the more classical, objectivist worldview, whose central entities are objects, and the ideas of emergence, contextuality, non-commutativity and non-separability, which are more in keeping with a processist worldview, whose central entities are processes (in the sense of Whitehead [30]). In the processist worldview, entities are generated; they happen, they become. They exist for a time, then fade away. Objects, on the other hand, at least in ideal form, simply exist; they are eternal, as are their properties, at least until some interaction results in a change. Objects are ideal for study using mathematics and propositional logic, since their entities of study are all ideal objects.
The core attributes of an object are the following:
It exists independent of any other entity—it can be isolated and treated as a whole unto itself.
It is eternal—it does not become, it merely is.
It is passive—it reacts, it does not act.
Its properties are intrinsic and non-contextual—they are fixed, complete and independent of the actions of any other entity.
Its motion is determined by fixed laws, which may be deterministic or stochastic (usually explained away as due to ignorance on the part of the observer).
Its motion is often attributed to variational principles—optimality, minimal and maximal—always extremized in some direction.
Its interactions with other objects are always local.
History is irrelevant—the future motion of an object depends only on its present state (and sometimes, not even that in the case of stochastic objects).
Processes are wholly unlike objects, although they may give rise to objects in particular circumstances. Whitehead considered process to be the generator of reality [30]. Unlike the objectivist worldview, which considers the entities of reality to simply exist, Whitehead considered becoming to be logically prior to being. In other words, the elements of reality do not simply exist; instead, they must come into existence through a process of becoming. Prior to becoming, they have no existence. After becoming, they exist briefly, following which they again fade from existence. Reality is a continual succession of becoming, being and fading away. The entities generated by processes are contingent and, thus, susceptible to contextual effects as a fundamental characteristic. The basic elements of reality that are generated by processes are termed actual occasions. These occasions have several characteristics:
Actual occasions are both ontological and epistemological (informational) in character.
Actual occasions are transient in nature. They arise, linger just long enough to pass their information on to the next generation of actual occasions and then fade away.
Process theory posits the existence of a transient now, structured as a compound present: the current generation of actual occasions, the generating process, and the next generation of actual occasions.
Actual occasions are holistic, discrete and finite, possessing a “fuzzy” extensionality.
Actual occasions are not directly observable. Only interactions among processes are discernible.
Observable physical entities are emergent upon actual occasions.
Information propagates causally from prior to nascent actual occasions as a discrete wave.
Information from prior actual occasions is incorporated into nascent actual occasions through the act of prehension.
In the natural world, inanimate matter possesses the closest homology with the concept of an object, and in [5], I suggested that this is the principal reason why the language of mathematics has been so effective in the description of physical systems. It is, after all, a language of objects describing a world of object-like entities. Biological organisms, psychological phenomena, economies, cultures and languages all possess a much greater homology with the concept of process than they do with that of an object. In [5], I suggested that many of the properties of quantum systems bear greater homology to the concept of process than to that of object. Entangled systems are a case in point.
Entangled systems are difficult to conceptualize within an objectivist worldview, requiring leaps of faith or mental gymnastics, such as postulating the existence of undetectable, instantaneous, non-signaling influences, which, nevertheless, somehow send signals between particles, just not between observers. Measurement on an entangled system is difficult enough to understand if the observers of the two systems make their observations at staggered times, but what if their measurements occur simultaneously? In this case, the wave function of the preceding section is of little help. The wave function of the entangled pair is
$$|\Psi\rangle = \frac{1}{\sqrt{2}} \left( |0\rangle_1 |1\rangle_2 + |1\rangle_1 |0\rangle_2 \right).$$
If the measurement is simultaneous, what determines which of the two possible states the system is in? Is it random chance? Could the simultaneity not be exact, so that the winning observer causes a signal to pass to the system of the loser, indicating its winning choice? However, we assume that the measurements are indeed simultaneous. Is it possible for Observer 1 to find its system in state $|0\rangle$, while Observer 2 could also find its system in state $|0\rangle$? However, that would result in a loss of the Bell correlations and satisfaction of the CHSH inequality, since the particles would be essentially disentangled. So, this is not possible. There is a fourth possibility, however. That is, could it be that the state of the entangled system "twinkles" between $|0\rangle_1 |1\rangle_2$ and $|1\rangle_1 |0\rangle_2$? That would ensure that at any moment of measurement, the system would become locked into whichever of the two states it temporarily finds itself in, and this would persist as a result either of "wave function collapse" or of the quantum Zeno effect. Nevertheless, undisturbed, the system would not be fixed in either state, so measurements would follow the statistics of the full wave function. This latter possibility is not possible within an objectivist worldview, which holds that properties of objects are intrinsic to the objects and, thus, persistent until forcibly altered. An objectivist interpretation requires the objects to possess all of their properties at all times, something we know cannot be true through variants of the Kochen–Specker theorem [31]. We can throw the baby out with the bathwater and pretend that the systems are somehow unrealized until one of the observers makes a measurement, but that still does not really help if the measurements are simultaneous. Another possibility is to treat the systems as "potentia", in accordance with the ideas of Shimony, Omnès, Griffiths, Gell-Mann and others within a consistent histories framework. This framework has much in common with Whitehead's process theory, as described by Epperson [32].
A simpler answer is to wholly adopt a processist worldview. In such a worldview, entangled systems are systems that are generated by a single process. They are not two systems that are entangled with one another; they are a single system that happens to be capable of manifesting two different measured values simultaneously. The two apparent systems have a common cause in the single process that generates them [3,15], and that suffices for them to be entangled. There is no magic, no spooky action at a distance, but there is process. Furthermore, process is not a spooky concept: we are surrounded by, and embedded within, processes, and in fact, each one of us is a process. Processes, being generators of events, are subject to the conditions in the moment of their generating, and so, contextuality is not a peculiarity of the quantum realm; it is a fundamental characteristic of the realm of process. In my thesis from the University of Waterloo, I presented several different process models of non-relativistic quantum mechanics (NRQM) which reproduce standard wave functions to a high degree of accuracy, showing that a process approach is wholly compatible with quantum mechanics.
The contingent nature of process suggests the need for a change in the logical system used to reason about such systems. Multi-valued, modal, fuzzy and intuitionistic logics seem particularly promising in this regard, and Gisin has already carried out some work in this area [33,34,35]. Notions of reality, such as counterfactual definiteness, all harken back to objectivist sensibilities, namely, the idea that something can be real only if it manifests all of its properties all of the time. This is similar to the notion of realism proposed in the 1935 EPR paper [17]: "If, without in any way disturbing a system, we can predict with certainty (i.e., with probability equal to unity) the value of a physical quantity, then there exists an element of physical reality corresponding to this physical quantity". In [3], I suggested a weaker form of realism, namely, that an entity could be said to be real if it can be shown to make a difference. An entity can, thus, generate a property, or dispose a system towards a set of possible values of a property, without necessarily fixing the value of the property for all time, and still be considered real. This allows processes to be real, not merely objects.
Let us re-examine the CHSH inequality. In the EPR setting, it states that
$$\left| \langle ab \rangle + \langle ab' \rangle + \langle a'b \rangle - \langle a'b' \rangle \right| \le 2.$$
In the setting of arbitrary random variables, it takes the form
$$\left| \langle AB \rangle + \langle AB' \rangle + \langle A'B \rangle - \langle A'B' \rangle \right| \le 2.$$
Note that time plays no role in the above expressions. In the parlance of logic, these are propositional statements, which hold eternally, absolutely. This is a problem in both quantum and classical mechanics, because observations of the values of random variables must take place over time, and it may be impossible to evaluate certain sets of random variables simultaneously. Non-commutativity of observables is an issue in classical just as in quantum mechanics. It has little to do with the mathematics of Hilbert spaces; it has to do with the complexity of entities (for example, try dissecting a cat first and exercising it later!). In the von Neumann formalism for measurement, time is not a necessary feature. One has an operator, $O$, that is applied to a wave function, $\psi = \sum_i c_i \psi_i$, to yield a wave function of the form $O\psi = \sum_i c_i \lambda_i \psi_i$, where the $\lambda_i$ are eigenvalues of $O$ and the $\psi_i$ are eigenvectors of $O$ (i.e., $O\psi_i = \lambda_i \psi_i$). There is no explicit reference to time in this formalism. Nor is there in non-commutativity relations, such as $[x, p] = i\hbar$. These are propositional relationships, which hold for all time. Again, these are hallmarks of an objectivist worldview. However, if these operators are considered to be applied at specific times, then the situation becomes much more complicated. Operators, and indeed measurements or actions more generally, which can occur simultaneously are called commensurable. Incommensurable actions are quite common in the natural world. A set of commensurable random variables cannot be contextual, because it would then be possible for at least one of the random variables to have two distinct forms, two distinct identities, simultaneously, which is impossible. Likewise, a set of commensurable operators must be commutative; otherwise, there would be at least one pair, $A, B$, for which one could measure both $AB$ and $BA$ simultaneously, yet they would not be equal. A setting of commensurable, commuting operators or commensurable random variables would therefore not be contextual, since the variables would clearly possess a joint distribution on account of their being measured simultaneously; it is incommensurability that opens the door to violations of the CHSH inequality and its variants.
Observations, let alone measurements, may not be possible together or in certain orders, and thus, it is not at all clear in the above formulas that these random variables refer to the same entities. They may refer to the same properties, but the nature of a random variable is determined by the probability measure associated with it, not merely by the set of its possible observed values. Declaring a formula such as those above requires making the assertion, or the assumption, that these random variables are the same random variables in each instance in the formula and, therefore, that a joint probability distribution for the whole collection exists. Time and order, thus, serve as fundamental markers of context in any real situation, and it must be shown, in advance, that these make no difference in the determination of the random variables and, thus, that context may be ignored. Otherwise, time and order must be explicitly noted.
If time and order must be noted, then they become part of the context, which must be associated with the random variable. Since some quantum mechanical and some classical systems violate the inequality, it follows that context must be taken into account when evaluating the inequality. Thus, we cannot use the above form of inequality, which is context-free, and instead, we must, at the very least, use the form
$$\left| \langle R_A^{c_1} R_B^{c_1} \rangle + \langle R_A^{c_2} R_{B'}^{c_2} \rangle + \langle R_{A'}^{c_3} R_B^{c_3} \rangle - \langle R_{A'}^{c_4} R_{B'}^{c_4} \rangle \right| \le 2.$$
These different contexts cannot, in general, be applied simultaneously, so that time must be an implicit component of these contexts. Making the time component explicit, as in
$$\left| \langle R_A^{c_1(t_1)} R_B^{c_1(t_1)} \rangle + \langle R_A^{c_2(t_2)} R_{B'}^{c_2(t_2)} \rangle + \langle R_{A'}^{c_3(t_3)} R_B^{c_3(t_3)} \rangle - \langle R_{A'}^{c_4(t_4)} R_{B'}^{c_4(t_4)} \rangle \right| \le 2,$$
shows that if the CHSH inequality is violated, so that the system must manifest different random variables, then this implies different distributions of values, or simply different values, at different times. This is not compatible with an objectivist worldview, in which properties are given a propositional quality and, thus, are expected to be enduring in the absence of interaction. It is compatible, however, with a processist worldview, in which a process merely disposes a system to express particular values for a property, for a moment, but not necessarily the same values for all moments. The actual value expressed at any given moment is, thus, a product of the process and the conditions that pertain at the time of determination; while this might be an unusual phenomenon in the world of inanimate entities, it is commonplace in the world of biological and psychological entities.
This shows that in situations in which the CHSH or other such inequalities are violated, entities must behave in a temporally transient manner, with properties exhibiting different values or distributions at different times. Thus, properties cannot be enduring; they must instead be fleeting or, at least, transient. Thus, at least some of the entities that form what we call reality cannot be enduring, i.e., they do not “shine”; instead, they must vary in some manner from duration to duration, i.e., they “twinkle”.
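A toy simulation of such "twinkling" (my own construction, not a model from the text) makes the point: if the underlying state is regenerated at each measurement episode, each context-indexed expectation can be extremal while every single observable remains uniformly distributed.

```python
# Illustrative sketch: a "twinkling" system regenerates its state at each
# measurement episode. Within an episode the measured pair is perfectly
# aligned (or anti-aligned for the primed-primed context), yet each single
# observable, pooled over episodes, is uniform on {-1, +1}.
import numpy as np

rng = np.random.default_rng(1)

def episode(anti: bool = False, n: int = 100_000) -> float:
    """Empirical correlation for one measurement context."""
    s = rng.choice((-1, 1), n)        # freshly generated state per trial
    return float(np.mean(s * (-s if anti else s)))

chsh = episode() + episode() + episode() - episode(anti=True)
print(chsh)                                           # 4.0
print(float(np.mean(rng.choice((-1, 1), 100_000))))   # ~0: uniform marginal
```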
4. Classical Consistent Connectedness
In applying the CHSH inequality to the EPR setting, much is often made of the fact that the random variables as observed by the two observers take the same form, regardless of whether they are observing two free particles or an entangled pair. These observables are consistently connected, in the language of Dzhafarov, which means that in applying their generalized inequality to the EPR situation, the third term correcting for inconsistent connectedness vanishes, and one is left with the original CHSH inequality. The appearance of contextuality in the setting of consistent connectedness might be viewed as being unique to the quantum mechanical situation since, at first glance, it would seem to suggest, in the classical case, the presence of a joint distribution for the random variables. After all, their form is independent of context, so intuitively one might think that a joint distribution should exist. However, in the EPR case, one sees, as noted in the discussion of a previous section, that the distributions of states in the two cases are decidedly different. This is only made worse when spin measurements are taken into account.
In this section, I wish to present two examples of a classical situation in which the relevant variables are consistently connected and there are no non-local influences, yet the situation nevertheless exhibits true, Type II contextuality and, thus, a strong form of contextuality despite being classical. It is hoped that such examples will dispel some of the mysticism that all too often creeps into discussions of quantum mechanical phenomena and drive home the point that a processist, or generative, view of real entities is compatible with both classical and quantum realms. The mysteries of quantum mechanics are more an expression of an implicit bias due to a limited worldview than of a fundamental break from our usual experienced reality. The toy model to be considered is a two-player co-operative game. It is physically realizable. Each player is given a set of pieces that can be combined with a suitable piece from the other player to form a small structure similar to a miniature Eiffel Tower. These pieces are stacked in various ways to form a larger structure, but for the current purpose, the details of that procedure are not important. The pieces of Player I are semi-transparent and red in color. Each represents one-half of a tower, with equal numbers of left- and right-half pieces. Within each piece is a rotatable disk with a single hole near the circumference. The pieces of Player II are similar, except that they are blue in color, and instead of a hole, there is a small pin at a matching location on the disk. There is also an asymmetric base piece, shaped somewhat like an arrowhead, which is oriented left or right depending on the direction of its point.
Players alternate moves and, during each move, select a particular attribute (the orientation of piece P, the orientation of base B or the orientation of disk D), such that, at the end of play, the two half pieces fit together to form a single tower, the disk is oriented with the hole/pin along a vertical plane such that the pin enters the hole (holding the pieces together) and the combined pieces are placed along the long axis of the base, with a left piece on the left and a right piece on the right.
There are two observers. Observer I examines the contributions of Player I, while Observer II examines those of Player II. The pieces are semitransparent so that it is possible to examine the orientation of the internal disks without disrupting the structure itself. In this way, measurements may be carried out without disturbance, showing that the concept of “disturbance” is not germane to the issue of contextuality. Moreover, observations may be carried out independently of one another, and in any order, so that the non-commutativity of observations is irrelevant. However, there is a subtlety here as regards the relationship between observation and game play, which will be discussed below.
Each player is free to choose which attribute to select during a move. At the end of play, if the attributes do not suitably align, then the pieces are removed since they will not form a stable unit. Each player is presumed to adopt a particular strategy towards play, which helps to determine what move they should make in response to previous moves by both players, so as to form a stable unit at the end of play. Winning strategies are those that guarantee success, and I will only consider winning strategies here. Moves are conditional upon previous moves; while individual strategies are possible, a simpler approach is for each player to adopt a similar set of responses to prior game play. This makes describing these strategies much simpler. The following table spells out the move that either player should make based upon what has previously been chosen (by either player).
The columns refer to which attributes have already been chosen, by either player. The rows show the responses of the current player to those existing attributes. Each player is free to choose which move to make, but since the initial moves are the most important, games will be labeled by the first attribute choice of each player, $(X_1, Y_2)$, where $X_1$ can be one of $P_1, B_1, D_1$ and $Y_2$ can be one of $P_2, B_2, D_2$. We assign values as follows: left piece (1), right piece (−1), left direction of block (1), right direction of block (−1), disk hole up/pin up (1) and disk hole down/pin down (−1).
The initial moves are laid out in Table 1. The columns refer to the initial move of Player I, while the rows refer to the subsequent initial move of Player II. The subsequent moves depend upon those initial moves, as described in Table 2 and Table 3. The rules associated with the choice of block orientation are more complicated and depend upon which player plays the block first. In the tables, B1 means that Player I played the block first, and conversely for B2.
The subsequent play depends upon which attributes have already been chosen and holds similarly for both players; thus, in Table 2, only the attribute type is mentioned, as the player designation is irrelevant. For which attribute is selected, reference is made to any attributes previously selected by either player. The presence of an attribute of the same type takes precedence over the other attributes, marked in bold emphasis. Since one cannot make two base choices, a prior choice of P takes precedence over a prior choice of D for B. The rules involving block play are more complicated, as they depend upon which player played the block. They are presented in Table 3.
As an example, suppose play is $(P_1, B_2)$. Player I plays first and chooses their piece at random, say left. Player II then chooses the opposite orientation of the block, right. This ensures that when Player II chooses their piece, it is aligned with its block. Player I has only $D$ left to play. Since their $P$ is congruent, the disk aligns with $P_1$ and so is up. This forces Player II to pick up for its disk and right for its piece, since Player I has no plays left.
Here, let there be two observers, I and II. Each observer is free to observe the orientation of any component throughout the structure. Observer I examines components of Player I, while Observer II examines components of Player II. Observer I's observables are, thus, $P_1$ and $D_1$, while those of Observer II are $P_2$ and $D_2$.
Using the above observables, we may form a 4-cyclic system as follows:

| | $P_1$ | $P_2$ | $D_1$ | $D_2$ |
|---|---|---|---|---|
| $c^1 = (P_1, P_2)$ | $R_{P_1}^{c^1}$ | $R_{P_2}^{c^1}$ | | |
| $c^2 = (D_1, P_2)$ | | $R_{P_2}^{c^2}$ | $R_{D_1}^{c^2}$ | |
| $c^3 = (D_1, D_2)$ | | | $R_{D_1}^{c^3}$ | $R_{D_2}^{c^3}$ |
| $c^4 = (P_1, D_2)$ | $R_{P_1}^{c^4}$ | | | $R_{D_2}^{c^4}$ |

where each context $c^i$ is fixed by the pair of initial attribute choices.
The context is chosen by analogy with the CHSH situation. In the case of measurements by the two observers in a CHSH experiment, the two observers fix their choices of measurements. This forces an interaction with the entangled pair, formally through projection operators onto the observable vectors. This, in essence, forces a projection of the wave functions and, hence, forces a particular dynamical evolution that is compatible with the observables being measured. Here, the choice of observables involves setting the initial choices of the players to be the observables in question, which ensures that they are manifested in the evolution of the game.
The probabilities for the various possibilities for the observables follow from the strategies described above. The marginal probabilities for each of the random variables may be determined from those probability distributions, and it is straightforward to check that each observable has the same marginal distribution in both of the contexts in which it appears, so that the 4-cyclic system is consistently connected. This means that the Dzhafarov inequality reduces to the usual CHSH inequality. A quick check shows that each of the four correlations is extremal ($\pm 1$), with the signs arranged so that the CHSH formula
$$\langle R_{P_1}^{c^1} R_{P_2}^{c^1} \rangle + \langle R_{P_2}^{c^2} R_{D_1}^{c^2} \rangle + \langle R_{D_1}^{c^3} R_{D_2}^{c^3} \rangle - \langle R_{P_1}^{c^4} R_{D_2}^{c^4} \rangle = 4$$
takes its maximal value and violates the Tsirel'son bound of $2\sqrt{2}$.
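The claimed pattern can be summarized in a few lines; this sketch checks the reported correlations rather than simulating the strategy tables themselves:

```python
# Illustrative sketch: given the game's extremal per-context correlations
# (+1 on the positively signed terms and -1 on the negative term) and
# uniform marginals in every context, the system is consistently connected
# (delta = 0) and the CHSH value is 4 > 2*sqrt(2).
import math

pair_corrs = [1, 1, 1, -1]
marginal_pairs = [(0.0, 0.0)] * 4   # each observable's expectation per context

delta = sum(abs(x - y) for x, y in marginal_pairs)
chsh = pair_corrs[0] + pair_corrs[1] + pair_corrs[2] - pair_corrs[3]
print("consistently connected:", delta == 0)                    # True
print("CHSH:", chsh, "> Tsirel'son:", chsh > 2 * math.sqrt(2))  # 4 > 2.83
```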
5. Discussion
At the present moment, the consensus interpretation of the Bell inequality agrees with that of Bell himself: the violation of the Bell inequality (or here, the CHSH-type inequality) by certain quantum mechanical systems conclusively demonstrates that the existence of a set of local hidden variables underlying the quantum phenomena is impossible. Reality, at its most fundamental level, is then held to possess a form of non-locality that permits the existence of spooky actions at a distance, influences that may pass instantaneously between certain quantum systems, in particular, entangled systems, something that Abner Shimony described as "passion at a distance" [18]. These non-local influences are said not to transmit signals that would violate the special theory of relativity and, yet, are capable of informing a space-like separated particle of the form of observation being experienced by its entangled counterpart. Surely, though, the propagation of information constitutes a signal. It is unreasonable to believe that the bounds of special relativity apply only to the actions of human observers (or physicists). As noted above, a number of researchers [12,13,14,15] have questioned the validity of this interpretation of the violation of the Bell inequality, suggesting that the Bell inequality is not about non-locality but is, instead, about contextuality. The example given in this paper illustrates a classical system, with only local information and consistent connectedness, that admits random variables that violate the CHSH inequality and, in fact, violate the Dzhafarov inequality maximally, thus even exceeding the Tsirel'son bound. This demonstrates that non-locality is not essential for a violation of the CHSH inequality; contextuality is. Non-locality is dramatic; contextuality is commonplace. However, it is contextuality that is fundamental here. The example demonstrates that contextuality and non-locality are not equivalent. This, of course, does not prove that non-local influences do not exist, but it does demonstrate that they are not necessary for the violation of the inequalities.
The fixation that physicists have upon non-locality appears to me to be due to an implicit bias towards interpreting classical situations within an objectivist worldview, in which classical entities are treated as objects that are enduring, with enduring properties (in the absence of interactions). That is, the view is that classical entities may be described in terms of logical propositions. This objectivist worldview might hold, with some validity, for inanimate entities, but as noted above, it does not hold for animate entities, especially organisms, which possess agency. A processist worldview, in which entities are generated and transient, whose properties are conditional, generated and contextual, provides much greater homology with observed characteristics.
The objectivist worldview comes with another feature that has long been appealing, namely, determinism. The assumption that entities are objects, with enduring propositional properties, makes appealing the idea that everything is already fixed in place, as it were. Determinism, in the sense that complete knowledge of the present fixes the future (and, when the dynamics are time-reversible, the past as well), has long been a feature of classical physical theory, going back to the time of Newton. Stochasticity does occur in classical physics, but it is understood to represent a lack of complete knowledge, so that predictions cannot be made with complete accuracy but only with a measure of uncertainty. Quantum mechanics suggested the presence of fundamental non-determinism in reality, although the equations that govern this non-determinism are still deterministic, so determinism persists, just a step removed from the fundamental events. Those who cling to determinism within quantum mechanics are often forced to resort to elaborate mental gymnastics, such as the postulation of an unimaginable infinity of alternative universes, to explain away the fundamental uncertainty in quantum mechanics. A simpler explanation pertains if one shifts to a processist worldview. There, non-determinism, or choice, is simply a fundamental feature of reality and of the entities that manifest within it. The probabilities that we observe are considered to be emergent from the underlying dynamics and interactions among these entities. In a processist world, everything is contingent, and change is fundamental, so the idea that properties, too, can be dispositional and contingent is no longer so strange. Change need not be without order, however. Examples of ordered change abound within the world of organisms and can be readily demonstrated as characteristic of them [4,5,7]. The adoption of a processist worldview brings coherence back to our understanding of reality and, to my mind at least, remains in keeping with the principle of Ockham's razor [3].
Previous criticisms of the Bell inequality often resort to deep explorations of probability and measure theory and are not always easily accessible. The analysis of the Bell inequality presented herein is simple, succinct, cogent and transparent, and, I would hope, so blatant that there should no longer be any question that what the Bell inequality shows is the following:
Theorem 1. If any system possesses a set of random variables whose expectation values lead to a violation of the CHSH inequality (or, more generally, the Dzhafarov inequality), then those random variables must be contextual; in other words, the relevant joint probability distributions required to calculate the expectation values must be local to each expectation value.
Questions concerning how exactly this contextuality occurs for each individual system depend upon the particular dynamics of the system. It does not depend upon whether the system can be understood in classical or quantum mechanical terms. It does not depend upon whether there are, or are not, non-local influences. The examples presented in this paper of the two-player co-operative game show conclusively that a purely local, classical situation can exist, with consistently connected random variables, which nevertheless violates the CHSH inequality maximally and, thus, also violates the Tsirel'son bound. Neither classicality nor locality is decisive in the violation of the inequality. What is necessary is that the probabilities associated with the calculations of the various expectation values be local, that is, contextual. In order for this to occur, it is argued that the properties of the entities involved in such a situation must not be propositional, in other words, not enduring throughout an interaction-free duration, but rather generated, disposed, transient and conditional, determined by an underlying process as dispositions (or propensities, as Gisin suggests [33,34,35]), whose values are not fixed over time but allowed to vary depending upon local conditions. The models presented in [3] (and the references therein) using the process algebra show that at least non-relativistic quantum mechanics can be accurately modeled using processes in which the fundamental entities are generated, rather than merely being. The fact that the marginal probabilities for the two observers in the Bell setting are the same as in the case of free particles does not imply that they are, in fact, free particles exchanging some instantaneous influence. Consistent connectedness implies marginal degeneracy: the marginal probabilities simply do not suffice to determine the joint probabilities that arise from the dynamics of the system. In the Bell situation, the entangled particles may appear as if they are two free particles when viewed from the vantage point of the two observers, but a third observer correlating the results of the two observers, or the experimenter overseeing the process that generates the particles in the first place, understands that the underlying joint state, or generating process, is that of entanglement, not freeness.
One cannot truly understand a system without knowing and understanding its dynamics. Indeed, I argued in [7] that what distinguishes classical from quantum systems has nothing to do with size, scale or the number of subcomponents, but rather with the structure of the interacting processes involved in their generation. Macroscopic systems can exhibit quantum behavior; microscopic systems can exhibit classical behavior; it all depends upon the structure of their dynamics. The two-player game presented here shows a classical-level system that, nevertheless, exhibits quantum-like features. In this case, these arise because the local properties manifested by the game depend upon the initial plays and, more generally, upon the entire history of play, because they are generated moment to moment over a duration [5].
Thus, the deep understanding to be derived from the violation of the Bell inequality (and its variants) is that reality cannot, in general, take an enduring, propositional form. In simpler terms, reality cannot “shine”. Instead, the entities of reality that comprise any system capable of violating these inequalities must manifest properties that vary over time and over situations, even in the absence of any disturbing interactions. They must be contextual. Again, in simpler terms, reality must fluctuate over time and context; in other words, reality “twinkles”.