At the heart
of the history of quantum entanglement lies a famous debate between
two groups of physicists, a clever paradox and an iconoclastic way out of
it.
God does
not play dice with the universe. He plays an ineffable game of his own
devising, which might be compared, from the perspective of any of the other
players, to being involved in an obscure and complex version of poker in a
pitch dark room, with blank cards, for infinite stakes, with a dealer who won’t
tell you the rules, and who smiles all the time.
– Terry Pratchett
In June 2017, a group of scientists in China announced that they
had used the country’s Micius satellite, launched a year earlier, to teleport
information from Earth to space in an instant. In other words, they had
moved it across more than 500 km in literally no time. To achieve this,
they had relied on a natural phenomenon called quantum entanglement. The name
itself correctly suggests that it belongs in the realm of quantum mechanics,
the realm of subatomic particles. The Chinese scientists’ experiment bested a
previous record set in 2012, when the same team’s leader had led a group that
teleported information across 97 km.
Very few ideas in science enjoy the popularity that
teleportation does: it has been equally awe-inspiring among scientists and
laypeople. To the more inspired, what is fascinating is not how
an object “leaves” one point in space and “arrives” at another but that it
traverses the intervening distance in an instant. The implications of such
travel are significant even at first sight. The day when we will be able to
“beam” a person up and down across space – à la Star Trek – might still be very far
away, but in the meantime we could use quantum entanglement to, for example,
teleport digital security keys between two computers and prevent most forms of
eavesdropping by hackers.
In the earlier experiment, Jian-Wei Pan, a physics professor at
the University of Science and Technology of China, Hefei, and his colleagues
used quantum entanglement to teleport information across Qinghai Lake in
the country’s west. Using an ultraviolet laser pointed at a barium borate crystal,
Pan’s team generated pairs of entangled photons. Each photon of a pair was
sent through a telescope to one of two parties on either side of the lake.
The nature of quantum entanglement
Let’s call the parties A and B. Making a measurement on the photons yields a
description of the state the photons are in, in terms of the values of a few
fixed variables. If the variables have a particular combination of values, the
system is said to be in a particular state. States are usually independent of
extrinsic properties like mass. So, A’s and B’s goal was to see whether photons
made to interact with these entangled photons ended up in states that matched
what was observed at the other site, even when the two sites were separated by
97 km of free space.
To measure this, the researchers at A let photons generated
locally – i.e. at A itself – interact with the incoming modified photons in
a fixed, predictable way. The changed state was then measured and compared with
the state of the photons at B. Pan & co. found that the states of the
modified photons at A and those of the unmodified photons at B were the same
80% of the time.
What is wonderful is that the particles didn’t have to end up with the
same state. Eighty per cent is a value large enough to rule out any
coincidence. This long-distance “communication” between minuscule, fragile
particles is proof that their pre-travel entanglement was durable and
resulted in a predictability of state that let the particles behave similarly
in two very different measurement experiments.
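To make the bookkeeping behind such a protocol concrete, here is a minimal
numerical sketch of textbook single-qubit teleportation. It is an idealised,
lossless stand-in for what Pan’s team did with photons – not a model of their
actual optics – and every name and number in it is illustrative.

```python
import numpy as np

# Single-qubit basis states and Pauli operators
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron(*ops):
    out = np.array([1.0 + 0j])
    for op in ops:
        out = np.kron(out, op)
    return out

rng = np.random.default_rng(7)

# A random, unknown state |psi> = a|0> + b|1> held by A (qubit 0)
a, b = rng.normal(size=2) + 1j * rng.normal(size=2)
a, b = np.array([a, b]) / np.sqrt(abs(a) ** 2 + abs(b) ** 2)
psi = a * ket0 + b * ket1

# A and B share the entangled pair |Phi+> = (|00> + |11>)/sqrt(2) (qubits 1 and 2)
phi_plus = (kron(ket0, ket0) + kron(ket1, ket1)) / np.sqrt(2)
state = kron(psi, phi_plus)  # full three-qubit state

# The four Bell states, used for A's joint measurement on qubits 0 and 1,
# and the Pauli correction B must apply for each possible outcome
bell = {
    "Phi+": (kron(ket0, ket0) + kron(ket1, ket1)) / np.sqrt(2),
    "Phi-": (kron(ket0, ket0) - kron(ket1, ket1)) / np.sqrt(2),
    "Psi+": (kron(ket0, ket1) + kron(ket1, ket0)) / np.sqrt(2),
    "Psi-": (kron(ket0, ket1) - kron(ket1, ket0)) / np.sqrt(2),
}
correction = {"Phi+": I2, "Phi-": Z, "Psi+": X, "Psi-": X @ Z}

# A's Bell measurement: project the full state onto each Bell state
names = list(bell)
probs, posts = [], []
for name in names:
    proj = np.kron(np.outer(bell[name], bell[name].conj()), I2)
    post = proj @ state
    probs.append(np.vdot(post, post).real)
    posts.append(post)

# Sample one outcome according to the Born rule
k = rng.choice(len(names), p=probs)
outcome = names[k]
post = posts[k] / np.sqrt(probs[k])

# B's qubit: trace out A's two qubits, then apply the classically communicated correction
rho = np.outer(post, post.conj()).reshape(2, 2, 2, 2, 2, 2)
rho_B = np.einsum("ijkijm->km", rho)
rho_B = correction[outcome] @ rho_B @ correction[outcome].conj().T

# The overlap with the original state should come out as 1 (up to rounding)
fidelity = (psi.conj() @ rho_B @ psi).real
print(f"A's outcome: {outcome}; fidelity of B's qubit with |psi>: {fidelity:.6f}")
```

In this noiseless textbook case the fidelity comes out as exactly 1; the
roughly 80% figure reported in the experiment reflects the losses and
imperfections of a real optical link over open water.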
It is the precise nature of this entanglement, which Albert
Einstein called “spooky action at a distance”, that baffles most scientists.
When two groups of photons are said to be quantum-entangled, it means that the
states that the groups are in are related to each other by means of a variable.
If the variable changes, then the properties of the photons change, too.
However, how the groups themselves are related to each other does not
change.
The existence of this variable is not as much disputed as it is
hoped into existence. We haven’t found it yet – assuming it
exists. And because it remains outside the realm of human control,
experiments with teleportation tend to leave this variable alone and instead
focus on how far apart the measurement sites can be, how efficiently
large molecules can be entangled, etc. That is, they stick to testing
its limits.
To do this, the photons are subjected to a simplified treatment,
one conceived with the fewest assumptions as well as the
fewest sources of error. Instead of groups of photons, physicists address
them two at a time. The states that each half of this pair can exist in are
defined thus. Let’s say the two particles are ‘a’ and ‘b’
and the states are ‘0’ and ‘1’. The four possible combinations of states then
are:
{0, 0}
{0, 1}
{1, 0}
{1, 1}
Entanglement is said to have occurred when b is in a particular
state whenever a is in a particular state. That is, if b is
1 every time a is 0, then a and b could be entangled.
Since this relationship is symmetric, a will
be 0 every time b is
1 as well. Further, the change occurs instantaneously irrespective of the
distance between the two particles, giving the impression that they’re
“communicating” at a speed faster than that of light. The presence of such an
order, together with the four possible outcomes, makes each outcome a
particular state of
the system. These states are called Bell states, named for the
Northern Irish physicist John Stewart Bell.
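For the record – and as an aside on the standard formalism rather than anything
specific to Pan’s setup – the four Bell states are written not as the four
combinations listed above themselves but as equal-weight superpositions of them:

$$
|\Phi^{\pm}\rangle = \tfrac{1}{\sqrt{2}}\left(|00\rangle \pm |11\rangle\right),
\qquad
|\Psi^{\pm}\rangle = \tfrac{1}{\sqrt{2}}\left(|01\rangle \pm |10\rangle\right)
$$

A pair prepared in, say, |Ψ+⟩ will always be found with one particle in 0 and
the other in 1 – exactly the kind of correlation described above.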
To find out what the current Bell state of a particle is, a Bell
measurement is made. However, Heisenberg’s uncertainty principle
messes this up: the principle dictates that the act of making the measurement
will change the state of the system. This is how, for example, the
principle prohibits us from knowing an electron’s position and momentum at the
same time. However, this alteration does not matter as long as the
pre-measurement state is observed and recorded. In Pan’s experiment, with six
initial possible states, the Bell measurement was made not by a direct
observation per se but by observing how the local and incoming photons
interacted.
Earlier, another experiment had been conducted that demonstrated
the teleportation of quantum information across 16 km. The principal
shortcoming of that experiment was that the photons to be teleported had been
specially generated within the lab under careful conditions. In practice, this
is a highly idealised requirement that would make the technique difficult to use as
‘everyday technology’. Pan and his colleagues had eliminated this
necessity in their 97-km experiment by generating local photons with random
states.
Between the classical and the quantum
The history of quantum entanglement is as entertaining as
teleportation itself is. At its heart lies a furious debate between two
groups of physicists, a clever paradox and an iconoclastic way out of it.
To ease into it, consider an experiment. Imagine two
devices separated by a large distance. These are devices that receive inputs
and spit out results. There are two kinds of inputs: classical inputs,
which are governed by classical physics, and quantum-mechanical inputs, defined
by the rules of quantum mechanics. An input is generated by a common
source and is delivered to the devices in an instant.
Now, a pair of inputs is generated at the source such that
each input may instruct the device to yield a result ‘x’ or ‘y’. The device called A reads
the instructions and yields a result, A*. The device called B reads the
instructions and yields a result, B*. If A* and B* are in the same
state, then they may be said to be entangled. To have achieved this, A and
B – the devices that yielded them – must have communicated in some way to, if
nothing else, come to an ‘agreement’. Alternatively, they could have been in
possession of some information since before the
observation phase.
If it so happened that A and B communicated instantaneously –
i.e., exchanged information faster than the speed of light – then they may
be said to be entangled. Let’s remember that, in a quantum mechanical context,
the results are found to be identical only after they
are observed. Thus, ‘the act of observing the result’ also
participates in the measurement process.
This is because Heisenberg’s uncertainty principle kicks in when
the particles are observed. When we make the measurement, we are changing the
value of some state variable of the particle, so it is the final state that we end
up observing. Bell was the first to make this observation and added that the
act of observation was somehow tied in with quantum entanglement. In fact, he
concluded that the results were entangled in some way because of the act
of observing.
Now, the act of observing is a classical phenomenon because
the devices A and B that enable the measurement are classical devices. That
said, Bell argued that this is where the line between classical mechanics and
quantum mechanics blurred. He wrote in 1971:
Theoretical
physicists live in a classical world, looking out into a quantum-mechanical
world. The latter we describe only subjectively, in terms of procedures and
results in our classical domain. … Now nobody knows just where the boundary
between the classical and the quantum domain is situated. … More plausible to me
is that we will find that there is no boundary. The wave functions would prove
to be a provisional or incomplete description of the quantum-mechanical part.
It is this possibility, of a homogeneous account of the world, which is for me
the chief motivation of the study of the so-called “hidden variable”
possibility.
We often call quantum mechanics ‘quirky’ because it
allows things like entanglement to occur. However, the people who first
noticed that this was possible were also hoping to use it to make the
point that quantum mechanics could not be a true theory of nature. They were
Einstein, Boris Podolsky and Nathan Rosen, commonly referred to as EPR.
The principal target of their ire was the wave function, a
mathematical function that adherents of quantum mechanics thought could
describe the properties of a quantum mechanical entity, like a particle. For
example, by ‘solving’ a wave function, physicists could elicit some of a
particle’s states. While a wave function could ‘encode’ a particle, the particle
itself could not influence its own wave function. Physicists also believed that
each wave function depended on the whole configuration of the universe.
According to EPR, these properties, among others, meant that any interpretation
of quantum mechanics that included the wave function would allow Heisenberg’s
uncertainty principle to be violated.
The EPR paradox
In 1935, the trio published a paper describing a paradox – a
phenomenon – that has since been called quantum entanglement. EPR tried to
refute quantum mechanics by exposing the flaws of quantum entanglement
(objects that are entangled share the same wave function). In their paper, they
argued that, since entanglement occurred only between conjugate entities – particles
that are somehow, but surely, paired – the measurement of one of the
A*-state variables should have rendered the corresponding state variable
in B* indeterminate (because of the uncertainty principle). However,
entanglement has already been
observed. This means that either the two particles should have communicated or
that they should have had the information necessary to generate the same
outcome.
EPR preferred the latter explanation, asserting that some
“local hidden variable” was responsible for controlling the outcome of the ‘act
of observing’. They had made two assumptions to come to this
conclusion: locality and realism. The principle
of locality states that an object is affected directly only by its immediate
surroundings, not by an event that is occurring a large distance away and at
the same time. Realism is the ability to assume the existence of objects and
parameters even when they have not been observed. Together, they made for a
classical way to explain a quantum mechanical effect, removing one of
the features that made quantum mechanics weird and making it more palatable
to Einstein. After all, it was he who had asserted that “God does not play dice with
the universe” in response to quantum mechanics’ whimsy. (For example, we can’t know
the state of a particle before observing it, so it could be in any state,
including in both states at once.)
In 1964, Bell proposed a now-famous theorem that
refuted the EPR paradox’s preferred explanation. He showed that no local realist
theory can reproduce all the predictions of quantum mechanics. Essentially, this
means that since a great number of experiments agree with the predictions of
quantum mechanics, and since many of the results are too strong to be explained
by local hidden variables alone, either locality or realism is in conflict with
quantum mechanics. Bell himself leaned towards the view that it was locality
that had to give – that is, that something like faster-than-light influence was at work.
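To see what ‘too strong to be explained by local hidden variables’ means in
practice, here is a small illustrative sketch (not drawn from any particular
experiment) that computes the CHSH combination of correlations predicted by
quantum mechanics for an entangled photon pair. Any local hidden-variable model
keeps this quantity at or below 2; the entangled state reaches 2√2 ≈ 2.83.

```python
import numpy as np

# Pauli operators and the entangled state |Phi+> = (|00> + |11>)/sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def observable(theta):
    """A +/-1-valued measurement along angle theta in the X-Z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

def correlation(alpha, beta):
    """Expected product of outcomes when A measures along alpha and B along beta."""
    joint = np.kron(observable(alpha), observable(beta))
    return np.real(phi_plus.conj() @ joint @ phi_plus)

# The standard CHSH choice of measurement angles
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, -np.pi / 4

S = (correlation(a0, b0) + correlation(a0, b1)
     + correlation(a1, b0) - correlation(a1, b1))
print(f"CHSH value S = {S:.4f} (any local hidden-variable model gives |S| <= 2)")
```

Running it prints S ≈ 2.8284, and it is precisely this kind of excess over the
classical bound of 2 that Bell-test experiments look for.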
Bell’s thinking was informed by the de Broglie-Bohm theory
(initially rejected because of Bohm’s support for communism),
which interpreted quantum mechanical effects as being caused by the wave
function. This, we now understand, immediately requires that the principle of
locality be violated (because a wave function is influenced by the entire
universe). We also see that teleportation (of quantum information) is an
instance of non-locality because it implies instantaneous communication. If two
particles can communicate faster than the speed of light to replicate
quantum mechanical effects, then perhaps complex objects can someday be
replicated instantaneously across large distances by simultaneously reproducing
the quantum states of the particles associated with the object.
Of course, such a possibility is hinged on Bell’s theorem being
true and on the EPR paradox’s implied existence of locality being false. To
date, numerous experiments have been conducted to test Bell’s theorem; their
results favour quantum mechanics, though for a long time experimental loopholes
kept them from being considered fully conclusive. Reactions to the theorem itself have
ranged from apathetic to celebratory, with one physicist stating, “Anybody who’s not bothered by Bell’s
theorem has to have rocks in his head.” The difficulty lies in what it implied
for the real world: it made quantum mechanics and local realism mutually exclusive. Either
quantum mechanics was falling short of explaining some physical parameters
or superluminal information transfer was happening. (Bell told the BBC in 1985 that
if the latter is to be disallowed, then we should entertain the more disconcerting
notion that there is no such thing as free will in the universe.)
If looking behind the curtain kills some of the fantasy,
that is not the case with teleportation at least. Entanglement continues to
elude understanding, and simplifying something so enigmatic to problems
in linear algebra – as we have seen – is simply not enough to make sense
of whatever is allowing it. With their paper in 2012, Pan and his team
were sitting pretty at the forefront of quantum mechanical teleportation –
as they are today in 2017. Even if we still have a long way to go, the knowledge
from Pan’s experiments has given us the best shot at ultimately achieving
the teleportation of more sophisticated information systems. And as Bell
and EPR have helped elucidate, what Pan’s team has achieved may be awesome,
but it brings with it an implication that many of us continue to find difficult to
accept.