Physicists detect an Aharonov-Bohm effect for gravity

The idea that particles can feel the influence of potentials even without being exposed to a force field may seem counterintuitive, but it has long been accepted in physics thanks to experimental demonstrations involving electromagnetic interactions. Now physicists in the US have shown that this so-called Aharonov-Bohm effect also holds true for a much weaker force: gravity. The physicists based their conclusion on the behavior of freefalling atomic wave packets, and they say the result suggests a new way of measuring Newton's gravitational constant with far greater precision than was previously possible.

Image caption: A quantum probe for gravity: Physicists have detected a tiny phase shift in atomic wave packets due to gravity-induced relativistic time dilation – an example of the Aharonov-Bohm effect in action. (Courtesy: Shutterstock/Evgenia Fux)

Yakir Aharonov and David Bohm proposed the effect that now bears their name in 1959, arguing that while classical potentials have no physical reality apart from the fields they represent, the same is not true in the quantum world. To make their case, the pair proposed a thought experiment in which an electron beam in a superposition of two wave packets is exposed to a time-varying electric potential (but no field) while passing through a pair of metal tubes. They argued that the potential would introduce a phase difference between the wave packets and therefore lead to a measurable physical effect – a set of interference fringes – when the wave packets are recombined.

Seeking a gravitational counterpart

In the latest research, Mark Kasevich and colleagues at Stanford University show that the same effect also holds true for gravity. The platform for their experiment is an atom interferometer, which uses a series of laser pulses to split, guide and recombine atomic wave packets. The interference from these wave packets then reveals any change in the relative phase experienced along the two arms.

The Stanford team prepared a cloud of ultracold rubidium-87 atoms and used a pattern of overlapping laser beams (known as an optical lattice) to launch it up a 10 m-long vertical vacuum tube. A laser beam splitter then separated each atom's wave packet into an upper and a lower trajectory, with the former passing close to a semicircular ring of exceptionally pure (and therefore non-magnetic) tungsten weighing 1.25 kg and placed at the top of the tube.

The idea was to detect a tiny phase shift due to time dilation – the fact that two clocks at different heights in a gravitational potential will tick at slightly different rates. This phase shift is only measurable if the separation between wave packets is significantly larger than the distance between the closest interferometer arm and the tungsten source mass. As such, the researchers used beam splitters that transferred large amounts of momentum to the wave packets so as to space them as far apart as possible – up to 25 cm, compared with the 7.5 cm gap between the tungsten and the wave packets that passed closest to it.
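As a rough back-of-the-envelope check (not a calculation from the paper), the phase accumulated between the two arms can be estimated as φ = mΔΦT/ħ, where ΔΦ is the difference in gravitational potential of the source mass at the two arms and T is the time spent near it. A minimal sketch, using the distances quoted above and an assumed interaction time of 1 s (an illustrative guess, not a figure from the experiment):

```python
# Order-of-magnitude estimate of the gravitational Aharonov-Bohm phase.
# Geometry is taken loosely from the article; T = 1 s is an assumption.
G = 6.674e-11            # Newton's gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34         # reduced Planck constant, J s
m_rb87 = 87 * 1.66e-27   # mass of a rubidium-87 atom, kg
M = 1.25                 # tungsten source mass, kg

r_near = 0.075           # closest arm to the tungsten, m (from article)
r_far = r_near + 0.25    # farther arm, given 25 cm wave-packet separation, m
T = 1.0                  # assumed time spent near the mass, s

# Difference in gravitational potential between the two arms (J/kg)
delta_potential = G * M * (1 / r_near - 1 / r_far)

# Propagation (Aharonov-Bohm-type) phase: phi = m * delta_potential * T / hbar
phase_rad = m_rb87 * delta_potential * T / hbar
print(f"phase ~ {phase_rad:.2f} rad")
```

Under these assumptions the phase comes out at roughly a radian – large enough to be resolved by a state-of-the-art atom interferometer, which is consistent with the experiment's reported sensitivity.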

Observing this effect, however, also required the physicists to account for the phase shift due to the gravitational tug of the source mass (the force field). They did this by also firing atomic clouds along interferometers with much more closely spaced arms, such that the wave packet separation in this case – 2 cm – was generally small compared with the distance to the tungsten mass, and therefore insensitive to the time dilation.

An extra effect

The researchers ran the experiment repeatedly, each time varying the minimum distance between the upper arm of the interferometer and the source mass. Plotting the variation of phase difference between the two arms with arm-mass distance, they found that the resulting curve for the interferometers with closely-spaced arms matched expectations for shifts due solely to deflections of the wave packets by the gravitational field. But that wasn’t so for the interferometer with widely-spaced arms. In this case, something other than the field itself had introduced phase shifts.

Kasevich and colleagues interpret this “something else” as relativistic time dilation, and therefore evidence of Aharonov-Bohm phase shifts. “These results show that gravity creates Aharonov-Bohm phase shifts analogous to those produced by electromagnetic interactions,” they write.

The researchers note that the phase shifts they observed are proportional to the mass of the atoms, in accordance with theoretical predictions. What's more, these phase shifts depend on both Planck's constant and Newton's gravitational constant, G. As such, the researchers suggest that by precisely characterizing the source mass, this type of interferometer could be used to improve the measurement of G – the value of which is known far less accurately than that of any other fundamental constant.
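Because the phase scales as mG/ħ multiplied by a factor set by the source mass and geometry, the relation can in principle be inverted: measure the phase, characterize the mass and geometry precisely, and solve for G. A hedged sketch of that inversion, in which every number (including the "measured" phase) is an illustrative assumption rather than a value from the paper:

```python
# Illustrative inversion: extracting G from an Aharonov-Bohm phase measurement.
# All inputs below are assumed values for illustration only.
hbar = 1.055e-34                  # reduced Planck constant, J s
m_atom = 87 * 1.66e-27            # Rb-87 atomic mass, kg
M = 1.25                          # tungsten source mass, kg
geometry = 1 / 0.075 - 1 / 0.325  # (1/r_near - 1/r_far), m^-1, from quoted distances
T = 1.0                           # assumed interaction time, s

phase_measured = 1.17             # hypothetical measured phase, rad

# phase = (m_atom * G * M * geometry * T) / hbar  =>  solve for G
G_inferred = phase_measured * hbar / (m_atom * M * geometry * T)
print(f"G ~ {G_inferred:.3e} m^3 kg^-1 s^-2")
```

The point of the exercise is that G enters the phase linearly, so the fractional uncertainty in G is set directly by how well the phase, atomic mass, source mass and geometry are known – which is why precise characterization of the tungsten ring matters.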

In a piece written to accompany a paper in Science on the research, Albert Roura of the German Aerospace Centre in Ulm cautions that such experiments will have to overcome the unwanted effect of gradients in the gravitational field, as they make the phase shifts very sensitive to the atomic wave packets’ initial position and velocity. But he reckons that this problem can be overcome thanks to a technique he developed that gets around the fundamental limitations on position and momentum accuracy imposed by Heisenberg’s uncertainty principle. “The prospects for improved measurements of Newton’s gravitational constant based on atom interferometry are therefore very promising,” he concludes.

Guglielmo Maria Tino of the University of Florence in Italy is also upbeat about future research. He says that the latest results show “the potential of atomic quantum sensors as new tools to help us understand gravity and its relation to quantum physics”.

References: Science
