It's only appropriate that on the hundredth anniversary of Einstein's Miracle Year, NIST and MIT researchers have confirmed the great physicist's most famous formula more precisely than ever before.
In this blog, I usually talk about technical standards created by engineers. But, as I’ve often pointed out in my other column at this site, Consider This…, there are many other types of standards as well, some of which are not developed (as such) by anyone at all, but rather are derived from nature, as in the case of the laws of physics, or are pulled from the realm of pure abstraction, as with mathematics. Today, I want to talk not only about one of these types of non-developed standards, but about the most famous one of all: e = mc2. (Sorry, this blog doesn’t support superscripts.)
Unlike the standards that are developed by engineers to achieve predictable results, mathematical formulae and laws of physics are divined, appropriately, by mathematicians and physicists to help us understand the rules by which the universe itself, for whatever reason (no, I won’t go there), has decided to operate.
It’s only appropriate that in this, the hundredth anniversary of Einstein’s Miracle Year, scientists at the National Institute of Standards and Technology (NIST) and MIT have conducted the “most precise test ever” to confirm that E does, indeed, equal mc2. According to a NIST press release issued yesterday, the December issue of Nature includes a report that NIST and MIT scientists have confirmed that “E differs from mc2 by at most 0.0000004, or four-tenths of 1 part in 1 million.” That result is “consistent with equality,” the release notes, and is 55 times more accurate than the previous best direct test of Einstein’s formula.
The press release goes on to disclose that others have conducted more complicated tests that imply closer agreement between E and mc2 than the NIST/MIT work; however, “additional assumptions” are required to interpret their results, making those previous tests “arguably less direct.” It then reports that the two independent tests conducted by NIST and MIT each relied on:
…optical and X-ray interferometry—the study of interference patterns created by electromagnetic waves—to precisely determine the spacing of atoms in a silicon crystal, and [on] using such calibrated crystals to measure and establish more accurate standards for the very short wavelengths characteristic of highly energetic X-ray and gamma ray radiation.
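To unpack that a bit (my gloss, not the press release's): once the spacing between atomic planes in a silicon crystal is known, Bragg's law converts a measured diffraction angle into a wavelength, and hence into a photon energy. Here's a minimal Python sketch of the idea, using an approximate silicon plane spacing and a purely hypothetical diffraction angle:

```python
import math

h = 6.62607015e-34        # Planck constant, J*s
c = 299_792_458.0         # speed of light, m/s
eV = 1.602176634e-19      # joules per electron volt

# Approximate spacing of the silicon (220) lattice planes; the angle below is
# hypothetical, and tiny because gamma-ray wavelengths are so short.
d = 1.9202e-10            # metres (approximate)
theta_deg = 0.0214        # degrees (hypothetical)

lam = 2 * d * math.sin(math.radians(theta_deg))   # first-order Bragg wavelength
E = h * c / lam                                   # corresponding photon energy

print(f"wavelength: {lam:.3e} m")
print(f"energy    : {E / (1e6 * eV):.3f} MeV")
```

In other words, a crystal whose atomic spacing has been pinned down by interferometry becomes a ruler for wavelengths far too short to measure any other way.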
The test was originally conceived thirty years ago by a NIST physicist, Richard Deslattes, who conducted the NIST test but regrettably did not live to see the Nature paper published.
Interestingly, while most people think of Einstein’s Special Theory of Relativity in the context of the time distortions that the great physicist predicted would be experienced by those traveling vast, interstellar distances at nearly the speed of light, the NIST/MIT experiments were conducted at the opposite end of the distance spectrum: the spaces between subatomic particles. Here’s the press release again:
The NIST/MIT tests focused on a well-known process: When the nucleus of an atom captures a neutron, energy is released as gamma ray radiation. The mass of the atom, which now has one extra neutron, is predicted to equal the mass of the original atom, plus the mass of a solitary neutron, minus a value called the neutron binding energy. The neutron binding energy is equal to the energy given off as gamma ray radiation, plus a small amount of energy released in the recoil motion of the nucleus. The gamma rays in this process have wavelengths of less than a picometer, a million times smaller than visible light, and are diffracted or bent by the atoms in the calibrated crystals at a particular energy-dependent angle.
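For the arithmetic-minded, here is a minimal Python sketch of the bookkeeping that paragraph describes. It is my own illustration, not the teams' analysis, and the isotope masses and gamma-ray wavelength plugged in below are illustrative placeholders rather than the published measurements:

```python
# Back-of-the-envelope version of the test: the mass lost when a nucleus
# captures a neutron, times c squared, should equal the energy carried off as
# gamma radiation (ignoring the tiny recoil term).  All inputs are placeholders.

c = 299_792_458.0          # speed of light, m/s
h = 6.62607015e-34         # Planck constant, J*s
u = 1.66053906660e-27      # one atomic mass unit, kg

m_before  = 31.972071      # atom before capture, in u (placeholder, roughly 32S)
m_neutron = 1.008665       # free neutron, in u
m_after   = 32.971459      # atom after capture, in u (placeholder, roughly 33S)

delta_m = (m_before + m_neutron - m_after) * u    # mass deficit, kg
E_mass  = delta_m * c**2                          # energy predicted by E = mc^2, joules

wavelength = 1.43e-13                             # gamma-ray wavelength, m (placeholder)
E_gamma = h * c / wavelength                      # energy actually carried off as gamma rays

print(f"delta-m times c^2    : {E_mass:.4e} J")
print(f"energy of gamma rays : {E_gamma:.4e} J")
print(f"fractional difference: {abs(E_mass - E_gamma) / E_mass:.3e}")
```

Run with honest inputs instead of my placeholders, the fractional difference printed on the last line is exactly the quantity the NIST/MIT work pinned down to four parts in ten million.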
Well, enough of the technical side of things, although those who are into crystal lattices may want to buy a copy of Nature, where they can learn (at least in overview) how to catch two ions in a “small electromagnetic trap” (the press release does not say how small a trap must be to be considered “small,” but one would assume from its intended captives that the trap would be really small). If you don’t get around to buying the issue, I should at least tell you that the MIT team then “counted the revolutions per second made by each ion around the magnetic field lines within the trap,” while the NIST team measured the gamma-ray wavelengths with an impressively gleaming, but prosaically named, instrument dubbed GAMS4. I don’t want to give away all of the secrets, but apparently the trick is to use ions of two different elements (in this case, sulfur and silicon).
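If you want to see why counting revolutions tells you about mass at all, here is a toy Python sketch (mine, not MIT's): an ion circling magnetic field lines orbits at its cyclotron frequency, which depends only on its charge, the field strength, and its mass, so two ions sharing the same trap reveal their mass ratio through their frequency ratio. The field and masses below are hypothetical round numbers:

```python
import math

e = 1.602176634e-19      # elementary charge, C
u = 1.66053906660e-27    # atomic mass unit, kg
B = 8.5                  # trap magnetic field in tesla (hypothetical)

def revolutions_per_second(mass_u, charge_e=1):
    """Cyclotron frequency f = qB / (2*pi*m) for an ion of the given mass and charge."""
    return charge_e * e * B / (2 * math.pi * mass_u * u)

m_light = 28.0           # a silicon-like ion, mass in u (hypothetical)
m_heavy = 33.0           # a sulfur-like ion, mass in u (hypothetical)

f_light = revolutions_per_second(m_light)
f_heavy = revolutions_per_second(m_heavy)

print(f"lighter ion : {f_light:,.0f} revolutions per second")
print(f"heavier ion : {f_heavy:,.0f} revolutions per second")
# The frequency ratio is exactly the inverse of the mass ratio, so counting
# revolutions very precisely pins down the mass ratio very precisely.
print(f"f_light / f_heavy = {f_light / f_heavy:.6f}   m_heavy / m_light = {m_heavy / m_light:.6f}")
```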
Does any of this actually matter to you or to me? Well, yes. Because we live in the intermediate distances between the subatomic and the interstellar, and incredibly enough (at least to me), Einstein also predicted effects that reach us here: clocks that are moving fast, or that sit high above the Earth where gravity is weaker, keep measurably different time than clocks on the ground. The satellite navigation signals that airliners increasingly steer by have to compensate for exactly that, or the airliners would be miles out of their reckoning by the time they reached (or not, as the case may be) their destinations.
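For a rough sense of scale, here is a quick back-of-the-envelope Python sketch, my own arithmetic rather than anything from the NIST release, using standard textbook approximations and a nominal GPS-style orbit, of how fast such a satellite clock drifts and how much position error that would cause if left uncorrected:

```python
# Two relativistic effects compete for an orbiting clock: it runs FAST because
# gravity is weaker up there (general relativity) and SLOW because the
# satellite is moving (special relativity).  For a GPS-style orbit the net
# drift is tens of microseconds per day, which times the speed of light is
# kilometres of range error.  Orbit parameters are nominal, not measured.

GM  = 3.986004418e14     # Earth's gravitational parameter, m^3/s^2
c   = 299_792_458.0      # speed of light, m/s
R_E = 6.371e6            # mean Earth radius, m
r   = 2.656e7            # GPS orbital radius (about 20,200 km altitude), m

seconds_per_day = 86_400

# Gravitational (general-relativistic) rate gain of the orbiting clock:
grav = (GM / c**2) * (1.0 / R_E - 1.0 / r)

# Velocity (special-relativistic) slowdown for a circular orbit, where v^2 = GM/r
# (the ground clock's own rotation with the Earth is ignored in this rough cut):
vel = -(GM / r) / (2.0 * c**2)

drift = (grav + vel) * seconds_per_day            # seconds gained per day
print(f"net clock drift       : {drift * 1e6:+.1f} microseconds per day")
print(f"uncorrected range err : {drift * c / 1000:.1f} km per day")
```

Under those assumptions the drift works out to a few tens of microseconds a day, which is why the compensation is built into the system rather than left to chance.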
If you’re into this sort of scientrivia, you might want to sign up for NIST’s bi-weekly email bulletin, called TechBeat. If you check out the current issue, you’ll be invited to welcome in 2006 this New Year’s Eve by adding a leap second. That’s needed because the Earth’s rotation runs a shade slower than our atomic clocks, and not (contrary to what I half-remembered from a prior story) because of last December’s disastrous tsunami in the Indian Ocean; if anything, the earthquake behind that tragedy shaved a couple of microseconds off the length of the day.
And therein lies the difference between man-made standards and scientific formulae, and between the length of the day and e = mc2. I mean, the former ain’t bad, but as TechBeat points out, it’s not like you can set your watch by it.
subscribe to the free Consortium Standards Bulletin (and remember to Buy Your Books at Biff’s)