Flip Yr Bit

September 24, 1997


Well, well. Could it be that Albert Einstein was wrong about the nature of the universe and Gene Roddenberry was right?

I am sitting in the office of Dr. Samuel Lomonaco, professor of computer science and electrical engineering at the University of Maryland, Baltimore County, and coauthor of "Aspects of Entangled Translucent Eavesdropping in Quantum Cryptography," an article to be published this December in Physical Review A, a scientific journal. Lomonaco is trying to explain the basics of quantum computing, a field so far out on the theoretical cusp of computer science and physics that only baby steps have been made toward building a working model.

If an actual quantum computer is ever built, Lomonaco says, it will constitute a revolution in computing. Such a machine will be so fast as to be "mind-boggling." Lomonaco's on the cautiously optimistic side of a hot debate in physics on whether quantum computing is possible. Naysayers such as Pierre and Marie Curie University professors Serge Haroche and Jean-Michel Raimond argue, in the August 1996 Physics Today, that quantum computing will remain an impossibility unless "unforeseen new physics is discovered."

Certainly, the trouble with today's computers, as any good Quake player knows, is that they're too darn slow. Sure, microchips are stuffed with millions of transistors, but they converse in a language of two words: one and zero. No matter how fast classical computers become, they'll be little more than speedy abacuses--so woefully linear, so dreadfully Newtonian.

Now physicists are pondering how to use the mysterious multidimensionality of quantum physics to redefine computing--specifically, how to harness the strange phenomenon of superposition. Lomonaco refers to Gilles Brassard's January 1997 Science magazine article "Searching a Quantum Phone Book" to explain the concept. Brassard writes that if we shine a beam of light on an atom that has a single electron in its outermost orbit, that electron will jump to a higher orbit. Pretty basic stuff. But guess what happens if the light shines on the atom for only half the time needed to make the electron jump? You might figure the electron wouldn't jump at all. Or that it'd jump half the time.

What actually happens is far weirder: The electron exists in both orbits at the same time. Strange but true--ask your local physicist. That an electron (or an ion, or a photon) can be in more than one state at once means a quantum computer can hold exponentially more information than a classical one, because each bit--now called a qubit--can be 0 and 1 simultaneously, and every qubit added doubles the number of states the machine holds at once. Think of the computer on your desktop, Lomonaco enthuses, as having a nonillion times more processing power than it does now (a nonillion is a 1 followed by 30 zeros). So what could we possibly do with all that power? Lomonaco struggles for an example. You could predict the weather for the next six months, he offers. You could play millions of video games at once. That, of course, is a fairly silly answer, and Lomonaco doesn't seem too happy with it, but the reality is, we haven't a clue what we'd harness such power for. Then again, we've never had any trouble using up processor power in the past.
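
To put a number on that exponential claim: every added qubit doubles the number of amplitudes a machine holds at once, and by 100 qubits you are already at Lomonaco's nonillion. A minimal Python sketch (the code and its state_size function are mine, purely illustrative, not anything from the article):

    # Describing n qubits takes 2**n complex amplitudes; a classical
    # n-bit register names exactly one of those 2**n values at a time.

    def state_size(n_qubits):
        """Number of amplitudes needed to describe n qubits."""
        return 2 ** n_qubits

    for n in (1, 10, 100):
        print(f"{n:3d} qubits -> {state_size(n):.3e} amplitudes")

    # 100 qubits -> 1.268e+30 amplitudes: about a nonillion (10**30).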

Although famed physicist Richard Feynman proposed the development of quantum computing in 1981, only recently has such work been actively pursued. Last year, the Defense Advanced Research Projects Agency sank $5 million into investigating the matter. The National Institute of Standards and Technology has constructed a rudimentary quantum logic gate, as has the California Institute of Technology, whose model fills an entire room.

That's pretty huge, considering that a single logic gate in today's computers is microscopic. But researchers are hoping that from big things, small things will one day come.

Getting beyond that rudimentary scale, however, is another question entirely. Encoding a few qubits may be easy; keeping them intact, or doing anything useful with large numbers of them, is trickier. Haroche and Raimond, among others, have pointed out how easily decoherence--the collapse of quantum data into one state or another--can occur; the larger the data set, the more likely the decoherence. Even reading encoded qubits will cause them to lose their qubitness.
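
To see why even reading is destructive, picture measurement as a forced coin flip whose odds are set by the qubit's amplitudes: the readout yields an ordinary bit, and the amplitudes themselves are lost. A toy Python sketch of that collapse (mine, not drawn from Haroche, Raimond, or Lomonaco):

    import random

    # Toy model of measurement: a qubit a|0> + b|1> reads out as 0 with
    # probability |a|^2 and as 1 otherwise. Afterward the superposition
    # is gone; all that remains is the classical bit.

    def measure(amp0, amp1):
        """Collapse the qubit and return a classical 0 or 1."""
        p0 = abs(amp0) ** 2 / (abs(amp0) ** 2 + abs(amp1) ** 2)
        return 0 if random.random() < p0 else 1

    # Equal superposition (a = b = 1/sqrt(2)): every readout is a coin
    # flip, and no amount of re-reading recovers the original amplitudes.
    a = b = 2 ** -0.5
    print([measure(a, b) for _ in range(10)])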

But as dilemmas surface with new knowledge, so do solutions. Lomonaco suggests that another phenomenon--called entanglement--might come to the rescue. It has long been known that if two entangled photons (two photons with opposite spins that exist together as a single unit) are sent to opposite corners of the universe, what happens to one affects the other, regardless of distance. Like the electron excited by the beam of light, both photons occupy both possible spins until measured, and--here's where it gets really weird--once one is measured, the other settles into the opposite spin instantaneously. That is what it means for them to be entangled. Whatever the signal between the two, it's moving faster than the speed of light. Einstein never gave up the idea that nothing could move faster than light, and physicists are still divided on whether this constitutes proof of anything truly "moving" faster than light (if it does, the only silly thing about Roddenberry's fictional starship Enterprise galloping through galaxies at warp speeds 4, 5, or 6 times the speed of light is how slow she was going). That physicists haven't figured out what that signal is didn't stop AT&T researcher Peter Shor from exploiting the entangled nature of qubits as a method of extracting data.
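
The correlation itself is easy to mimic, even though the mimicry says nothing about what carries the signal. A toy Python sketch (my illustration, using the opposite-spin convention the article describes): the pair is a single joint state, so sampling one half instantly fixes the other.

    import random

    # Toy Bell pair with opposite spins: the joint state is
    # (|01> + |10>) / sqrt(2). Neither half carries a value of its own;
    # the outcomes exist only once a measurement is made.

    def measure_entangled_pair():
        """Measure both halves; the results are always opposite."""
        first = random.randint(0, 1)   # one half: a 50/50 outcome
        return first, 1 - first        # the other: instantly opposite

    for _ in range(5):
        a, b = measure_entangled_pair()
        print(f"photon A: {a}   photon B: {b}")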

Lomonaco points out that quantum computing today is in a state much like classical computing's a half-century or so ago--the working models weak and bulky, the governing ideas still theoretical. But the analogy doesn't guarantee the problems are surmountable. The pursuit of quantum computing could be the modern-day equivalent of medieval alchemy: Dreams of an unlimited precious resource--gold back then, processing power now--dazzle our best minds with a promise as unattainable as it is irresistible. It will be an interesting field to watch.
--Joab Jackson


