Book contents
- Frontmatter
- Contents
- Contributors
- Editor’s acknowledgements
- Introduction: The new physics for the Twenty-First Century
- I Matter and the Universe
- II Quantum matter
- III Quanta in action
- 10 Essential quantum entanglement
- 11 Quanta, ciphers, and computers
- 12 Small-scale structures and “nanoscience”
- IV Calculation and computation
- V Science in action
- Index
- References
11 - Quanta, ciphers, and computers
Published online by Cambridge University Press: 05 June 2014
Introduction
Computation is an operation on symbols. We tend to perceive symbols as abstract entities, such as numbers or letters from a given alphabet. However, symbols are always represented by selected properties of physical objects. The binary string
10011011010011010110101101
may represent an abstract concept, such as the number 40 711 597, but the binary symbols 0 and 1 also have a physical existence of their own. They could be ink on paper (this is most likely how you see them as you read these words), glowing pixels on a computer screen (this is how I see them now as I write these words), or different charges or voltages (this is how my word processor sees them). If symbols are physical objects, and if computation is an operation on symbols, then computation is a physical process. Thus any computation can be viewed in terms of physical experiments, which produce outputs that depend on initial preparations called inputs. This sentence may sound innocuous, but its consequences are anything but trivial!
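The correspondence between the binary string and the number can be checked directly. A minimal Python sketch (an illustration added here, not part of the chapter) decodes the symbols and round-trips the abstract value back to its symbolic form:

```python
# The same symbols admit many physical representations (ink, pixels,
# voltages); the abstract value they encode stays the same.
bits = "10011011010011010110101101"

value = int(bits, 2)   # interpret the symbols as a base-2 numeral
print(value)           # 40711597, as stated in the text

# Round-trip: the abstract number back to its symbolic representation.
assert format(value, "b") == bits
```

Here `int(bits, 2)` and `format(value, "b")` are the two directions of the symbol-to-number correspondence the paragraph describes.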
On the atomic scale, matter obeys the rules of quantum mechanics, which are quite different from the classical rules that determine the properties of conventional computers. Today’s advanced lithographic techniques can etch logic gates and wires less than a micrometer across onto the surfaces of silicon chips. Soon they will yield even smaller parts and inevitably reach a point at which logic gates are so small that they are made out of only a handful of atoms. So, if computers are to become smaller in the future, new quantum technology must replace or supplement what we have now. The point, however, is that quantum technology can offer much more than cramming more and more bits onto silicon and multiplying the clock speed of microprocessors. It can support an entirely new kind of computation, known as quantum computation, with qualitatively new algorithms based on quantum principles.
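The qualitative difference between a classical bit and a quantum bit can be hinted at with a toy amplitude model. The state representation and the `hadamard` function below are illustrative assumptions made for this sketch, not the chapter's own notation:

```python
import math

# Toy model: a qubit is a pair of real amplitudes (alpha, beta) with
# alpha^2 + beta^2 = 1. A classical bit can only be (1, 0) or (0, 1);
# a qubit can be any normalized pair in between.
def hadamard(state):
    """Apply the Hadamard gate, which maps basis states to
    equal superpositions of 0 and 1."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

zero = (1.0, 0.0)        # the classical bit 0, viewed as a qubit
plus = hadamard(zero)    # an equal superposition of 0 and 1

# Measuring yields 0 or 1 with probability |amplitude|^2.
probs = [a ** 2 for a in plus]   # [0.5, 0.5]

# Hadamard is its own inverse: applying it twice restores |0>.
back = hadamard(plus)
```

The superposition state `plus` has no classical counterpart: it is not 0, not 1, and not a probabilistic mixture that simply hides an unknown value, which is one root of the "qualitatively new algorithms" the text mentions.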
- Type: Chapter
- Information: The New Physics for the Twenty-First Century, pp. 268-283
- Publisher: Cambridge University Press
- Print publication year: 2006