Science and Technology
Computer Science - computing processors, nanotechnology, quantum mechanics, quantum computers, data encryption
Working prototypes of quantum computers may be demonstrated by 2040, making a whole new range of computationally intensive tasks possible.
Simon Bone and Matias Castro of Imperial College, London offer this concise explanation of quantum computing in their work A Brief History of Quantum Computing:
'In the classical model of a computer, the most fundamental building block, the bit, can only exist in one of two distinct states, a 0 or a 1. In a quantum computer the rules are changed. Not only can a "quantum bit," usually referred to as a "qubit," exist in the classical 0 and 1 states, it can also be in a coherent superposition of both. When a qubit is in this state it can be thought of as existing in two universes, as a 0 in one universe and as a 1 in the other. An operation on such a qubit effectively acts on both values at the same time.'
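The superposition described above can be illustrated with a short classical simulation. The sketch below (not from Bone and Castro's paper; the function names are my own) represents a single qubit as a pair of complex amplitudes and applies a Hadamard gate, the standard operation that carries the |0> state into an equal superposition of 0 and 1:

```python
# Minimal classical simulation of one qubit, illustrating superposition.
# A qubit's state is a pair of amplitudes (alpha, beta) for |0> and |1>,
# with |alpha|^2 + |beta|^2 = 1.
import math

ket0 = (1.0, 0.0)  # the definite, classical-like state |0>

def hadamard(state):
    """Apply the Hadamard gate: maps |0> to an equal superposition of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1 (the Born rule)."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

superposed = hadamard(ket0)
p0, p1 = probabilities(superposed)
print(p0, p1)  # both outcomes are equally likely (~0.5 each)
```

Until it is measured, the qubit holds both amplitudes at once, which is what lets a single operation act on both values simultaneously, as the quotation notes.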
Nobel laureate physicist Richard Feynman and Charles Bennett of IBM, among others, made significant early contributions to understanding how the physical properties of matter could be harnessed for computation with quantum bits (qubits). Substantial progress continues worldwide at laboratories such as the Centre for Quantum Computation, a collaboration between Oxford and Cambridge Universities.
Implementation of quantum computing would make certain types of computation extremely fast -- potentially trillions of times faster than today's machines -- and would enable secure communication, using encryption techniques that are effectively unbreakable because of the almost unimaginable number of operations a quantum computer can perform simultaneously. With quantum computing, a whole range of previously impossible, computationally intensive tasks -- including image understanding, real-time speech recognition, generation of unbreakable codes, and extreme compression of data and media -- will become commonplace.
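The "almost unimaginable" scale comes from how quantum state spaces grow: describing an n-qubit register classically requires 2^n amplitudes, so the description doubles with every added qubit. A quick arithmetic illustration (my own, not from the source):

```python
# Number of classical amplitudes needed to fully describe an n-qubit register.
# The count doubles with each qubit added.
for n in (10, 50, 300):
    print(n, "qubits ->", 2 ** n, "amplitudes")

# For comparison: 2**300 already exceeds common estimates of the number of
# atoms in the observable universe (on the order of 10**80).
assert 2 ** 300 > 10 ** 80
```

This exponential growth is why even a few hundred qubits, if they can be kept coherent, would represent a state no classical machine could store, and why quantum operations can in effect act on vast numbers of values at once.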
What to Watch:
At A Glance:
21-50 years+