Optical and Quantum Computers. Developing Computer Hardware from the 1940s to the Present. A potted history of computer technology from chips and memory to speed and capacity. What will the future bring given recent developments?
The first electronic digital computer that we know of was created by John Vincent Atanasoff and his graduate student Clifford E. Berry in the early 1940s (no, it wasn't the ENIAC, a common misconception). The earliest computers were built from relays and vacuum tubes and of course were limited in size.
The first really useful computers had to wait until the transistor was invented at Bell Laboratories on December 16, 1947 by William Shockley, John Bardeen and Walter Brattain; this led to the SIMON (1949), the GENIAC (1955) and the Heathkit EC-1 (1959).
It wasn't until the invention of micro-lithography techniques, which led to the microprocessor, that the first really powerful computers appeared: the PDP-8 (1965); the first graphics-capable computer, the IMLAC PDS-1 (1970); and the first fully microprocessor-based computer, the Intel SIM4, in 1970. The first computer with a monitor, keyboard and mouse was the Xerox Alto from 1973.
Of course those early pioneers couldn't have imagined where the computer would be by 2007. It is predicted that by the year 2010, 22nm technology and crossbar connections will allow nine times the current transistor density per chip. This should permit chip speeds of 30 gigahertz, six to seven times current speeds, as well as higher memory densities per chip, with terabyte capacities expected on a single chip. Of course the biggest problem will be cooling these huge numbers of transistors.
All of this new technology points to computers with faster speeds and larger memories. New technologies such as indium phosphide-silicon laser chips, which use terabit-level optical "datapipes" inside the chip for communication rather than metal conductors, will sidestep the heat issues and drive processing speeds over the top.
In a recent demonstration, a quantum computer that uses the spin states of electrons for data storage was shown; again, a technology that will drive speeds even higher.
It is doubtful that we will need to go beyond 64-bit address processing; however, someone out there is working on 128-bit addressing. A 64-bit addressing architecture allows the computer to directly access 2 raised to the 64th power bytes (16 exabytes) of user memory space; in contrast, 32-bit addressing allowed 2 to the 32nd power bytes, or 4 gigabytes, of addressable user memory space.
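The address-space arithmetic above can be checked with a few lines of Python (a minimal sketch; the helper name `addressable_bytes` is mine, not from the article):

```python
# Bytes directly reachable with an n-bit byte address is simply 2**n.

def addressable_bytes(bits: int) -> int:
    """Size of the address space for an n-bit byte address."""
    return 2 ** bits

GIB = 2 ** 30   # one gigabyte (binary)
EIB = 2 ** 60   # one exabyte (binary)

print(addressable_bytes(32) // GIB)  # 4  -> 4 gigabytes for 32-bit
print(addressable_bytes(64) // EIB)  # 16 -> 16 exabytes for 64-bit
```

Note that 2^64 bytes works out to 16 exabytes, a million times the often-quoted "16 petabytes."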
With some very rough back-of-the-envelope calculations, Rich Niemiec of TUSC determined that all of the world's text, pictures, video and audio data could be stored in 12 petabytes.
Probably the Achilles' heel of modern computers is the disk technology we use to store data. A disk can only spin so fast (the practical limit is about 15,000 RPM), and the disk actuators (the arms that move the heads to various locations) can only move so far, so fast. This caps I/O rates even as technology pushes the density of data stored on disks higher and higher.
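A quick back-of-the-envelope calculation shows why the spindle speed is such a hard limit. The seek-time figure below is my own illustrative assumption, not from the article:

```python
# Each random I/O pays roughly one average seek plus half a rotation
# (average rotational latency) before data can be transferred.

def max_random_iops(rpm: float, avg_seek_ms: float) -> float:
    """Rough upper bound on random I/Os per second for one actuator."""
    half_rotation_ms = (60_000 / rpm) / 2   # avg rotational latency in ms
    return 1000 / (avg_seek_ms + half_rotation_ms)

# A 15,000 RPM drive with an assumed ~3.5 ms average seek:
print(round(max_random_iops(15_000, 3.5)))  # ~182 random IOPS
```

A couple of hundred random I/Os per second per spindle, no matter how dense the platters get: that is the bottleneck the paragraph above describes.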
Of course, as memory densities increase, the use of solid-state drives to replace hard disks will become commonplace. Operating systems will need to rewrite the way they handle I/O to exploit the faster I/O of solid-state drives; spinning disks will only be used as backing store when the system shuts down.
Chaos theory is a way to model the behavior of chaotic systems such as the weather; these newer, faster computers with larger memories will be able to finely hone weather predictions and other chaotic-system simulations. A chaos-based database management system may be the ultimate data warehouse.
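What makes such systems demanding is their extreme sensitivity to initial conditions. A tiny sketch using the logistic map (a textbook chaotic system, my choice of illustration, not from the article) shows how a one-part-in-a-billion difference in starting conditions explodes within a few dozen steps:

```python
# Logistic map x_{n+1} = r * x * (1 - x); r = 4 is a standard
# chaotic regime. Two trajectories starting almost identically
# diverge completely, which is why weather models need so much
# precision and compute.

def logistic(x: float, r: float = 4.0) -> float:
    return r * x * (1 - x)

a, b = 0.2, 0.2 + 1e-9   # initial states differing by one part in a billion
max_gap = 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

print(max_gap)  # the tiny initial gap has grown to order 1
```

Doubling such a model's forecast horizon roughly squares the precision required, which is exactly where the faster, larger machines described above come in.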
Some researchers are even suggesting that they will be able to develop algorithms that are self-aware and self-learning. To tell you the truth the idea of a machine with higher synapse density and more memory than a human mind that is self-aware and self-learning is a terrifying concept. Are we ready for true machine intelligence? A more significant question is, “Will they be ready for us?”