Computing Time

A brief history of computing by Stelios Mores, with a Little Einstein experiment for young scientists


Humans evolved the ability to count early in their history: by 44,000 years ago, people living in the southern regions of Africa were marking bones with notches to keep track of the days in the lunar cycle.

By 4000 BC, farmers and merchants living in the Zagros region of present-day Iran were using small clay tokens to keep track of trade. In time these were replaced with impressions on clay tablets, and by 3100 BC many of the Sumerian communities living around Mesopotamia (modern Iraq) had their own ways of writing numerals which depicted the items being counted. Eventually this became a more general system, with markings that no longer depicted what was being counted. It was the first truly generic system for counting, and it used a base of sixty (sexagesimal) as opposed to the more familiar base of ten (decimal) commonly used today. Around the same time the first counting machines appeared, resembling the abacus.
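The sexagesimal idea has never quite left us: we still divide hours and minutes into sixty parts. As a simple illustration (a modern sketch, not anything the Sumerians wrote), a count of seconds can be split into base-sixty 'digits' of hours, minutes and seconds:

```python
def to_sexagesimal(total_seconds):
    """Split a count of seconds into base-60 digits: hours, minutes, seconds."""
    minutes, seconds = divmod(total_seconds, 60)   # last base-60 digit
    hours, minutes = divmod(minutes, 60)           # next base-60 digit
    return hours, minutes, seconds

print(to_sexagesimal(7384))  # 7384 seconds = 2 hours, 3 minutes, 4 seconds
```

Each division by sixty peels off one 'digit', exactly as repeated division by ten would in our everyday decimal system.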

The invention of differential gears around 1000 BC by the ancient Chinese laid the foundations of analogue computing. In the 4th century BC the Greek philosopher Aristotle developed the Organon, a collection of works which formalised reasoning and logic, and during the 3rd century BC the mathematician Archimedes applied gearing to his concept of balance to solve mathematical relationships. This line of thinking led to inventions such as the astrolabe, which was used for navigation, and the Antikythera mechanism, which was used to predict the dates of astronomical phenomena as well as the dates of the Olympic Games.


By the late 13th century the Majorcan philosopher Ramon Llull had devised paper machines intended to derive logical conclusions mechanically, whilst Arab engineers had invented machines such as automated flute players and clockwork marionettes. These ideas fed into the development of various calculating machines during the 17th century by European mathematicians such as Blaise Pascal and Gottfried Leibniz. Early in that century the Scottish mathematician John Napier had developed logarithms, and the English mathematician Edmund Gunter used these to create the logarithmic scale behind the slide rule, which made the solution of complex mathematical and engineering problems practical and affordable and brought about a revolution in computational instruments. The 1700s saw the Italian Giovanni Poleni develop his calculating machine, by which time such machines had begun to match the capabilities of the slide rule.

In the early 19th century the English mathematician Charles Babbage designed the Difference Engine, a machine for calculating astronomical tables. He soon followed it with the design of the Analytical Engine, the first general-purpose computer, for which the mathematician Ada Lovelace wrote what is widely regarded as the first computer program. The Analytical Engine established the architecture of the modern computer: it comprised an arithmetic logic unit which could add numbers together, an instruction pointer which allowed instructions to be carried out in sequence, and an internal scratch memory for storing information. Its mechanical complexity, however, was such that a full working version has never been built.

difference engine.jpg

The 20th century brought the next significant developments in computing. During the 1930s Charles Wynn-Williams introduced electronics to computing, which led to Akira Nakashima's use of switching circuits for solving Boolean logic (the OFF state representing a zero and the ON state a one), and John Atanasoff and Clifford Berry of Iowa State College built the first electronic computer, designed to solve a specific mathematical problem. In 1936 the English mathematician Alan Turing described the Turing machine, a theoretical device capable of executing any programming task defined by a series of instructions called an algorithm, and in 1941 the German engineer Konrad Zuse completed the Z3, the first programmable computer (though limited in the operations it could perform). Turing's concept of the general-purpose electronic computer was realised in the form of ENIAC in 1946, soon followed in 1948 by the Manchester Baby, which could electronically store a programme. By the 1950s the advent of the transistor both improved the functionality and significantly reduced the size of computers as they became more commonly used for solving engineering and scientific problems. Integrated circuits (ICs) were developed during the 1960s, with Masatoshi Shima helping to design the first multi-chip central processing units (CPUs). Single-chip microprocessor-based computers soon followed when Marcian Hoff and Federico Faggin developed the Intel 4004 in 1971.

Gordon Moore, who would later co-found Intel, made an observation in 1965 which has since become known as Moore's law and has held broadly true ever since: as new ICs were developed, the number of transistors on a chip doubled roughly every two years, so doubling the computing power of a microprocessor. Progress in computing has followed this trend, with the Intel 8086 family of the late 1970s and the Motorola 68000 of 1979 leading to the release of the IBM PC and the Apple Macintosh, which set the standard for business computing globally. The 1990s saw the advent of supercomputers consisting of hundreds of microprocessors working in unison, and more recently multi-core microprocessors such as the Intel Xeon, with processing speeds measured in gigahertz and data storage capacities measured in gigabytes (billions of bytes), have brought the power of supercomputers to the desktop.
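The doubling at the heart of Moore's law is easy to sketch in a few lines of code. The starting figures below are illustrative, taken from the roughly 2,300 transistors of the Intel 4004:

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Estimate a chip's transistor count under Moore's law:
    the count doubles once every `doubling_years` years."""
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Because the growth is exponential, twenty years of doubling multiplies the count by about a thousand, which is why a modern desktop chip can outpace the supercomputers of a few decades earlier.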

The latest decade, however, has seen the beginning of new technological advances in computing based on quantum mechanics, with the potential to improve present calculation speeds as much as a hundred million fold by exploiting quantum states to work on many parts of a problem at once, rather than carrying out each step of a calculation in sequence. These speeds are truly phenomenal, and the physics is stranger still: researchers at the Moscow Institute of Physics and Technology, the Swiss Federal Institute of Technology and the University of Chicago recently managed to return the quantum state of a quantum computer to an earlier moment, an effect likened to briefly reversing the arrow of time. Is this the start of computers really saving us time?

Einstein Cartoon - Blue.jpg

Little Einstein’s Corner - Slide Rule
A slide rule is not a ruler down which you can slide things. It is made up of two rulers that slide alongside each other to perform calculations. The simplest type of slide rule is used for addition and subtraction. To make your own slide rule you will need:
1. An A4 size piece of card
2. A pair of scissors
3. A ruler
Cut two strips from the card and mark them as shown in the image, labelling one TOP and the other BOTTOM. On the TOP ruler draw an arrow at the zero. To add two numbers, point the arrow on the TOP ruler at the first value on the BOTTOM ruler, then find the second value on the TOP ruler. The answer is the value directly below it on the BOTTOM ruler. You have added the two numbers.

Slide Rule.jpg
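For readers who prefer code to card, the same trick can be sketched in a few lines (the function name here is our own, just for illustration). Sliding the TOP ruler so its zero arrow sits over a value on the BOTTOM ruler lines up every TOP mark with that value plus the mark on the BOTTOM ruler:

```python
def slide_rule_add(a, b, length=20):
    """Add a + b the way the card slide rule does."""
    bottom = list(range(length + 1))   # the fixed BOTTOM ruler
    top = list(range(length + 1))      # the sliding TOP ruler
    # Slide TOP so its zero arrow points at `a` on BOTTOM:
    # the mark `b` on TOP now sits above position a + b on BOTTOM.
    return bottom[a + top[b]]

print(slide_rule_add(3, 4))  # reads 7 on the BOTTOM ruler
```

A real slide rule multiplies rather than adds, because its marks are spaced logarithmically: adding two lengths then adds two logarithms, which multiplies the numbers they represent.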
