- THE FIRST MILLENNIUM
1989 – UK
‘The World Wide Web – a hypertext-based graphical information system on the internet’
In the 1980s Berners-Lee, while working at CERN, the joint European particle physics laboratory in Geneva, developed a simple markup language that he called HTML, or Hypertext Markup Language. HTML contains simple codes ( for example, this text has <strong>bold</strong> formatting ) that are used to format text and include graphics, audio and video. He also designed HTTP, or Hypertext Transfer Protocol, to move files across the internet, and a system of addresses ( URLs, or Uniform Resource Locators ) to find files on the internet.
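The division of labour among these three inventions can be seen by taking a URL apart; a minimal sketch using Python's standard `urllib.parse` module (the address below is a made-up example, not a real CERN page):

```python
from urllib.parse import urlparse

# A URL ( Uniform Resource Locator ) bundles everything needed to find a file:
# the scheme (which protocol to speak), the host (which server to contact)
# and the path (which file to ask that server for).
# Hypothetical example address, chosen for illustration only.
url = "http://example.cern.ch/hypertext/WWW/TheProject.html"
parts = urlparse(url)

print(parts.scheme)  # protocol to use, here HTTP
print(parts.netloc)  # the server to contact
print(parts.path)    # the file to request from that server
```

The scheme tells the browser to use HTTP, the host names the machine, and the path names the HTML file to fetch.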
1854 – England
‘Logical operations can be expressed in mathematical symbols rather than words and can be solved in a manner similar to ordinary algebra’
Boole’s reasoning founded a new branch of mathematics. Boolean logic allows two or more results to be combined into a single outcome. This lies at the centre of microelectronics.
Boolean algebra has three main logical operations: NOT, AND, OR.
In NOT, for example, output is always the reverse of input. Thus NOT changes 1 to 0 and 0 to 1.
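Because each operation works on values that are either 1 or 0, all three can be demonstrated in a few lines of code; a minimal Python sketch:

```python
# The three main Boolean operations on the two values 0 and 1.
def NOT(a):      # output is always the reverse of the input
    return 1 - a

def AND(a, b):   # 1 only when both inputs are 1
    return a & b

def OR(a, b):    # 1 when at least one input is 1
    return a | b

for a in (0, 1):
    print(f"NOT {a} = {NOT(a)}")
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} AND {b} = {AND(a, b)}   {a} OR {b} = {OR(a, b)}")
```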
Boole’s first book ‘Mathematical Analysis of Logic’ was published in 1847 and presented the idea that logic was better handled by mathematics than metaphysics. His masterpiece ‘An Investigation into the Laws of Thought’ which laid the foundations of Boolean algebra was published in 1854.
Unhindered by previously determined systems of logic, Boole argued there was a close analogy between algebraic symbols and symbols that represent logical interactions. He also showed that you could separate symbols of quality from those of operation.
His system of analysis allowed processes to be broken up into a series of individual small steps, each involving some proposition that is either true or false.
At its simplest, Boolean algebra takes two propositions at a time and links them with an operator. By adding many steps, it can form complex decision trees that produce logical outcomes from a series of previously unrelated inputs.
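Chaining the operators in this way yields a single outcome from several unrelated true-or-false inputs; a small sketch (the propositions themselves are invented for the example):

```python
# Each proposition is simply true or false; combining them with
# NOT, AND and OR reduces them to one logical outcome.
def go_outside(is_raining, have_umbrella, trip_is_short):
    # Go out if it is dry, OR we are protected, OR the exposure is brief.
    return (not is_raining) or have_umbrella or trip_is_short

print(go_outside(True, False, False))  # raining, no umbrella, long trip
print(go_outside(True, True, False))   # the umbrella changes the outcome
```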
1965 – USA
‘The number of transistors on a computer chip doubles every 18 months or so’
In 1965 Gordon Moore, who went on to co-found chipmaker Intel, observed the exponential growth in the number of transistors per silicon chip and made the prediction now generally referred to as Moore’s law.
In 1971 Intel’s first microprocessor, the 4004, had 2,300 transistors. In 1982 the count rose to 120,000 in the 286, in 1993 to 3.1 million in the Pentium, and in 2000 to 42 million in the Pentium 4.
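The figures quoted above let us check the doubling period directly; a quick Python calculation using only the first and last counts in this section:

```python
import math

# Transistor counts quoted above: 2,300 in 1971 (the 4004)
# and 42 million in 2000 (the Pentium 4).
doublings = math.log2(42_000_000 / 2_300)   # number of doublings over 29 years
months = (2000 - 1971) * 12 / doublings     # months per doubling

print(round(doublings, 1))  # about 14 doublings
print(round(months, 1))     # about 24.6 months per doubling
```

The quoted figures imply a doubling period of roughly two years rather than 18 months, closer to the revised form of the law that Moore himself later stated.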
Heat production is now the limiting factor in the production of silicon chips with millions of transistors.
1937 – UK
‘A theoretical computer with two or more possible states, which can react to an input to produce an output’
Turing conceived the idea of a universal machine that employed an algorithm to solve a problem, writing the algorithm as a set of instructions using a standard code.
In 1950 Turing suggested that it must be possible to program computers to acquire human intelligence, and devised a test, now known as the Turing test: if the response from a computer is indistinguishable from that of a human, the computer can be said to be intelligent.
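Turing's machine is just a finite set of states, a tape of symbols, and a rule table saying what to write, which way to move, and which state to enter next. A minimal Python sketch (the states, symbols and rules here are invented for illustration; this particular table simply inverts every binary digit on the tape):

```python
# Rule table: (state, symbol read) -> (symbol to write, head move, next state)
rules = {
    ("flip", "0"): ("1", 1, "flip"),
    ("flip", "1"): ("0", 1, "flip"),
    ("flip", "_"): ("_", 0, "halt"),   # blank symbol marks the end: stop
}

def run(tape_str):
    tape = list(tape_str) + ["_"]      # the tape, ended with a blank
    state, head = "flip", 0
    while state != "halt":             # react to input, step by step
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape).rstrip("_")

print(run("1011"))  # -> 0100
```

Swapping in a different rule table gives a different algorithm; the machinery around it never changes, which is the essence of Turing's universal machine.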
1947 – USA
‘The First Transistor’
Shockley was a member of the team at Bell Laboratories investigating the properties of semiconducting crystals, focusing in particular on germanium.
This research led to the development of the junction transistor, which made the vacuum tube obsolete virtually overnight.
The transistor was developed in 1947 as a replacement for bulky vacuum tubes and mechanical relays. The invention revolutionized the world of electronics and became the basic building block upon which all modern computer technology rests.
In 1956, Bell Labs scientists William Shockley, John Bardeen and Walter Brattain shared the Nobel Prize in Physics for the transistor.
Shockley also founded Shockley Semiconductor in Mountain View, California — one of the first high-tech companies in what would later become known as Silicon Valley.