The first computers were actually people, who were tasked with calculating mathematical problems with the aid of tables. Their results were often incorrect because of errors in the tables and errors by the humans using them: a situation which has hopefully been corrected on modern computers! Nowadays a COMPUTER is usually defined as an apparatus that can perform various tasks by following a pre-defined PROGRAM of instructions.

Charles Babbage

The earliest machine credited with meeting this criterion was the Analytical Engine, conceived by Charles Babbage (1791-1871). The design of this 19th Century machine was based on amazingly intricate mechanics. A typical example of Victorian engineering, it was to have been made from thousands of brass cogs and gears. Unlike Babbage's earlier Difference Engines, which required the operator to crank them over by hand to calculate polynomials, it would have used a typical Victorian power source to carry out computations. Frustrated at the many errors he found while examining calculations for the Royal Astronomical Society, Babbage declared, "I wish to God these calculations had been performed by steam!", and with this his plan to create the first automated computer was born. However, the grand Analytical Engine was never actually built.


Babbage's Difference Engine

Babbage's Difference Engine reconstructed at the Science Museum, London

Had it been completed, Babbage's grand design would have consisted of over 50,000 components. The mechanical ingenuity of the design included sections which can be compared to the functions of modern computers, yet constructed almost entirely from mechanical parts! These included a central processing area, called "the mill", that would have allowed instructions to be processed in any sequence; a memory, called "the store", which could hold 1,000 numbers of up to 50 decimal digits; and output devices to produce printed results.

Augusta Ada King

Augusta Ada King, Countess of Lovelace (1815-1852) and daughter of the English poet Lord Byron, helped Babbage immensely with this ambitious project by negotiating financial assistance from the British government. Lady Lovelace's understanding of the Engine's design was probably equivalent to Babbage's, which allowed her to communicate its complexities to the public. She may also be counted as the first computer programmer, as she created instruction routines to be fed into the Analytical Engine (in the 1980s, the U.S. Defense Department named the programming language Ada in her honour).

Joseph Marie Jacquard

The programs for carrying out virtually any mathematical function on Babbage's Analytical Engine were to be supplied on punched paper cards, a technology borrowed from Joseph Marie Jacquard (1752-1834), the French inventor of a punched-card weaving loom of the same period.

Jacquard Loom

The Jacquard punch card idea was also used by an American inventor, Herman Hollerith (1860-1929), to find a faster way to compile the 1890 U.S. census. The authorities feared that the expanding population would mean that the counting of the census could take up to 10 years: the previous one, in 1880, had taken nearly seven years to complete! However, technology triumphed and the census results were compiled in just six weeks. Hollerith went on to found the Tabulating Machine Company in 1896, bringing punch card readers to the business world. Following a series of mergers, this company became International Business Machines (IBM) in 1924.

Much work then focussed on bringing to computing the mid-19th century work of George Boole (1815-1864), who perfected a binary system of algebra, allowing any mathematical equation to be represented simply by true-or-false logic statements. The goal was to extend this concept to electrical circuits, where a voltage could represent the binary digits (bits) 1 and 0 simply by being on or off. In this way, electrical logic circuits could be built to implement Boolean algebra and be combined to form an electric computer.
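
As a modern illustration of this idea (a minimal sketch in Python, not anything from Boole's era), the fragment below combines true/false operations into a one-bit half adder, the kind of elementary building block that such logic circuits provide:

    # A minimal sketch: Boolean operations standing in for logic gates.
    def xor(a, b):
        return (a and not b) or (not a and b)   # true when exactly one input is true

    def half_adder(a, b):
        """Add two one-bit values using only Boolean logic."""
        total = xor(a, b)        # the 'sum' bit
        carry = a and b          # the 'carry' bit
        return total, carry

    # 1 + 1 = binary 10: sum bit 0, carry bit 1
    print(half_adder(True, True))    # (False, True)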

Konrad Zuse

The leap into electrically operated computers was not made until the next century, when, in 1938, the German engineer Konrad Zuse constructed a massive machine from electromagnetic telephone relays in his parents' front room! He also succeeded in using two logical voltage levels (off and on) combined with binary numbering, and thus laid down many of the foundations of future designs. Surprisingly, his work was halted by the Nazi regime with the onset of the Second World War.


Clifford Berry with ABC

In pre-war America, an electrical computer was also envisaged by John V. Atanasoff, a professor at Iowa State College, and his protege, Clifford Berry. In 1939, Atanasoff constructed a small prototype to test his ideas, but the project to build the full Atanasoff-Berry Computer (ABC) was cancelled because of the Second World War. The unfinished computer used 300 vacuum tubes to perform calculations, capacitors to store binary data, and punched cards for input/output.




Alan Turing

Pioneering developments are also attributed to Alan Turing. Having written a paper in 1936 describing a hypothetical device known as the Turing machine, which was to perform logical operations on an infinite paper tape, Turing went on to play a leading role in the wartime codebreaking work at Bletchley Park, England, where the Colossus machines were developed. Built in 1943 and operational by February 1944 (two years before the American ENIAC, described below), Colossus was the world's first electronic valve computer. Unfortunately, the secrecy of the work at Bletchley Park, a British project to decipher coded German High Command messages during World War II, led to a lack of recognition for Turing's visionary achievements.
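
To give a flavour of the concept (a much-simplified modern sketch, not Turing's own formulation), the short Python fragment below simulates a tiny Turing machine whose rule table simply inverts the bits written on its tape:

    # A minimal Turing machine sketch: a state, a tape, and a table of rules.
    # Each rule maps (state, symbol read) to (symbol to write, head move, next state).
    rules = {
        ("invert", "0"): ("1", 1, "invert"),
        ("invert", "1"): ("0", 1, "invert"),
        ("invert", " "): (" ", 0, "halt"),   # blank cell reached: stop
    }

    def run(tape, state="invert", head=0):
        tape = list(tape) + [" "]            # a finite stand-in for the infinite tape
        while state != "halt":
            write, move, state = rules[(state, tape[head])]
            tape[head] = write
            head += move
        return "".join(tape).strip()

    print(run("10110"))   # prints 01001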

Colossus

The success of the Colossus led to the building of ten more Mk II models, which were all housed in a building called "H Block". This building, which could rightly be called the world's first computer complex, is still standing at Bletchley Park. Churchill ordered all the machines destroyed when the war ended, and the operators were made to vow secrecy for the rest of their lives. However, work has recently been undertaken to reconstruct a Colossus at Bletchley Park, which has now become a museum.

The Colossus machines also used optical reader systems for the paper tapes, which were punched by the many female military teletype operators. This allowed data intercepted from encrypted German High Command transmissions to be input and processed at very great speed. This very important application of computers was one of the first to prove that the technology could be used for tasks involving characters as well as numbers (ALPHA-NUMERIC).

Bletchley Park

Bletchley Park, England, site of the world's first electronic valve computer

Alan Turing was a somewhat enigmatic character himself, and his research into the relationships between machines and nature helped create the field of artificial intelligence. He wrote a paper in 1950 describing what is now known as the Turing Test. The test consisted of a person asking questions via a keyboard of both a person and an intelligent machine, both hidden from view. He believed that if the questioner could not tell the machine apart from the person after a reasonable amount of time, the machine could be considered "somewhat intelligent". It is said that Turing's ultimate goal was to create a thinking machine by combining biology with mathematics. However, he died on 7 June 1954 after eating an apple laced with potassium cyanide; the inquest ruled his death a suicide, two years after his conviction for homosexuality, which was then a criminal offence in Britain.

In America, the Harvard-IBM Automatic Sequence Controlled Calculator (Mark I for short!) was produced by a Harvard engineer named Howard H. Aiken (1900-1973), working with IBM, to create ballistic charts for the U.S. Navy, and was completed in 1944. This was an electromechanical relay computer, rather like that of Konrad Zuse, constructed with something like 500 miles of wiring, and was about half as long as a football field!


ENIAC

The next generation of electronic computers was constructed from logic elements using valves (vacuum tubes) rather than transistors: the first example actually being the Colossus, which was still surrounded by secrecy. The next was the ENIAC (Electronic Numerical Integrator and Computer), developed by Presper Eckert and John Mauchly at the University of Pennsylvania for the U.S. Army to calculate tables for shell trajectories, and completed in 1946. The programming of this machine was extremely complicated, as it involved consulting the wiring diagrams and then setting row upon row of switches to alter the way in which the logic worked. ENIAC was truly a huge device, constructed from 18,000 valves, 70,000 resistors and 5 million soldered joints. Its electrical power requirements were so immense, at approximately 160 kilowatts, that it is reported to have caused the lights to dim in an entire area of Philadelphia!


Freddie Williams

Things were improved greatly on this front by Freddie Williams, who designed a machine at Manchester University, England, that stored programs and executed them electronically, rather than computing a result from a hard-wired or switched program. The Small-Scale Experimental Machine, known as the SSEM or the "Baby", was designed and built at the University and made its first successful run of a program on 21st June 1948. It was the first machine that had all the components now classically regarded as characteristic of the basic computer.

Manchester Mark 1



From this small-scale experimental machine a more powerful computer was designed and built: the Manchester Mark 1, which by April 1949 was generally available for computation in scientific research within the University. With the integration of a high-speed magnetic drum (the ancestor of today's disc) by the autumn, this was the first machine with a fast electronic and magnetic two-level store.





John von Neumann

In the mid-1940s John von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), which also used the stored-program technique, as well as "conditional control transfer", which allowed the computer to be stopped at any point and then resumed, giving greater versatility in computer programming. The key element of the von Neumann architecture was the central processing unit, which allowed all computer functions to be coordinated through a single source (a small illustrative sketch of these two ideas appears at the end of this passage).

From 1936 to 1938 Alan Turing was a graduate student in the Department of Mathematics at Princeton and did his dissertation under Alonzo Church. Von Neumann invited Turing to stay on at the Institute as his assistant, but he preferred to return to Cambridge; a year later Turing was involved in war work at Bletchley Park. This visit occurred shortly after the publication of Turing's 1936 paper "On Computable Numbers, with an Application to the Entscheidungsproblem", which involved the concepts of logical design and the universal machine. It must be concluded that von Neumann knew of Turing's ideas, though whether he applied them to the design of the IAS Machine ten years later is questionable.

Von Neumann's interest in computers differed from that of his peers in that he quickly perceived the application of computers to applied mathematics for specific problems, rather than their mere application to the production of tables. During the war, von Neumann's expertise in hydrodynamics, ballistics, meteorology, game theory and statistics was put to good use in several projects. This work led him to consider the use of mechanical devices for computation, and although the stories about von Neumann imply that his first computer encounter was with the ENIAC, it was in fact with Howard Aiken's Harvard Mark I (ASCC) calculator. His correspondence in 1944 shows his interest not only in the work of Aiken but also in the electromechanical relay computers of George Stibitz, and in the work of Jan Schilt at the Watson Scientific Computing Laboratory at Columbia University. By the latter years of World War II von Neumann was playing the part of an executive management consultant, serving on several national committees and applying his remarkable ability to see rapidly through problems to their solutions. Through this means he was also a conduit between groups of scientists who were otherwise shielded from each other by the requirements of secrecy. He brought together the needs of the Los Alamos National Laboratory (and the Manhattan Project) with the capabilities of the engineers at the Moore School of Electrical Engineering, who were building the ENIAC.

Another important figure is Grace Murray Hopper. Dr. Hopper was perhaps the first modern woman to be involved in computers (Ada King, Countess of Lovelace, possibly being the first in the 19th century). She started work for Howard Aiken in 1943 on the Harvard Mark I calculator (also called the IBM ASCC). Subsequently she became deeply involved in the development of high-level languages for computers, creating the concept of a compiler and two early languages. She was highly influential in the development of COBOL and its usage in military installations. She became the highest-ranking woman in the U.S. Navy of her time (Rear Admiral) and a role model to thousands of young women. She is perhaps best known for her discovery of the first computer "bug" in the Harvard Mark II computer. The bug now resides at the National Museum of American History in Washington D.C.

The valve technology still in use in the 1950s meant that the most advanced computers of the time, such as the American UNIVAC (Universal Automatic Computer) and the English LEO (Lyons Electronic Office), occupied several rooms. Their circuit boards typically provided only a single flip-flop! Early memory designs also meant that the computers of the day would be bulky, because each storage cell consisted of a toroidal ferrite bead which could be magnetised to store a bit. The residual magnetism, or hysteresis, of this CORE memory meant that the whole storage was non-volatile, to the extent that processing could be continued immediately from where it had stopped when the power was switched off.

The advent of solid state electronics allowed a subsequent process of integration and miniaturisation which made the microchip computer possible. Firstly, the semiconductor transistor replaced the valve in logic circuits and, with advances in magnetic-core memory, by around 1956 early supercomputers such as the Sperry-Rand LARC (Livermore Atomic Research Computer) appeared. LARC, as is obvious from the name, was built for atomic energy research.

These computers, developed for atomic energy laboratories, could handle an enormous amount of data, a capability much in demand by atomic scientists. The machines were costly, however, and tended to be too powerful for the business sector's computing needs, thereby limiting their attractiveness. Only two LARCs were ever installed: one in the Lawrence Radiation Labs in Livermore, California, for which the computer was named, and the other at the U.S. Navy Research and Development Center in Washington, D.C.

Second generation computers also replaced machine language with assembly language, allowing abbreviated programming codes to replace long, difficult binary codes.

Apollo Computer

Advances in production techniques then led to the planar transistor. The invention of the integrated circuit (IC), which combined many transistors on a single wafer of silicon, was only four years old when the Apollo Moon programme was announced, and pressures from this and the Minuteman II missile project fuelled the future of the technology.

Incentives to save size and weight at almost any cost forced a size reduction by a factor of 1,000 in a period of less than a decade. In 1969 Gordon Moore and Bob Noyce, by now of Intel, found a way of storing data with silicon semiconductor technology that led to the production of the DRAM and SRAM. This naturally meant that memory devices would undergo a similar reduction in size and an associated increase in storage density.

By the end of the 1960s it was possible to pack around a thousand transistors into a single chip, and the reductions in size associated with LSI (Large Scale Integration) meant that by the time Intel produced the first microprocessor in 1971, around 2,300 transistors were integrated into its single chip. The Intel 4004 chip took the IC one step further by combining all the components of a computer, the CPU (Central Processing Unit), memory, and I/O (Input/Output) controls, on a single silicon chip. Previously, ICs had had to be manufactured to fit a special purpose; now one microprocessor could be programmed to meet any number of demands. Companies like IBM, ICL and Sperry Univac continued to supply very large computer systems, called mainframes, to the business world. Mainframes were very powerful computers that allowed access mainly from "dumb" terminals, or VDUs (Visual Display Units). However, a trend towards down-sizing began with the introduction of the mini-computer: machines like the DEC (Digital Equipment Corporation) PDP (Programmed Data Processor), which sought to bring computing technology to the medium-sized business.

By the 1980s, very large scale integration (VLSI) saw hundreds of thousands of components on a chip, and ultra-large scale integration (ULSI) increased that number into the millions. This process helped diminish the size and price of computers and increased their power, efficiency and reliability. It also brought the power of the computer to the general public, in the form of micro-computers with user-friendly software packages that offered non-technical users applications such as word processing and spreadsheet manipulation. Apple Computer, Commodore and Radio Shack (Tandy) were pioneers in this field, and in the early 1980s arcade video games such as Pac-Man and home video game systems such as the Atari 2600 joined the fray. This development really grabbed public interest and opened the door for more sophisticated, programmable home computers.

1981 saw the advent of the IBM PC (Personal Computer), designed for use in the home, office and school. "Clones", or copies, of the open IBM PC design made the personal computer even more affordable, allowing the number of personal computers in use to more than double from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were being used. From 1984, direct competition for the IBM PC emerged in the form of Apple's Macintosh, designed by Steve Jobs and Steve Wozniak, following work at Xerox's PARC (Palo Alto Research Center) that also saw the development of the laser printer and the GUI (Graphical User Interface). Based on a highly user-friendly concept, the "Mac" featured an operating system that allowed users to control the machine by interacting with icons on the screen, instead of typing instructions. Cursor movement was controlled by another new invention from the PARC "think-tank": the mouse.

Large businesses continued to use individual mainframe or mini-computers with VDU user access but, in contrast, PCs continued their trend toward a smaller size, from desktop to laptop. Everyday household items such as microwave ovens, televisions, video recorders and car fuel injection systems also began to incorporate microprocessor based electronics.

Bill Gates' Microsoft Corporation, originally in conjunction with IBM, then set out to endow the PC with the same user-friendly interface offered by the Apple Mac. His "Windows" software ran on top of the IBM PC's DOS (Disk Operating System) to suddenly provide it with the ability to run a full-colour GUI and mouse. This gave the lowly PC a new lease of life, and the open approach of Microsoft allowed independent companies to develop applications that took full advantage of the new Windows operating system. In light of the plethora of Windows-based data and word processing systems that became widely available, the PC began to eclipse the Mac.

Small, powerful personal computers could now be linked together using networking technology, yet another PARC invention. This allowed PCs to share software and information and to communicate with each other. Local Area Networks (LANs), typically covering a building, Metropolitan Area Networks (MANs), covering a campus or city, and Wide Area Networks (WANs), covering a whole country, allowed these collaborations to become almost universally accepted within the business world. Many organisations started to rely much less heavily on mainframe access and migrated from "dumb" terminals to desktop client PCs accessing large central storage servers, in a scenario named client/server architecture.
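
As a very small illustration of the client/server idea (a hypothetical sketch using Python's standard socket library, not any particular system mentioned above), one central server process answers a request sent by a client program:

    # A minimal client/server sketch: a central server answers requests
    # from a client machine over the network.
    import socket, threading

    # The server side: listens on a port and answers one request.
    srv = socket.socket()
    srv.bind(("localhost", 5050))         # hypothetical port for the example
    srv.listen()

    def serve_one():
        conn, _ = srv.accept()
        request = conn.recv(1024).decode()
        conn.sendall(("server received: " + request).encode())
        conn.close()

    threading.Thread(target=serve_one, daemon=True).start()

    # The client side: a desktop PC asking the central server for something.
    client = socket.socket()
    client.connect(("localhost", 5050))
    client.sendall(b"fetch report.txt")                   # hypothetical request
    print(client.recv(1024).decode())     # server received: fetch report.txt
    client.close()
    srv.close()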

The adaptation of CD (Compact Disc) technology to the storage of large volumes of data, in the form of the CD-ROM (Read Only Memory), coupled with advances in the fields of audio and image data compression, also transformed the PC into a multi-media device. This promoted even greater uptake within the consumer world, and many publishers started to see the CD-ROM as a rival for the printed word because of the flexibility of its audio-visual media.

The trend toward the importance of networking has reached the point where communication across the world, via a global public WAN called the Internet, is now possible. Electronic mail, or E-mail, which allows users to send and receive messages through networks, is now the most popular use of the Internet. A language specially developed to present multi-media information across a network, called HTML (HyperText Markup Language), now allows text, pictures, sounds and movies to be shared by PCs across the globe on the Internet, from servers hosting World Wide Web (WWW) sites. However, PC miniaturisation has not yet reached an end: this field has also recently seen the development of the palmtop organiser, which is small enough to fit inside a pocket.

