4.2 The history of computers

4.2.1 Background reading

An excellent brief treatment of the history of computers is found in
Wikipedia. Internet resources relating to the history of computing include
http://ei.cs.vt.edu/~history/
The computer as we understand it today is usually acknowledged to have
been ‘invented’ during the Second World War (1940s). Both the ENIAC
(Electronic Numerical Integrator and Computer) machine and the Harvard
Mark I were developed by teams in the USA in order to undertake the
intensive computations required to calculate artillery firing tables. At the
same time, in Britain, engineers from the British Post Office developed the
Colossus machine for deciphering intercepted military communications
using electronic technology drawn from telephone exchanges. Of course,
ideas of aiding or automating calculation and information storage are
much older than that: the abacus, for example, is over 4,000 years old
and is still in widespread use in Asia today.
The commercial computer industry started in earnest in the 1950s, after
the Second World War. For the first 30 years computers were large,
slow (by today’s standards), and effectively available only to large
organisations. These computers were more or less ‘centralised’ (located
in one place): data was brought to them, and results (printed on paper)
were produced and distributed. Up until the 1970s a chain of shops, for
example, or the branches of a bank, might receive a delivery of printed
paper every day or two, and send in stacks of punched cards for processing.
The second 30 years, from about 1980, have been different. From the
mid-1970s computers became smaller and smaller, and communications
networking became cheaper, faster and, for short distances, increasingly
wireless. The combination of these two broad trends brings us to today,
when computers are ubiquitous – found everywhere, in all kinds of
devices, and usually networked to other devices and resources. We are
also in the situation where many items have a unique computer identity
and can be tracked and monitored. We even have a name for this highly
interconnected assembly of technologies that tracks and identifies just
about everything – ‘the internet of things’.
The key technology driving this change over the last 30 years has been
the silicon chip, or Very Large Scale Integration (VLSI) circuit, but this has
been accompanied by a range of other hardware technologies such as
fibre optics for fast digital networks, optical disks for data storage (CDs),
technologies allowing efficient use of the radio spectrum, new battery
technologies, flat screens, etc. Behind each of these developments
stand dedicated technology companies – large and small – that have
driven the pace of development. The most successful companies driving
this market forward range from old established names to newcomers.
Each has its own specialisms in design, manufacture, marketing and so
on, and its own business model for generating revenues and making
profits. Some are very technical, some more marketing-based, and
others more service-oriented.