*2.1. The Evolution and Expansion of Computing and Connectivity*

The history of computing is believed to have started around 2000 years ago with the Antikythera Mechanism, discovered in the sea near Greece in 1901 and described as a mechanical computer used to make astronomical predictions [7]. Computing and calculation machines and methods were also developed in China, India, and the Islamic world prior to, and during, the mediaeval period. It was not until the 1820s, however, that modern computing began in the UK, when Charles Babbage developed his mechanical calculation machines, the Difference Engine and the Analytical Engine, with support from Ada Lovelace; she was the first person to recognise that a universal computer could do anything provided that it was given the right data and instructions, and she is therefore often referred to as 'the first programmer' [8]. Development continued slowly through the 19th and early 20th centuries with the introduction of electro-mechanical machines and, in 1937, there was a significant conceptual breakthrough when the British mathematician Alan Turing published a paper describing an imaginary machine that performed simple mathematical tasks by following precise logical steps [9]. He continued to develop this concept and related machines throughout World War II and, with colleagues at GCHQ, achieved a significant technical breakthrough in 1942 when thermionic valves (of the kind generally used in telephone exchanges) were incorporated into the machines. These components accelerated processing speed considerably and facilitated the development of Colossus, the world's first completely programmable, electronic, digital computer, in 1943. Other notable parallel developments include the German Z series developed by Konrad Zuse (who used binary language to produce the first electronic calculator in 1938 and the first high-level programming language in 1943–1945) and ENIAC (Electronic Numerical Integrator and Computer), which was developed and launched in the USA in 1945 for military use. It is worth noting that neither the German nor the British government recognised the importance of, or funded, Turing's or Zuse's research at the beginning of WWII; consequently, the early Colossus computers and their antecedents were built, for example, from recycled components discarded by telephone exchanges.

Nevertheless, the various technical and programming developments continued apace after WWII, and numerous computers were developed for industrial, commercial, and military applications. Since then, computing technology has developed ever more rapidly; for example, keyboard input capability and transistor-based technologies were introduced in the late 1950s. As the evolution of mainframe computing continued, other developments enabled the introduction of personal computing. Examples include the invention of integrated circuits and silicon-based transistors in the late 1950s, the first single-person-operated computer (1962), the first microprocessor (1973), the first Apple 1 and 2 computers (1977), the first IBM personal computer (1982), the first Apple Macintosh (1984), and the first portable computers in the early 1990s. The first computer game was developed in 1961 at MIT, after which various dedicated, affordable computers were developed and sold from 1979; computer games were important because they helped to introduce computers to the wider public and popularise them, securing their place in the home as well as the workplace [10,11].

In 1965, Gordon Moore (an engineer and co-founder of Intel) predicted that, as component density increased, the number of transistors that could fit on a computer chip would double every year; this prediction became known as Moore's Law. Although the doubling period was amended to two years in 1975, this particular technical development, in conjunction with the development of new programming languages and software, further accelerated the power and speed of computing; the development of data storage media, such as floppy discs and CDs, also increased functionality in the workplace and at home.
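As an illustrative aside (this formulation is not drawn from the cited sources), the prediction can be written as a simple exponential relationship: if a chip holds N0 transistors in a reference year, then after t years the expected count is approximately N(t) ≈ N0 × 2^(t/T), where T is the doubling period (one year in the original 1965 prediction, two years after the 1975 revision). With T = 2, for example, a chip's transistor count would be expected to grow roughly 32-fold over a decade.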

The earliest computers were stand-alone machines, but research activity in the USA, France, and the UK gradually facilitated the development of internal networks to which computers were connected for data exchange. External networks were also developed and were initially used in business and academia, for example with the launch of the UK's Joint Academic Network (JANET) in 1984 to enable rapid information exchange and collaboration. During the 1980s, the British engineer and computer scientist Sir Tim Berners-Lee also developed a new digital information and communication language and network, which subsequently evolved to become the World Wide Web in 1989. Since then, the user group has expanded from 'geeks', researchers, and academics to the general public: by January 2021, over 4.66 billion people, equivalent to some 60% of the global population, were 'connected' via the internet; of these individuals, 97% owned a smartphone, 64% a laptop or desktop PC, and 34% a tablet [5].
