image courtesy of stable diffusion (image prompt: “history of computing as a story of time and space in the style of dali”)
We were asked to write the ‘computing and cloud computing’ chapter of an upcoming OUP book on FinTech law, and I was asked to write the ‘history of computing’ section (with a 2,000 word limit!).
Here’s what I put together. It’s not perfect, but I hope it helps someone orientate themselves.
History of computing
The modern information age began on 15 November 1971, when Intel – the company Andy Grove would later steer as chief executive – announced the ‘4004’ general purpose microprocessor. Six years earlier, Intel co-founder Gordon Moore had put forth a ‘law’ that the density of transistors on an integrated circuit doubles every year. This timeframe was later amended to every two years.
There are three remarkable features of Moore’s Law. The first is that it has held true for 50 years, from 2,300 transistors on the ‘4004’ in 1971 through to tens of billions in a modern CPU. The second is that the rate of increase is exponential, compounding year after year. The third is that it is not at all a ‘law’ – there is nothing written in the stars that processing technology must progress at this rate. Moore’s ‘Law’ is a prediction and a target, the culmination of marginal gains of human ingenuity and insight arising from the dogged pursuit of advancing calculation power; a testament to human determination and target fixation, which has brought humanity millions-fold improvements in computational power.
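To make the compounding concrete, here is a short back-of-the-envelope calculation (a sketch only, taking the commonly quoted figure of 2,300 transistors in 1971 and the amended two-year doubling period at face value):

```python
# Back-of-the-envelope Moore's Law arithmetic: a doubling every two years
# from the 4004's roughly 2,300 transistors in 1971.
start_year, start_transistors = 1971, 2_300
doubling_period_years = 2

for year in (1981, 1991, 2001, 2011, 2021):
    doublings = (year - start_year) / doubling_period_years
    transistors = start_transistors * 2 ** doublings
    print(f"{year}: ~{transistors:,.0f} transistors")

# By 2021 this gives roughly 77 billion - a ~33-million-fold increase over
# 50 years, the 'millions-fold' improvement referred to above.
```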
The rest of this section will address what humans have done with this dividend of processing power. There are three main themes: two of them technological, and one a business model. The first theme encompasses technologies that allow the aggregation of processing between different locations – be it network connectivity, the internet, multi-core computing, or parallel processing. These are all ways for groups of processors to compute more than they could individually. The second theme is increasing levels of abstraction: more and more layers of software and frameworks, taking the user and developer further and further from the transistors and memory blocks, allowing them to stand on the shoulders of giants and achieve more with less. These advancements include operating systems (in particular Linux), programming languages, database products, APIs and open source software. The third theme is the business model of renting computing power only when it is needed (‘cloud computing’) and the swift expansion of that service from pure processing-as-a-service into everything-as-a-service – with all the technology advancements set out above available on a ‘pay-per-second’ basis, with no set-up costs and fees accruing only while you are using it.
1.1. The classic history
In general, two simultaneous changes followed from the increased processing power: putting more transistors into the same large data centres, and putting transistors into smaller and more portable packages.
Initially computers had a large capital cost, required specialist skills to operate, and had limited availability. They were only available in elite environments such as universities and research labs, and access was time-allocated – people would prepare their punch cards of processing instructions in advance and load them into the computer during their pre-booked slot.
This ‘mainframe’ era of computing saw enormous advances and improvements in processing power, complexity, availability, and stability, yet the ‘mainframe’ remained a destination – a place where humans brought themselves to the computer to do computing tasks. Computing was not available where consumers were, largely because useful computers were too large to be in the home.
The next era of modern computing was the ‘desktop’ era. This saw the cost, size, stability, and user interfaces of computers improve sufficiently, along with the standardisation of components, to put ‘a computer in every home’. Computing became available to business executives, researchers, affluent (and then almost all) western families, parents and children. Computers were used as tools and toys, mostly by individual users, but sometimes for small groups in one location. Computing was now available where consumers were located, but for most of this era there were not adequate network communications to allow regular and abundant interactions between computers.
The ‘mobile’ computing era’s totemic launch came with Steve Jobs’ iPhone announcement in January 2007. Jobs announced the bringing together into one package of the iPod, a cellular phone, and a computer, along with a touchscreen covering the full face of the phone. Each of these became hallmarks of the ‘mobile’ era of computing. This era saw the promise of a computer in everyone’s pocket achieved. The touchscreen served as both input and display, allowing it to mimic endless tools, games, websites and widgets. The more powerful development, however, was the subsequent release of an ‘App Store’ of sandboxed apps, allowing the low development cost, fast distribution, in-built charging model, and therefore the mass proliferation, of mobile apps.
1.2. Another perspective
‘Mainframe, desktop, mobile’ is the typical formulation of the computing eras of the modern information age, and it is important to understand. However, it is focussed entirely on where the computing takes place.
Another model is to consider the changes in when computing was done through these various stages. In the mainframe era, people performed ‘batch’ computing – preparing their materials, performing their computing, and then returning to their places of work. In the desktop era, people performed deliberate computing – they sat down at their computer to do computing tasks (which of course still happens today). In the mobile era, the real advancement provided by the ‘app’ model was not the flexible location of computing, but that networks had become fast and reliable enough, ‘mainframes’ (now data centres and cloud services, more on this below) cheap and available enough, and software stacks commoditised enough, for computing to become continuous: a constant stream of desktop and mobile inputs (with those edges of the network doing lightweight computing), working together with mainframe-based power-computing that interacts with other users, computers and systems, the whole forming a seamless stream of computing activity whether the user is engaged with an interface or not.
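For illustration only (a minimal sketch – the endpoint URL, payload and helper names below are hypothetical), the continuous-computing split looks something like a client doing lightweight work locally and delegating the heavier work to a data centre over the network:

```python
# Sketch of the edge/data-centre split: lightweight shaping of input locally,
# heavier processing delegated to a (hypothetical) cloud endpoint.
import requests  # widely used open source HTTP client


def lightweight_local_step(raw_text: str) -> dict:
    """Edge-side work: trivial validation and shaping of the user's input."""
    return {"query": raw_text.strip().lower()}


def delegate_to_cloud(payload: dict) -> dict:
    """Hand the heavy computation to a remote service over the network."""
    response = requests.post(
        "https://api.example.com/v1/analyse",  # placeholder URL
        json=payload,
        timeout=5,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    result = delegate_to_cloud(lightweight_local_step("  latest FX rates  "))
    print(result)
```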
While the desktop-to-mobile transition, in itself, needs little more than smaller processors, touchscreens, batteries and a suitable OS, the transition from deliberate to continuous computing has additional pre-requisites.
Continuous computing means the processing is split between various processing environments, running on various operating systems. This means that rapid data networks, secure communications, identification systems, and trusted computing environments are also needed. These are all complex problems, each with their own industry histories. By the time of widespread mobile computing, however, these elements were all either priced as a commodity (e.g. network connectivity) or carried no or minimal direct cost (e.g. public key encryption available through open source software, or access to the iOS App Store for a modest annual developer fee).
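As an illustration of how commoditised that cryptography has become, the following is a minimal sketch (not any particular provider’s implementation) using the open source Python ‘cryptography’ package to sign and verify a message – a public key operation that now carries effectively no direct cost:

```python
# Minimal public key signing/verification sketch using the open source
# 'cryptography' package (pip install cryptography).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Generate a key pair; in practice keys would be stored and managed securely.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"example payment instruction"  # illustrative payload only

# Sign with the private key...
signature = private_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# ...and verify with the corresponding public key (raises an exception if invalid).
public_key.verify(
    signature,
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print("signature verified")
```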
Another requirement is a central processing computer – a mainframe – which traditionally carried the large upfront cost of buying and operating such a machine to run the background processing operations. Data centre operators introduced a business model innovation of renting computing power on demand, which decoupled the large capital costs of owning a mainframe (hardware purchase and replacement, rent, physical security, power, telecoms, cooling, etc.) from the ability to use and access one. By providing computing power in a scalable manner, without up-front costs (infrastructure-as-a-service, “IaaS”), these suppliers managed in a few short years to abstract an entire industry into a set of APIs and commoditised functions, giving birth to ‘the cloud’.
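To give a flavour of that abstraction (a sketch only – the machine image identifier below is a placeholder, and it assumes AWS credentials are already configured), renting and releasing a server is now a couple of API calls rather than a capital project:

```python
# Sketch: renting a server by API call using the AWS SDK for Python (boto3).
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-2")

# 'Buy' a machine: one API call instead of a hardware procurement cycle.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image identifier
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")

# ...and hand it back just as easily, at which point the charges stop.
ec2.terminate_instances(InstanceIds=[instance_id])
```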
Further investment and development of this service allowed for other mainframe upfront costs to be avoided: setting up, maintaining, and supporting the operating systems; and ‘basic hygiene’ of a modern computing environment – allowing work on differentiating products and features to begin much earlier. These were the core developments that allowed for the operation of financial services in a ‘cloud’ environment.
Since those building blocks for financial services computing in the cloud environment were put in place (2006–2008), most of the changes have been incremental, but compounding. More and more computing operations, operating systems, software suites, databases, and cutting-edge analytics and processing such as AI are available ‘as-a-service’, in what is now sometimes dubbed ‘everything-as-a-service’ – meaning developers can work at higher and higher levels of abstraction away from the core infrastructure, doing more with less, and commoditising parts of software development and software operations.
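As one illustration of working at that higher level of abstraction (a minimal sketch of a serverless, ‘functions-as-a-service’ handler in the style of AWS Lambda; the event fields and fee logic are hypothetical), the developer writes only the business logic and rents everything beneath it per invocation:

```python
# Minimal serverless ('functions-as-a-service') handler sketch, in the style
# of an AWS Lambda Python function. The event fields shown are hypothetical.
import json


def handler(event, context):
    """Invoked per request; the platform supplies the servers, OS and scaling."""
    amount = float(event.get("amount", 0))
    currency = event.get("currency", "GBP")

    # The only code the developer maintains is the business logic itself.
    fee = round(amount * 0.01, 2)  # illustrative 1% fee calculation

    return {
        "statusCode": 200,
        "body": json.dumps({"amount": amount, "currency": currency, "fee": fee}),
    }
```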
The increased importance of cloud computing has placed greater focus on security and resilience against downtime, data loss, data manipulation, data leakage, errors, privilege escalations, etc. – leading to more rigorous security processes in cloud environments, simpler arrangements to implement those processes, and automated monitoring such as intrusion detection. As desktops, laptops, tablets and phones became more powerful, that local processing power was increasingly used to deliver rich user experiences, and in-built hardware security features (such as fingerprint and face scanning) were deployed to help authenticate users and securely communicate that authentication to the data centre. These are all iterative in core technology terms (as compared to the pre-requisites set out above) – but they were necessary steps for the affordable, mass-market deployment of many of the products and services we associate with ‘cloud-based financial services’.
1.3. What next?
The range and complexity of software services available via the cloud will continue to grow, as the main suppliers compete. For the purposes of ‘FinTech in the cloud’ we are likely to see iterative and incremental steps to improve the service.
There are two main disruptive threats to this cloud computing model. The first is truly distributed processing, in which financial services are established without a centralised organisation and must operate in a transparent way that removes any central authority over the software code, or over how it is run. This requires a different type of computing environment – a problem solved today by having redundant, duplicative computing and verification take place across many machines, with a consensus established as to the correct answer (e.g. via Ethereum) – though perhaps with other solutions in the future. It is, however, hard to see how a centralised cloud offering, as we currently understand it, could perform the role of a decentralised computing network in this way.
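As a small illustration of that model (a sketch assuming the open source web3.py library, version 6 or later, and a placeholder RPC endpoint), anyone can independently read the consensus-agreed state of the network rather than trusting a single operator:

```python
# Sketch: reading consensus-agreed state from an Ethereum node using the
# open source web3.py library (v6+). The RPC endpoint URL is a placeholder.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # placeholder endpoint

if w3.is_connected():
    block = w3.eth.get_block("latest")
    # Every participant can fetch and check the same block for themselves;
    # no single cloud operator decides what the 'correct answer' is.
    print(f"Latest block number: {block.number}")
    print(f"Block hash: {block.hash.hex()}")
```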
The second disruptive force is quantum computing which, if established, would offer a very significant increase in processing power for certain classes of problem, far beyond the pace of Moore’s Law. It holds out the possibility of an abundance of processing power that unlocks categorical changes in how cloud computing is priced and what it can be used for, but it would take time to develop, productise, and turn into a commoditised cloud-based offering, and for developers to understand and build on the new algorithm types and capabilities.
1.4. Summary
Computing over the last 50 years has become faster and smaller, but also cheaper and more commoditised. A mesh of technologies has changed the experience of computing from a serious academic pursuit to the background vibrations of a busy life. The builders of these technologies have become the largest and most powerful companies in the world, yet even their growth is eclipsed by the advances in the technology itself. The culmination of this common platform and set of standards is that software, businesses, banks, messaging platforms, payment systems, and more can now be deployed globally, instantly, and securely, on entirely rented computing power.