Digital Economy Dispatch #104 -- Chip Wars and the Next Stage in the Digital Revolution

6th November 2022

In 1995 I moved from Pittsburgh to Dallas to join Texas Instruments. By then I had studied for a PhD in computer science, developed and delivered commercial software systems, taught computer science to undergraduates, and worked at one of the foremost software engineering research institutions in the world. I thought I had it covered. Yet for the first few months of this new role, I was totally and utterly lost.

Although I had been hired to lead a software research team building a new object-oriented database infrastructure at TI, I found that all new employees were required to take part in a 4-week induction process involving lectures, project work, and exams. It focused on the fundamentals of the computer stack: sand, silicon, and photolithography. Within hours, it became clear to me that I knew almost nothing about how computers worked.

Sure, I had a basic grounding in the general architecture of the computer. And with a good headwind I could bumble my way through the basics of circuit design, logic gates, and the layout of a CPU. But my knowledge of chip design, the murky world of how semiconductors are embedded in circuitry on silicon wafers, was severely lacking. I was entering a completely different set of conversations describing conductive materials, chemical properties, and precision manufacturing. It was clear that I was way outside my comfort zone.

On reflection, I now recognize that in those 4 weeks at TI I learned some of the most important lessons of my career. My eyes were opened to a world that sits far below the surface of what most people experience as digital transformation. It is now becoming clear that knowing more about these aspects of computing is essential to gain insight into the economic and political reality of the digital economy.

From Tiny Acorns

The rapid advance in the scale, performance, and price of computer hardware is perhaps the most important technological miracle of the last 50 years. Even seen in broad terms, the room-sized computers of the 1950s, capable of hundreds of calculations per second, have evolved within just a few decades into today’s pocket-sized machines: crammed with multiple processing units, performing billions of instructions per second, drawing minimal power, and available at a price that makes them affordable to almost everyone. It is the computing chips at their core that provide the computation engines driving this revolution.
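To put that trajectory in rough numbers, here is a minimal back-of-the-envelope sketch. The throughput figures are illustrative assumptions rather than precise benchmarks, but they show the doubling rate such growth implies:

```python
import math

# Illustrative assumptions: ~100 calculations/second for an early-1950s
# machine versus ~10 billion instructions/second for a modern processor.
early_ops_per_sec = 1e2      # 1950s room-sized computer (rough figure)
modern_ops_per_sec = 1e10    # modern multi-core chip (rough figure)
years_elapsed = 70           # roughly the 1950s to the 2020s

growth_factor = modern_ops_per_sec / early_ops_per_sec
doublings = math.log2(growth_factor)
doubling_time = years_elapsed / doublings

print(f"Total growth factor: {growth_factor:.0e}")        # ~1e+08
print(f"Implied doublings:   {doublings:.1f}")            # ~26.6
print(f"Doubling every:      {doubling_time:.1f} years")  # ~2.6 years
```

Under these rough assumptions, performance has doubled roughly every two and a half years, close to the cadence popularly associated with Moore’s law.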

The numbers make your head spin. One of the latest commercial CPU offerings from Intel, the i9 desktop processor, contains up to 24 processing cores, runs at 4.5 GHz, and executes hundreds of billions of instructions per second. All for the price of a few hundred dollars. To achieve this, the chip contains billions of transistors. The components are crammed together so tightly that they must be manufactured to tolerances of less than 10 nanometers. For context, remember that a nanometer is 1/1,000,000,000 of a meter: a single human hair is about 100,000 nanometers thick, and a strand of human DNA is about 3 nanometers wide!
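A quick sketch makes those scale comparisons concrete. The core count, clock speed, and nanometer figures come from the paragraph above; the instructions-per-cycle value is an illustrative assumption, since it varies widely by workload:

```python
# Rough throughput estimate for a chip like the one described above.
cores = 24
clock_hz = 4.5e9             # 4.5 GHz
instructions_per_cycle = 2   # illustrative assumption; varies by workload

throughput = cores * clock_hz * instructions_per_cycle
print(f"~{throughput:.1e} instructions/second")  # ~2.2e+11, i.e. hundreds of billions

# Putting the nanometer scale in context.
hair_nm = 100_000   # a human hair is ~100,000 nm thick
feature_nm = 10     # manufacturing tolerances below 10 nm
print(f"Features fitting across one hair: ~{hair_nm // feature_nm:,}")  # ~10,000
```

Even with conservative assumptions, the arithmetic lands in the hundreds of billions of instructions per second, and some 10,000 chip features would fit across the width of a single hair.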

Chips with Everything

Why does this matter? The digital revolution is fundamentally grounded in the availability of cheap, high-quality computing capability. Without a ready supply of chips, it would be impossible, something we experienced only too recently. During 2020 and 2021, interruptions to supply due to the impact of Covid and extreme weather conditions led to problems across a very wide set of application domains. Delays and cancellations became commonplace as chip shortages caused car factory closures, laptop scarcities for remote office workers, and long supply chain delays for manufacturers of every kind of smart product, from toasters to thermostats.

One area of particular intensity for computing technology adoption is the massive compute and data storage warehouses that power the cloud. These are growing at a rapid rate: according to IBM, cloud computing revenues reached $219 billion in 2020, and analysts expect the industry to grow further, to $791 billion by 2028. Yet despite record spending on cloud technologies over the past 18 months, chip shortages remain a major threat to cloud providers’ ability to support the wide variety of new applications being developed. Add to this the rapid deployment of CPU-hungry Machine Learning (ML) and AI applications, and you have the basis for a major issue.
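For a sense of the growth rate those projections imply, here is a minimal sketch computing the compound annual growth rate from the figures quoted above:

```python
# Implied compound annual growth rate (CAGR) for cloud revenues,
# using the figures quoted above: $219B in 2020 to a projected $791B in 2028.
start_revenue = 219e9
end_revenue = 791e9
years = 2028 - 2020

cagr = (end_revenue / start_revenue) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 17% per year
```

A sustained growth rate of roughly 17% per year is what makes any constraint on chip supply so consequential for the cloud.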

Facing these challenges has brought an unexpected silver lining. Prompted by recent shortages and the high demand for chips to support “smart” applications, we are now asking harder questions about how the crucial hardware powering digital transformation is made, where it is manufactured, and who controls its supply chain. This thinking has led many to recognize that control of the manufacture and distribution of the latest chips has become a major political concern.

Chip Wars

As the world becomes more deeply engaged in the adoption of digital technologies, attention has turned to who produces the most advanced chips at the core of the systems on which we now depend: commerce, infrastructure, defence, government, and so on. Naturally, the superpowers of the USA and China are at the forefront of these concerns and have placed technology investments at the top of their agendas. So much so that President Xi Jinping recently declared that “Technological innovation has become the main battleground of the global playing field, and competition for tech dominance will grow unprecedentedly fierce.”

Such a focus on high-tech development in China has elicited a predictable reaction from the US. What began with restrictions on the transfer of chip technologies and know-how to key Chinese companies such as Huawei has escalated: the US has now introduced much broader export controls aimed at blocking Chinese companies from developing their own advanced systems. It is a tit-for-tat approach with severe consequences for digital transformation around the world. Unfortunately, such actions may have knock-on effects, disrupting digital technology development and slowing its deployment in many application domains. However, they are also driving investments in new chip production facilities to build nationally controlled supply chains.

That is the aim of the recent Creating Helpful Incentives to Produce Semiconductors (CHIPS) for America Act. Over $50 billion of investment is promised to reduce vulnerabilities in the US chip supply chain. Almost three quarters will go to new fabrication plants for commercial and military-grade chips. The rest is aimed at advancing the research and development needed to move the technology forward.

Many see this as an aggressive move by the US government aimed at regaining worldwide dominance in this critical area. Chris Miller takes the argument much further. In his book “Chip War”, he traces the history and key milestones of computer hardware development and concludes that the chip industry now determines both the structure of the global economy and the balance of geopolitical power. Consequently, the pace and position assumed by the US and China in their digital transformation journeys are more than a play to control the hardware powering key business domains such as media, advertising, aerospace, and retail. They are at the core of how we make sense of their relationships with countries such as South Korea, Taiwan, India, and Russia.

Miller sees these chip wars as a defining moment in all our digital futures. He makes it clear that the multi-billion-dollar struggle for semiconductor supremacy will intensify in an increasingly digitized world. It may well require a redefinition of the complex supply chains of hardware and software that form the backbone of all digital solutions, and it will undoubtedly be a significant lever in deciding who controls the future of digital transformation.

The Long and Winding Road

Digital transformation and the future of our digital society are much more than technological sideshows. They are at the heart of the current worldwide geopolitical struggle. Below the surface of discussions on the application of AI, Machine Learning, and Machine Intelligence, attention is now focused on the production capabilities and supply chains for advanced chips. These power the computing and storage devices performing the heavy lifting for smart applications in the cloud. Understanding more about their history, design, and capabilities will help us all make more sense of the next stage in the digital revolution.