
Interview: Chris Miller, author of ‘Chip War’

With Chris Miller, Associate Professor of International History at The Fletcher School of Law and Diplomacy

While most commentators think of the microchip as the ‘new oil’, for Chris Miller it’s the defining technology in the balance of global military power. The winner of the 2022 Financial Times and McKinsey Business Book of the Year award explains why.

These days the popular press likes to make the point that computer chips are the ‘new oil’. In fact, so does the jacket blurb for Chris Miller’s new book. And yet, for the author of ‘Chip War’, it’s not a particularly helpful comparison. Not because it’s overblown, but because it doesn’t address the central vulnerability of the humble integrated circuit in the 21st century. While Miller states that modern economies “can’t run without either”, governments need to be aware of the potential tensions created by the industry’s lack of geopolitical dispersion: “A far greater proportion of chips comes out of Taiwan than oil does out of Saudi Arabia.”

Miller, who has just won the Financial Times and McKinsey Business Book of the Year award, is the beneficiary of a new trend in the way judges are thinking these days. A decade ago, the laurels were routinely scooped by authors writing on economics, banking, and the rise (or fall) of corporate giants. But then in 2015, Martin Ford took the gong for ‘Rise of the Robots’, and ever since the competition has been dominated by expert analyses of big data, Silicon Valley start-ups and cyber crime. That a book so explicitly about technology has come out on top bears witness to just how integrated into the modern global economy the integrated circuit is today. It’s also testament to Miller’s prowess as a storyteller on a theme the Financial Times describes as the “vicissitudes of the chip business, both in the US and in the Asian countries that dominate many parts of the supply chain for a technology that is more indispensable than oil”.

Part of the reason for writing ‘Chip War’ was simply that “we don’t think about the chips, and yet they created the modern world”. The reason we take them for granted, says the historian who specialises in Russian economics, is simple: “we rarely see them.” This doesn’t just hold true for the consumer, only vaguely aware that something microelectronic might be going on in their smartphone. “It’s true for people in the engineering space. If you’re a software engineer, your livelihood will be dependent on the computing power in these chips, but they only rarely get to see the hardware that’s running their software on this level.” It’s also true for Miller who, while writing ‘Chip War’, went from barely knowing what microprocessors were or how they worked, to an elevated state of wonder at their complexity.

Of all the aspects of the integrated circuit that get taken for granted, “the most striking is Moore’s Law” – a phrase, according to Miller, that everyone’s heard of but no-one outside computer engineering properly understands. “To say that we’ve gotten an exponential increase in the number of transistors per chip understates the transformative impact that Moore’s Law has had.” The idea comes into sharper focus, explains Miller, “when we think of other industrial applications in which we don’t have exponential growth, such as the speed at which aeroplanes fly. In most of the economy, exponentials don’t exist.”

Again, the existence of exponentials in the world of silicon “is something that we know somewhere in the back of our minds, but don’t really think about. It’s the most transformative force not just in technological, but also social and political terms over the past three-quarters of a century.”

To illustrate the point, we sidebar into a discussion of the mathematics underlying Moore’s Law. The potency of its repeated doubling of transistor counts every two years is highlighted in the ‘wheat and chessboard problem’ dating back 1,000 years, in which a single grain of wheat placed on the first square of a chessboard is doubled on each subsequent square. Although it is a problem that may be solved by simple arithmetic, the numbers are staggering. By the time you reach the last square, there are more than 18 quintillion grains of wheat on the board – more than 2,000 times the planet’s annual production.
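A quick back-of-the-envelope check of that figure (an illustrative aside, not part of the interview): summing the doublings across all 64 squares gives 2^64 - 1 grains, as the short Python sketch below confirms.

    # Wheat-and-chessboard arithmetic: one grain on the first square,
    # doubled on each of the remaining 63 squares, then totalled.
    total_grains = sum(2 ** square for square in range(64))  # closed form: 2**64 - 1
    print(f"{total_grains:,}")  # 18,446,744,073,709,551,615, i.e. just over 18 quintillion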

Back in the world of semiconductors, Miller reminds me “it was not that long ago” that we had the Micrologic chip with its two pairs of transistors. Six decades later, the A16 chip has in the order of 16 billion. This growth comes with a great story, says Miller, who recalls that his original intention was to write the story of the early Cold War arms race. With his background in Russian economics, the question Miller set out to answer was why so many countries, including the Soviet Union, could initially master key military technologies of the era such as atomic weapons and long-range delivery systems, “but couldn’t do as well in the next phase of precision strikes, such as what was seen in the Gulf War in the early 1990s. I thought it was an interesting puzzle because controlling atomic reactions was an extraordinary challenge in scientific terms, while developing rockets that could fly through space was equally challenging. The Soviets did a good job of both, but then failed horrendously at the next phase, which was miniaturising computing power.”
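Those two data points are enough to sanity-check the doubling cadence. The Python sketch below is an illustrative calculation rather than anything from the book, and it assumes the commonly cited figures of four transistors on Fairchild’s 1961 Micrologic flip-flop and roughly 16 billion on the A16.

    import math

    # Rough Moore's Law check using the (approximate) figures quoted above.
    micrologic_transistors = 4             # Fairchild Micrologic flip-flop, circa 1961
    a16_transistors = 16_000_000_000       # A16, 2022 (order-of-magnitude figure)

    doublings = math.log2(a16_transistors / micrologic_transistors)
    years = 2022 - 1961

    print(f"Doublings: {doublings:.1f}")                   # roughly 32
    print(f"Years per doubling: {years / doublings:.1f}")  # roughly 1.9, close to the two-year cadence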

This failure to scale down gave rise to a joke in the 1980s in which a Communist Party grandee goes to the leader of the Soviet Union and says: “Comrade, comrade, we have made the world’s largest microprocessor.” The more Miller probed into these historical discrepancies and paradoxes, the more he came to realise that “the defining technology of the military balance over the past three-quarters of a century was computing power. And as I got to learn more about electronic miniaturisation, I came to realise that it wasn’t just the military that had been transformed by it: it was all our lives.”

At this point Miller reflects on Russia’s apparent preference for legacy technology over advanced digitalisation in its current military engagement with Ukraine. “It doesn’t surprise me. That’s the way they’ve trained and what they’ve invested in for years. And for certain purposes, really simple munitions can work reasonably well,” especially if “you’ve got a lot of them and don’t have much concern about where they land”. While Russia’s current military tactics are “not surprising in terms of where the trend lines have been heading, what is surprising is that they haven’t upgraded their capabilities more substantially, or upgraded their strategies to take more advantage of technology. But the extent to which the Russians have been far behind in the production of computing capabilities helps to explain why they haven’t made that jump.”

According to Miller, the ascent of computing power over the past century has depended on “the materials associated with how we produce it”. In the early 20th century “the primary source of computing power was the human brain”. By the middle of the century, during the Second World War, “we had vacuum tubes which started producing a reasonable amount of computing power that rivalled brains in terms of quality of output”. Yet post-war developments in microelectronics “made it inevitable” that semiconductor materials would become “a critical source of artificial computing power in the second half of the 20th century and up to the present day. And that set us on a course of path-dependency in which almost all increases in computing power today come from our ability to fashion silicon.”

What this means, says Miller, is that we’ve been in an increasingly symbiotic relationship with the material for the past half-century, to the extent that “today we can’t envision life without it. This is because it’s the material beyond all others that we have the ability to manipulate” – an ability that led to the Information Age, interconnectivity and the Fourth Industrial Revolution.

‘When it comes to chips, globalisation isn’t very global’ – Chris Miller

It’s obvious when you think about it, says Miller, “but the industry relies on really complex supply chains to make the chips we use. Also, the smaller the chips get, the more complex manufacturing becomes. The tools needed to make chips are so sophisticated, expensive and complex that you can’t just turn out more of them at a moment’s notice. It takes years to develop them, which means that capacity increases move slowly.” When the industry comes under pressure due to unexpected demand for computers, smartphones, data centres and so on – the Covid era is an obvious example of what happens when demand outstrips supply – “it has a painful time catching up. Because there are only a small number of companies with capabilities to produce the relevant machine or software tools, ultra-pure materials and so on, the supply chain is not as flexible or resilient as we would like.”

Coming to terms with supply chain issues isn’t just relevant to economic forecasting. It’s vital for understanding political impact, says Miller, because “certain companies are in certain countries, which means that other countries have very little control of what happens in the supply chain”. We’ve all laboured under the impression that “globalisation is global, especially when it comes to chips, because almost everyone in the world comes into contact with semiconductors in some capacity over the course of their daily lives”.

“But the production of chips is decidedly not global,” he adds. “The reason for this is that you need specialisation if you are going to make them from scratch. Then there’s the factor of the huge economies of scale in the industry, which is why there is only a small number of mega-centres for chip production in existence. Then you need ecosystems such as Silicon Valley, Taiwan, Japan and Korea, in which universities can interact with manufacturers. You can only have ecosystems when there are localised rather than spread-out centres of production. Which is why only a few countries play any significant role in their production.”

When it comes to explaining ecosystem scarcity, Miller points the finger at Western governments who “have certainly over recent decades gotten it wrong. I think this has changed a lot over the past few years, which is why you see governments today re-focusing on the industry in a pretty serious way.” In contrast, “I wouldn’t say that the Korean, Taiwanese or Chinese governments have misunderstood the importance of the industry”. Part of this prioritisation is a reaction to the fact that what keeps the manufacturing industry awake at night is the threat of global chip shortages. Meanwhile on the political side of the debate there are two major factors influencing the direction of travel.

First is the growing concern that China might try to attack or blockade Taiwan. “Not that it’s likely to happen imminently. But the risk is there because China’s military is so much stronger today, while America’s position is weaker in relative terms. China’s stated goal of acquiring Taiwan, via peaceful or what the Chinese government calls ‘non-peaceful’ means, remains. If that were to happen, the effect on chip availability would be dramatic due to Taiwan playing such a critical role in supplying the world.”

Second, Miller says the US government “rightly believes that historically every country that has developed advanced computing capacities has deployed them towards its intelligence and military systems”. Historical examples include how in the Second World War the British learned to crack German codes at Bletchley Park, while the first major orders for semiconductors in the US were for its long-range missile and space programmes.

Miller goes on to explain that today’s defence planners, whether in Washington or Beijing, London or Moscow, “when they envisage the future of war, they’re looking more and more at semi-autonomous or autonomous systems which will be trained in vast data centres. This means that the ability to deploy the most advanced chips or AI algorithms is going to be critical in the training of electronic warfare systems. How to jam signals more effectively. How to fly missiles more accurately.”

For Miller, the idea that ‘globalisation is global’ for semiconductors misstates what is happening in the chip industry. This has “huge implications” for security of supply, while allowing governments “to politicise supply chains in a number of different ways”. One of the consequences of this imbalance is that China – which spends more on chip imports than it does on oil – is now pouring tens of billions of dollars into its catch-up plan to “acquire the world’s most important technology”. As ‘Chip War’ makes clear, what’s at stake is not only the West’s economic prosperity “based on its ability to keep making smartphones”, but also its military superiority.

This is important because “if you look at the past half-century, advances in military capability have primarily been driven by computing, sensing and communications capabilities”. In other words, advances in semiconductors. And that’s likely to remain the case for decades to come, “which means that control over chip-making is going to be a critical ingredient in the future balance of military power”. 

This interview is republished from E&T (Engineering and Technology).
