
History offers a guide to winning our growing ‘chip war’ with China

By Chris Miller, Associate Professor of International History at The Fletcher School of Law and Diplomacy

Amid a semiconductor shortage that has caused hundreds of billions of dollars in economic damage, and an accelerating “chip war” between the United States and China over the future of semiconductor technology, the U.S. government is trying to boost support for the semiconductor industry through legislation like the recently passed CHIPS Act.

But for this effort to succeed, it needs to heed the lessons of history. Today’s effort to help chipmakers isn’t the first time governments have devoted money toward semiconductor development. And the history of earlier efforts indicates that focusing on funding scientific research and development, providing a market for speculative technology, and ensuring academics and start-ups have access to funds and manufacturing equipment to test new products will be far more effective than trying to support specific firms or technologies. These are the strategies that propelled the American chip industry in the past, whereas more heavy-handed interventions often produced disappointing results.

The chip industry emerged out of the Cold War arms race in the 1950s, as the Defense Department sought miniaturized computing power for missile guidance computers. Even when it was the biggest customer, however, the U.S. government struggled to predict where commercial and technological trends were headed. Many agencies were more optimistic about an alternative approach called “molecular electronics,” which quickly sputtered out, and missed the promise of the tiny integrated circuits that led to today’s chips. In 1959, just as small firms like Fairchild Semiconductor and Texas Instruments (TI) were fabricating the first integrated circuits, a U.S. military study surveyed 15 companies and research labs, including TI and Fairchild, but found no evidence that these two firms were on the brink of pioneering a new industry.

Though the industry’s most crucial early innovations all responded to defense demand, many occurred outside of government-funded programs. Neither of the two engineers credited with simultaneously inventing chips in the late 1950s while working at Fairchild and TI, for example, was conducting research on a government contract.

Government procurement helped not by dictating development of specific technologies but by setting priorities — miniaturizing computing power — and making clear that the government was willing to buy almost anything that addressed this need. The first two major orders for chips in the early 1960s were for guidance computers in the Apollo spacecraft and the Minuteman II missile. Unlike civilian customers, NASA and the Pentagon were willing to pay high prices for small-volume production runs, which sped development of the chip industry.

Crucially, NASA and the Pentagon also staged an open competition to procure integrated circuits — one which included both technological giants and start-ups such as Fairchild. The established electronics firms routinely performed worse than the start-ups, delivering chips behind schedule or not at all. The decision to guide the Apollo spacecraft to the moon using Fairchild’s integrated circuits — an untested product produced by an unknown firm — reflected how the government relied not on heavy-handed policy but rather on clear performance targets, market competition and a willingness to invest a vast budget to build more accurate rockets.

The chip industry grew beyond its start as a niche defense business primarily due to market forces. Start-ups like Fairchild were fixated on bringing their chips to consumer markets because they had no other way to grow. The military’s demand for guidance computers was finite, but consumers’ demand for computing power was already beginning to grow exponentially in the 1960s.

Fairchild founder Robert Noyce had started his career working on a defense contract at Philco — a major radio producer — during which time he concluded that military research contracts stifled the type of innovation needed to develop consumer products. Thus, though its Apollo Program contract helped get Fairchild off the ground, Noyce immediately tacked toward consumer markets. By 1968, 75 percent of chips sold went to produce civilian goods, from corporate computers to hearing aids.

Though chips were invented in the United States, by the late 1970s, Silicon Valley faced new competition from Japanese rivals, sparking calls for government help. Japanese firms like Toshiba and NEC had learned to produce memory chips as advanced as Silicon Valley’s, but with lower prices and far lower defect rates. One study found that Japanese chipmakers’ products averaged one-tenth as many defects as those of one big American firm.

As U.S. firms lost market share, many analysts credited Japan’s industrial policy for its success. American debate fixated on the Japanese government’s support for corporate research and development (R&D) efforts, like the VLSI Program, which pooled R&D funds from the government and several leading Japanese firms. The total spending on the VLSI program was small — about the same as the R&D budget of a major U.S. chipmaker like TI. Nevertheless, the program loomed large in U.S. thinking and eventually induced the U.S. government to set up a comparable government-backed research consortium called Sematech in 1987.

The government recruited Noyce, who had founded both Fairchild and Intel, to lead Sematech. He focused the organization on supporting U.S. semiconductor manufacturing equipment companies against Japanese rivals. Around half of Sematech’s budget during the late 1980s was directed toward the production of advanced lithography machinery, a crucial type of chipmaking tool that had been pioneered in the United States but by the late 1980s was mostly produced by three firms in Japan and the Netherlands. Noyce saw saving U.S. lithography firms as the primary metric by which Sematech would be judged.

Yet his efforts didn’t prevent the main U.S. lithography firms from going bankrupt or being bought out by foreign rivals: without effective business models and sales capabilities, no amount of government support could rescue them.

Sematech’s other efforts to boost the production of chipmaking tools had mixed results. For example, former executives at Applied Materials, the biggest semiconductor tool manufacturer, argue that Sematech had hardly any impact on the company’s business.

Sematech’s biggest success came in coordinating “road maps,” whereby major chipmakers, tool makers, chip design software firms and other companies that produced products needed to make chips could align their plans to ensure that each new generation of chipmaking technology had the tools and software needed for mass production. This reflected the types of government programs that had the greatest positive impact on the semiconductor industry: not heavy-handed industrial policy but programs marked by public-private partnership to identify technological challenges, followed by an agreement to let private firms find commercially viable ways to address them.

The Pentagon’s Defense Advanced Research Projects Agency (DARPA) offered another example of this approach. Rather than trying to help the commercial industry, DARPA projects provided an opportunity for new ideas to be turned into prototype products, tackling the technical challenges that all chip firms confronted. For example, in the late 1970s, DARPA identified that the increasing complexity of chips would soon make them impossible to design by hand. Having identified this bottleneck, it funded university research programs in automated design processes. Start-ups that spun out of these research programs eventually developed into the three companies that dominate chip design software today.

Similarly, DARPA also realized that the growing cost of chip fabrication was making it more difficult for academics to test new ideas, because the cost of each test chip was increasing. DARPA therefore supported a program to let researchers use commercial chipmaking facilities to fabricate chips, increasing the quantity of research and prototyping. These efforts helped ensure that, even as cost pressures and foreign government subsidies attracted new semiconductor manufacturing facilities offshore, the designs, software and machine tools needed to make chips are still largely produced in the United States.

As the federal and state governments pour funds into the chip industry anew, this history of industrial policy can serve as a guide for what would be most — and least — effective. Funding workforce development, basic science and pathways for turning ideas into prototypes are all policies that helped build the U.S. chip industry in the past. Heavier-handed efforts to rescue specific firms or to bet on specific types of commercial technology, by contrast, haven’t worked in the past, and won’t help America win the chip war today.

This piece is republished from The Washington Post.
