
Biden is worried about Chinese control of ‘strategic technology.’ But which technologies are strategic?

The administration’s report on “critical technologies” is due Friday.

June 3, 2021

Earlier this year, the Biden administration made a splash when it announced an executive order requiring the U.S. government to review the nation’s supply chains for critical technologies by June 4. The goal: Ensure that the United States wouldn’t run short of items essential for national security, like semiconductor manufacturing equipment and high-capacity batteries. Then, in March, the administration unveiled a $200 billion research and development plan investing in biotechnology, advanced computing and other strategic priorities.

All this suggests a shift from the Trump administration’s “decoupling” approach, in which the goal was to avoid depending on China for anything from steel to T-shirts. Instead, the Biden administration is pursuing what specialists call a “small yard, high fence” approach: securing U.S. interests in a limited and specific set of strategically important technologies.

But what makes some technologies “strategic” rather than others?

The answer affects not only where policymakers direct their attention but also how billions of dollars are allocated. If the yard is small, prioritizing some technologies will mean discounting or ignoring others. President Bill Clinton’s signature technology policy, the National Nanotechnology Initiative, directed billions into nanotechnology-related research, a choice some criticized by pointing out that the United States was investing almost twice as much in nanotechnology as in renewable energy. Now priorities have shifted: Renewable energy is in and nanotechnology is out.

In our research, we map out how U.S. thinking about strategic technology has changed over the decades — and offer a framework for reasoning about strategic assets.


What makes a technology ‘strategic’ has changed over time

Understandably, what counts as “strategic technology” changes along with global politics and technology. During the early years of the Cold War, the U.S. government tended to assume that technologies were strategic only if they could be useful in war. This assumption shaped U.S. efforts to restrict exports to the Soviet bloc, with rules targeting items of direct military significance and those that could be used to build military power.

However, in the 1980s, as technological competition with Japan intensified, U.S. policymakers adopted a different definition of strategic technologies. Strategic trade theorists argued that certain technologies, such as semiconductors and telecommunications, carried first-mover advantages: Early success in these technologies would compound into long-term advantage. Alarmed by Japan’s growing dominance in these strategic technologies, U.S. academic, government, and industrial groups jointly published dozens of lists of critical technologies during the late 1980s and early 1990s. This “critical technologies movement” focused on technologies central to U.S. economic competitiveness with Japan, expanding the definition of strategic technologies beyond direct military utility. For example, the Semiconductor Manufacturing Technology (Sematech) consortium, launched in 1987, dedicated hundreds of millions of dollars to revitalizing the U.S. semiconductor industry and served as a model for future government-industry partnerships to advance critical technologies.


The definitions are still changing

Technologies aren’t born strategic. How a technology is perceived depends on the international environment. So which technologies are likely to be considered strategic now?

The United States is neither in the Cold War, when traditional military concerns dominated, nor in a purely economic confrontation with a power like 1980s Japan, which posed no military threat. Rather, it is worried about China’s emergence as a “near-peer competitor” challenging U.S. economic, military and technological leadership. Unlike either the Soviet Union during the Cold War or Japan during the 1980s, China could overtake the United States both economically and militarily.

In this context, U.S. policymakers are likely to pay particular attention to general-purpose technologies that could offer both economic and military advantages, such as artificial intelligence. Indeed, the United States has set up an independent commission on artificial intelligence, the National Security Commission on Artificial Intelligence, which attends to concerns such as these. Nor is it a surprise that, according to senior Biden administration officials, the February executive order for supply chain reviews will focus on U.S. dependence on China in industries such as rare earth elements, which feed into critical commercial and military applications.

Another major change from previous eras of technological competition is the diffusion of digital technologies and globalization of data flows. In the late 1980s, Japan could dominate technologies like consumer electronics and semiconductor memory chips without gaining any influence over its competitors’ information flows. That has changed. As political scientists Henry Farrell and Abraham Newman have pointed out, technological developments have produced centralized data networks controlled by a powerful few. Amazon Web Services and other U.S.-based providers dominate cloud computing and store information on both Americans and non-U.S. citizens. As revealed by the Snowden leaks, the U.S. government exploited centralized information hubs like these for surveillance purposes.

China, too, recognizes the strategic value of technologies that enable global networks; it is cultivating domestic firms that can compete with the U.S. tech giants in domains such as cloud computing and e-commerce. For example, Alibaba Cloud, the only non-U.S. firm with a substantial share of the global cloud market, is now the market leader in Asia. That region could become a critical information hub and control node for the global economy, giving Chinese technology giants — and, potentially, the Chinese government — more power.

Why did France and Britain dispatch their navies to fight over fish?

And they may change again

As the Biden administration evaluates which technologies will be most strategic in the future, it will surely keep these considerations in mind. The administration may also take into account growing concerns about privacy, security, safety and ethics, as reflected in the European Union’s guidelines for “trustworthy AI.” Innovations in trustworthy digital systems, such as techniques to improve an AI system’s transparency and privacy-preserving cloud computing, could determine which firms lead the global AI and cloud computing markets.

In other words, what counts as “strategic technology” is likely to keep changing as policymakers adapt to new international conditions and emerging technologies. The Biden administration’s technology strategy, therefore, will continue to change as well.


Jeffrey Ding (@jjding99) is a PhD candidate in international relations at the University of Oxford, a pre-doctoral fellow at Stanford’s Center for International Security and Cooperation, and a researcher at the Centre for the Governance of AI at the University of Oxford’s Future of Humanity Institute.

Allan Dafoe (@AllanDafoe) is a senior research fellow, associate professor and director of the Centre for the Governance of AI at the University of Oxford’s Future of Humanity Institute.