In remarks in 2018, Chinese leader Xi Jinping highlighted the potential of “disruptive technological innovation” to change history. Key advancements, Xi insisted, had remade the world. He listed the “mechanization” of the First Industrial Revolution, the “electrification” of the Second Industrial Revolution, and the “informatization” of the Third Industrial Revolution. Now, Xi said, breakthroughs in cutting-edge technologies such as artificial intelligence had brought the world to the cusp of a Fourth Industrial Revolution. Those who pioneered the new technologies would be the winners of the era to come.
In the following months, Chinese analysts and scholars expounded on Xi’s speech, unpacking the connection between technological disruption and geopolitics. One commentary in an official Chinese Communist Party publication detailed the consequences of past technological revolutions: “Britain seized the opportunity of the first industrial revolution and established a world-leading productivity advantage. . . . After the second industrial revolution, the United States seized the dominance of advanced productivity from Britain.” In his analysis of Xi’s remarks, Jin Canrong, an influential Chinese international relations scholar, argued that China has a better chance than the United States of triumphing in the competition over the Fourth Industrial Revolution.
Chinese analysts are not alone in thinking this way about technological innovation and power. U.S. policymakers also see a vital link. In his first press conference after taking office, President Joe Biden underscored the need to “own the future” as it relates to competition in emerging technologies, pledging that China’s goal to become “the most powerful country in the world” was “not going to happen on my watch.” In 2018, Congress set up the National Security Commission on Artificial Intelligence, a body that convened government officials, technology experts, and social scientists to study the implications of AI. Comparing AI’s possible impact to earlier innovations such as electricity, the commission’s final report warned that the United States would soon lose its technological leadership to China if it did not adequately prepare for the “AI revolution.”
In their obsession with winning the future, both Chinese and American leaders risk overlooking a key truth about technology and transformation. They worry about dominating critical technological innovations in new, fast-growing industries, believing that the global balance of economic power tips toward the states that pioneer the most important innovations. In this view, the United Kingdom in the nineteenth century became the world’s most productive economy because it was home to new advances that transformed its burgeoning textile industry, such as the spinning jenny.
But innovation only gets you so far. Without the humbler undertaking of diffusion—how innovations spread and are adopted—even the most extraordinary advances will not matter. A country’s ability to embrace technologies at scale is especially important for technologies such as electricity and AI, foundational advances that boost productivity only after many sectors of the economy begin to use them. A focus on the diffusion of technology points toward an alternative explanation for how technological revolutions change geopolitics: it matters less which country first introduces a major innovation and more which countries adopt and spread those innovations.
The United Kingdom’s rise in the wake of the First Industrial Revolution, which lasted from roughly 1780 to 1840, is often held up as the prime example of how technological breakthroughs can lead to geopolitical supremacy. Conventional accounts tend to attribute the country’s rise to its monopoly over innovation in cotton textiles and other leading sectors. British technological leadership, according to this view, sprang from the institutional capacity to nurture genius inventors.
But with improved data and methodologies, economic historians have challenged this prevailing narrative. They argue that the adoption of iron machinery across a wide range of economic activities proved more central to the United Kingdom’s economic rise than the pioneering of new technologies in textiles, for instance. Although its industrial rivals boasted superior systems of higher technical education for training expert scientists and engineers, the United Kingdom benefited from mechanics’ institutes, educational centers such as the Manchester College of Arts and Sciences, and other associations that expanded access to technical literacy and applied mechanics knowledge to a broader segment of society.
The diffusion of technology also defined how countries benefited from the Second Industrial Revolution, which began around 1870 and ended around 1914. The Second Industrial Revolution was spurred by advances in machine tools, which enabled the industrial production of interchangeable parts. During this period, the United States did not produce the world’s most sophisticated machinery, but it surpassed the United Kingdom in productivity by adapting machine tools across almost all branches of industry. In 1907, machine intensity (which measures the horsepower of installed machines per manufacturing worker) in the United States was more than double that of the United Kingdom and Germany. As in the earlier British example, education and public policy played a major role in securing the U.S. advantage. The United States had a wide pool of mechanical engineering expertise, supported by land-grant schools, technical institutes, and standardization efforts in screw threads and other machine components. These institutions broadened the base of expertise, creating more competent engineers rather than simply producing a narrow technical elite. Similar dynamics prevailed in chemical engineering, where U.S. institutions of higher education helped cultivate a common language and a professional community of chemical engineers who could help speed productivity in a wide range of industries, including ceramics, food processing, glass, metallurgy, and petroleum refining.
Who will lead the way in the Fourth Industrial Revolution? Preoccupied with monopolizing innovations, thinkers and policymakers in both the United States and China place undue emphasis on three points: how quickly AI and other emerging technologies will shape productivity growth; where the fundamental advances are first pioneered; and the expectation that a narrow range of industries will drive growth by harnessing new technologies. They neglect the real determining factor in this competition: a country’s capacity to diffuse AI advances across a wide range of industries, in a gradual process that will likely play out over decades.
When great-power competition over AI is reframed in this way, the United States seems well positioned to maintain its technological edge. American businesses have been much quicker than their Chinese counterparts to embrace other information and communication technologies, such as cloud computing, smart sensors, and key industrial software. On one influential index, China ranks 83rd in the world in terms of access to these technologies, trailing the United States by 67 places. When it comes to AI, China has only 29 universities that employ at least one researcher who has published at least one paper in a leading AI conference publication (a rough gauge for whether a university can train AI engineers); the United States is home to 159. The United States has also built tight links between academia and industry that help disseminate AI advances across the entire economy—far more than China has.
But instead of sustaining its advantages in the diffusion of AI, the United States is fixated on dominating innovation cycles in leading sectors. U.S. policymakers are engrossed in ensuring that cutting-edge innovations do not leak to China, whether by denying visas to Chinese graduate students in advanced technical fields or by imposing export controls on high-end chips for training large models. Previous industrial revolutions have demonstrated that no one country can monopolize foundational innovations, so it will not be feasible for the United States to cut China off from AI.
The United States should instead prioritize improving and sustaining the rate at which AI becomes embedded in a wide range of productive processes. Washington should focus on policies directed at broadening the talent pool, such as providing community colleges with greater backing to better train an AI-savvy workforce and fully implementing the CHIPS and Science Act’s workforce initiatives that expand training in STEM fields. Investments in applied technology centers, which bridge the gap between basic research operations and industrial needs by providing testing services and conducting applied R & D; dedicated field services, such as the Manufacturing Extension Partnership, which hosts experienced specialists who help businesses incorporate new technologies and diversify their markets; and other technology diffusion institutions can encourage the adoption of AI techniques by small and medium-sized enterprises.
To be clear, understanding the importance of diffusion does not preclude supporting the exciting research in a country’s leading labs and universities. Undoubtedly, more R & D spending and better facilities for elite scientists will also indirectly contribute to more widespread adoption of AI. All too often, however, increased R & D spending becomes the boilerplate recommendation for any strategic technology. AI demands a different toolkit.
When some of the leading thinkers of the era declare that the AI revolution will be more significant than earlier industrial revolutions, it is easy to get caught up in their excitement. Many people in every generation wind up believing that their lives coincide with a uniquely important period in history. But the present moment might not be so unprecedented. Previous industrial revolutions suggest that real success in the age of AI will come to those countries that best position their populations and industries to embrace new technologies—not simply invent them.