What the Typical Rate of Improvement in Various Technologies Means for the Future—Christopher Mims

Christopher Mims is one of my favorite technology writers. A few weeks ago he reported on a fascinating paper by Anuraag Singh, Giorgio Triulzi and Christopher Magee, “Technological improvement rate predictions for all technologies: Use of patent data and an extended domain description.” (This paper is ungated.)

It turns out that different detailed areas of technology have had very different rates of technological progress in the recent past, which provide useful information for predicting their rates of progress in the future. Quoting from his September 18, 2021 Wall Street Journal article, “New Research Busts Popular Myths About Innovation,” here are some of the tidbits that Christopher Mims extracts from the paper and related research (bullets added to separate passages):

  • Robotics, for example, is improving at the rate of 18.5% a year, which sounds like a lot, except that the average rate of improvement for the more than 1,700 technologies the researchers studied is 19% a year.

  • … the MIT researchers have found through the patent literature that a principal driver of the steady shrinking of microchip circuitry has been improvements in laser technology.

  • In research yet to be published, Dr. Farmer and other members of his group compared the rates of improvement in solar photovoltaic technology and nuclear power, and found that while the cost per watt of solar power is now 0.1% what it was 70 years ago, the cost of nuclear power actually went up.

    “So if you’re talking about the future, it isn’t nuclear; and if you’re an investor, you should know that, and if you’re a student, becoming a nuclear engineer isn’t something I would recommend to anybody,” says Dr. Farmer.

The paper's method is to fit rates of productivity growth in 30 technological domains to details of patent data as the right-hand-side variables, and then extrapolate that function of patent data to many, many more technological domains. As Anuraag Singh, Giorgio Triulzi and Christopher Magee write:

As shown in Benson and Magee (2015b) and, more recently, by Triulzi et al. (2020), once a patent set for a technology domain has been identified, it is possible to estimate the yearly rate of performance improvement for that domain. In these two papers the authors tested several different patent-based measures as predictors of the yearly performance improvement rate for 30 different technologies for which observed performance time series were available. By far, the most accurate and reliable indicator is a measure of the centrality of a technology's patents in the overall US patent citation network, as shown in Triulzi et al. (2020). More precisely, technologies whose patents cite very central patents tend to also have faster improvement rates, possibly as a result of enjoying more spillovers from advances in other technologies and/or because of a wider use of fast improving technologies by other technologies, proxied by patent citations.

This should all be taken with a grain of salt, but it provides interesting predictions for the rate of progress in finely-sliced (“granular”) technological domains that will be testable in the future—say by choosing a random sample to do a detailed study of productivity for.
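To make the method concrete, here is a minimal sketch of the kind of regression-and-extrapolation exercise the paper describes. All numbers and the centrality measure are made up for illustration; this is not the authors' code or data.

```python
# Toy sketch of the paper's approach (hypothetical data, not the authors' code):
# 1) score each domain by the centrality of the patents it cites,
# 2) fit log(improvement rate) on centrality for domains with observed
#    performance series, 3) extrapolate to domains with patents but no data.
import math

# Hypothetical training data: (centrality score, observed annual improvement rate)
observed = [(0.2, 0.03), (0.4, 0.08), (0.6, 0.19), (0.8, 0.42)]

# Ordinary least squares for log(rate) = a + b * centrality
xs = [c for c, _ in observed]
ys = [math.log(r) for _, r in observed]
n = len(observed)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar

def predicted_rate(centrality):
    """Extrapolate an annual improvement rate for a domain with no observed series."""
    return math.exp(a + b * centrality)
```

Domains whose patents cite very central patents come out with faster predicted improvement rates, which is the qualitative finding quoted above.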

Some of the key tables in the paper give very interesting detail. First, how narrow the domains are is clear from this list of the ten predicted to have the highest rate of technological improvement:

These ten all seem in fairly closely related domains. The list of the twenty predicted to have the slowest rate of technological improvement is more variegated:

The table that gives the most comprehensive sense is that for the 50 biggest domains. Unfortunately, it is ordered by size of domain rather than by growth rate, but there is a lot of useful information in it:

The claim that rates of progress in particular domains are fairly constant raises the issue of why there are noticeable technology shocks at the macro level. Here is how I see things:

  • Technologies improve in narrow domains at different rates; several narrow domains can often be reasonably close substitutes.

  • At some point one technology overtakes a status-quo technology by enough that the insurgent technology goes through an S-shaped logistic curve to widespread adoption.

  • Because of transition costs, the insurgent technology has to be significantly better at that point.

  • The technology shock seen in aggregate data corresponds to the steep part of the S-curve, when adoptions are happening fast (or rather, completions of the transition due to adoptions are happening fast).

  • The exact time when an insurgent technology will overtake a status-quo technology could be predicted much better than it is by macroeconomists—after all, there is warning in the early-adoption part of the S-curve before the steep part of the S-curve.

  • Though there is little doubt that the method will be debated, papers like this are a start toward what we need in order to better predict macroeconomic technology shocks.
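The overtaking logic in the bullets above can be put in back-of-the-envelope form. All parameters here are made up for illustration: two technologies improving exponentially at different rates, a transition-cost margin the insurgent must clear, and a logistic adoption curve whose steep middle is the aggregate "technology shock."

```python
# Illustrative sketch with made-up parameters: an insurgent technology
# improving at 19%/year versus a status-quo technology improving at 3%/year.
# Because of transition costs, switching only pays once the insurgent is
# better by a margin; adoption then follows an S-shaped logistic curve.
import math

def performance(initial, rate, t):
    """Performance level after t years of exponential improvement."""
    return initial * (1 + rate) ** t

STATUS_QUO = dict(initial=10.0, rate=0.03)
INSURGENT = dict(initial=1.0, rate=0.19)
TRANSITION_COST_MARGIN = 1.5  # insurgent must be 50% better to justify switching

# First year in which the insurgent clears the transition-cost hurdle
t_star = next(t for t in range(100)
              if performance(**INSURGENT, t=t)
              > TRANSITION_COST_MARGIN * performance(**STATUS_QUO, t=t))

def adoption_share(t, midpoint, steepness=0.8):
    """Logistic adoption curve: slow early uptake, a steep middle, saturation."""
    return 1 / (1 + math.exp(-steepness * (t - midpoint)))

# The macro-level 'technology shock' is the steep middle of this curve;
# the slow early-adoption tail is the advance warning noted above.
shares = [adoption_share(t, midpoint=t_star + 5) for t in range(t_star + 20)]
```

The early, nearly flat part of `shares` is the warning period a macroeconomist could in principle exploit before the steep part of the S-curve arrives.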

Christopher Mims’s Wall Street Journal article has a hint of the kind of thing I am arguing. Christopher writes:

Bill Buxton, a researcher at Microsoft Research and one of the creators of the interface on which modern touch computing is based, articulated in 2008 a theory that distills some of the insights of this research into a simple concept. He calls it the “long nose of innovation,” and it describes a graph plotting the rate of improvement, and often adoption, of a technology: a period of apparently negligible gains, followed by exponential growth.