New analysis helps predict which new systems will be on a fast track to improvements in performance.
Some forms of technology — think, for example, of computer chips — are on a fast track to constant improvements, while others evolve much more slowly. Now, a new study by researchers at MIT and other institutions shows that it may be possible to predict which technologies are likeliest to advance rapidly, and therefore may be worth more investment in research and resources.
In a nutshell, the researchers found that the greater a technology’s complexity, the more slowly it changes and improves over time. They devised a way of mathematically modeling complexity, breaking a system down into its individual components and then mapping all the interconnections between these components.
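The paper itself develops a formal mathematical model; as a loose illustration only, the basic idea can be sketched in a few lines of code. Here a design is represented as a directed graph of components and the dependencies between them, where the component names and the "ripple" measure are invented for this sketch and are not drawn from the paper:

```python
# Toy illustration (not the authors' model): a technology as a set of
# components plus directed "affects" links between them.
from collections import defaultdict

def design_graph(links):
    """Build an adjacency map from (component, affected_component) pairs."""
    graph = defaultdict(set)
    for src, dst in links:
        graph[src].add(dst)
    return graph

# Hypothetical design: changing the 'cell' also forces rework of the
# 'wiring' and 'coating'; changing the 'wiring' affects the 'inverter'; etc.
links = [
    ("cell", "wiring"), ("cell", "coating"),
    ("wiring", "inverter"), ("coating", "frame"),
]
graph = design_graph(links)

# One crude complexity proxy: how many other components each change touches.
for comp, affected in graph.items():
    print(f"modifying {comp!r} also entails rework of {sorted(affected)}")
```

In this picture, a design with many components and dense interconnections forces each change to ripple widely, which is the intuition behind the finding that greater complexity slows improvement.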
“It gives you a way to think about how the structure of the technology affects the rate of improvement,” says Jessika Trancik, assistant professor of engineering systems at MIT and co-author of a paper explaining the findings. The paper’s lead author is James McNerney, a graduate student at Boston University (BU); other co-authors are Santa Fe Institute Professor Doyne Farmer and BU physics professor Sid Redner. It appears online this week in the Proceedings of the National Academy of Sciences.
The team was inspired by the complexity of energy-related technologies, ranging from tiny transistors to huge coal-fired power plants. The researchers have tracked how such technologies improve over time, whether through lower cost or better performance, and in this paper they develop a model that relates that progress to the complexity of a technology's design and the degree of connectivity among its components.
The authors say the approach they devised for comparing technologies could, for example, help policymakers mitigate climate change: By predicting which low-carbon technologies are likeliest to improve rapidly, their strategy could help identify the most promising areas in which to concentrate research funding. The analysis makes it possible to pick technologies "not just so they will work well today, but ones that will be subject to rapid development in the future," Trancik says.
Beyond the effect of overall design complexity in slowing the rate of improvement, the researchers found that certain patterns of interconnection can create bottlenecks, causing improvements to come in fits and starts rather than at a steady pace.
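To see why such bottlenecks produce uneven progress, consider a toy simulation, a rough sketch in the spirit of the paper's argument rather than a reproduction of its model; the dependency structure and acceptance rule below are assumptions made purely for illustration. Each improvement attempt redesigns one component plus everything it affects, and is accepted only if the combined cost goes down, so a heavily connected component rarely improves:

```python
# Toy simulation (an illustrative assumption, not the published model):
# each attempt redesigns one component and everything it affects, and is
# kept only if the group's total cost falls. Heavily connected components
# are then hard to improve, producing long plateaus in overall cost.
import random

random.seed(0)
affects = {  # hypothetical dependency structure
    "A": {"B", "C", "D"},  # A is a bottleneck: changes to it ripple widely
    "B": set(), "C": set(), "D": set(),
}
cost = {c: 1.0 for c in affects}

history = []
for attempt in range(2000):
    comp = random.choice(list(affects))
    group = {comp} | affects[comp]          # everything redesigned together
    trial = {c: random.random() for c in group}
    if sum(trial.values()) < sum(cost[c] for c in group):
        cost.update(trial)                  # accept only joint improvements
    history.append(sum(cost.values()))

print(f"final total cost: {history[-1]:.3f}")
# Long flat stretches in `history` reflect the bottleneck: a new draw
# must beat the combined cost of A and all three of its dependents at once.
```

In a run like this, the lightly connected components improve quickly and then total cost stalls, dropping only on the rare occasions when the bottleneck component and its dependents improve together, which is the fits-and-starts pattern the researchers describe.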
“In this paper, we develop a theory that shows why we see the rates of improvement that we see,” Trancik says. Now that they have developed the theory, she and her colleagues are moving on to do empirical analysis of many different technologies to gauge how effective the model is in practice. “We’re doing a lot of work on analyzing large data sets” on different products and processes, she says.
For now, she suggests, the method is most useful for comparing two different technologies “whose components are similar, but whose design complexity is different.” For example, the analysis could be used to compare different approaches to next-generation solar photovoltaic cells, she says. The method can also be applied to processes, such as improving the design of supply chains or infrastructure systems. “It can be applied at many different scales,” she says.
Ultimately, the kind of analysis developed in this paper could become part of the design process — allowing engineers to “design for rapid innovation,” Trancik says, by using these principles to determine “how you set up the architecture of your system.”