Q&A: Neil Thompson on Computing and Innovation | MIT News

Moore’s Law is the famous prediction by Intel co-founder Gordon Moore that the number of transistors on a microchip would double every year or two. This prediction has largely held, or been surpassed, since the 1970s: computing power doubles roughly every two years, while better, faster microchips get cheaper.

This rapid growth in computing power has fueled innovation for decades, but in the early 21st century, researchers started ringing alarm bells that Moore’s Law was slowing down. With standard silicon technology, there are physical limits to how tiny transistors can become and how many can be squeezed onto an affordable microchip.

Neil Thompson, an MIT researcher at the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Sloan School of Management, and his research team wanted to quantify the importance of more powerful computers for improving outcomes in society. In a new working paper, they analyzed five areas where computation is critical, including weather forecasting, oil exploration and protein folding (important for drug discovery). The working paper is co-authored by research assistants Gabriel F. Manso and Shuning Ge.

They found that between 49 and 94 percent of the improvements in these areas can be explained by computing power. In weather forecasting, for example, increasing computing power by a factor of 10 improves three-day forecasts by a third of a degree.

But progress in computing is slowing, which could have far-reaching consequences for the economy and society. Thompson spoke to MIT News about this research and the implications of the end of Moore’s Law.

Q: How did you approach this analysis and quantify the impact of computing across domains?

A: Quantifying the impact of computing on real outcomes is difficult. The most common way to study computing power, and IT progress in general, is to look at how much companies spend on it and compare that to outcomes. But spending is a tricky metric because it only partially reflects the value of the computing power purchased. For example, today’s computer chip may cost the same as last year’s, but it is also much more powerful. Economists try to adjust for that change in quality, but it is hard to pin down exactly what that adjustment should be. For our project, we measured computing power more directly, for instance by looking at the capabilities of the systems used when protein folding was first done with deep learning. By looking at capabilities directly, we get more accurate measurements and thus better estimates of how computing power affects performance.

Q: How do more powerful computers improve weather forecasting, oil exploration and protein folding?

A: The short answer is that increases in computing power have had a huge effect on these areas. In weather forecasting, we found a trillion-fold increase in the amount of computing power used for these models. That puts into perspective how much computing power has grown, and also how we have put it to use. This isn’t a matter of taking an old program and running it on a faster computer; users have to constantly redesign their algorithms to take advantage of 10 or 100 times more computing power. It still takes a lot of human ingenuity to improve performance, but what our results show is that much of that ingenuity is focused on harnessing ever more powerful computing engines.
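The two numbers in this interview, a roughly trillion-fold increase in compute and about a third of a degree of three-day-forecast improvement per factor of 10, can be combined in a simple log-linear sketch. The constant and functional form below are illustrative simplifications of the study’s reported rule of thumb, not its actual model:

```python
import math

# Rule of thumb reported in the interview: each factor-of-10 increase in
# computing power improves three-day forecasts by about a third of a degree.
DEGREES_PER_FACTOR_OF_TEN = 1.0 / 3.0  # assumed constant, for illustration only

def forecast_improvement(compute_ratio: float) -> float:
    """Estimated accuracy gain (degrees) from multiplying compute by
    compute_ratio, assuming the log-linear rule of thumb above."""
    return DEGREES_PER_FACTOR_OF_TEN * math.log10(compute_ratio)

# A trillion-fold compute increase is 12 factors of ten,
# so roughly 4 degrees of three-day-forecast improvement.
gain = forecast_improvement(1e12)
```

Under this sketch, exponential compute growth translates into steady, linear gains in forecast accuracy, which is one way to read why sustained Moore’s Law growth mattered so much.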

Oil exploration is an interesting case because it gets harder over time: the easy wells are drilled first, so what remains is more difficult to find. Oil companies fight that trend with some of the largest supercomputers in the world, using them to interpret seismic data and map underground geology. This helps them drill in exactly the right place.

Using computing to better fold proteins has long been a goal, as it is crucial to understanding the three-dimensional shapes of these molecules, which in turn determine how they interact with other molecules. The AlphaFold systems have achieved remarkable breakthroughs in this area in recent years. What our analysis shows is that these improvements are well predicted by the massive increase in computing power they use.

Q: What were some of the biggest challenges in conducting this analysis?

A: When you look at two trends that both grow over time, in this case performance and computing power, one of the main challenges is untangling how much of the relationship is causal and how much is mere correlation. We can answer that question in part because, in the areas we studied, companies are investing huge amounts of money, so they do a lot of testing. In weather modeling, for example, they don’t just spend tens of millions of dollars on new machines and hope they work. They run an evaluation and find that running a model for twice as long improves its performance. Then they buy a system powerful enough to do that calculation in a shorter amount of time so they can use it operationally. That gives us a lot of confidence. But there are also other ways we can see causality. For example, NOAA (the National Oceanic and Atmospheric Administration) made some big leaps in the computing power it uses for weather forecasting, and each time it installed a bigger computer all at once, performance really jumped.

Q: Would these advances have been possible without exponential increases in computing power?

A: That is a difficult question, because there are many different inputs: human capital, traditional capital, and computing power, and all three change over time. You might say that if you have a trillion times as much computing power, that surely has the greatest effect. That’s a good intuition, but you also have to account for diminishing marginal returns. For example, going from no computer to one computer is a huge change. But going from 100 computers to 101 adds much less. So there are two competing forces: a large increase in computing on one hand, diminishing marginal benefits on the other. Our research shows that while we already have enormous amounts of computing power, it is growing so fast that it explains much of the performance improvement in these areas.
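The two competing forces described above can be sketched with a toy model in which performance grows with the logarithm of compute. This is a common stylized assumption, not the paper’s estimated relationship:

```python
import math

# Toy model: performance grows with log10 of the number of computers,
# a stylized way to capture diminishing marginal returns.
# Illustrative assumption only; the study's functional forms differ.
def performance(computers: float) -> float:
    return math.log10(computers)

# Adding one more computer matters much less once you have many:
early_gain = performance(2) - performance(1)      # first doubling: ~0.30
late_gain = performance(101) - performance(100)   # 100 -> 101: ~0.004

# Yet exponential growth outruns the diminishing returns: a trillion-fold
# increase still moves performance by 12 full factors of ten.
trillion_gain = performance(1e12) - performance(1)
```

The point of the sketch is that even with sharply diminishing returns per machine, exponential growth in total compute delivers steady gains, which is consistent with compute explaining much of the measured improvement.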

Q: What are the implications of the slowing of Moore’s Law?

A: The consequences are quite worrying. As computing gets better, it improves weather forecasting and the other areas we studied, but it also improves countless other areas we haven’t measured that are nonetheless critical parts of our economy and society. When that engine of improvement slows down, all those downstream effects slow down as well.

Some may disagree, arguing that there are many ways to innovate: if one path slows down, others will compensate. On some level that is true. For example, we are already seeing increasing interest in designing specialized computer chips to compensate for the end of Moore’s Law. But the problem is the magnitude of these effects. The benefits of Moore’s Law have been so great that, in many application areas, other sources of innovation will not be able to compensate.
