Can we take cost out of technology scaling?

At The ConFab, IBM's Gary Patton spoke about the future of scaling and concluded that we will continue scaling through new technology innovation, but we have to figure out how to drive the cost out.

There is much talk these days about continued scaling, including some recent posts by my colleague Ed Korczynski, in “Moore’s Law is Dead” Part 1 (What?) and Part 2 (When?). At The ConFab in June, keynote speaker Dr. Gary Patton, vice president of IBM’s semiconductor research and development center, talked about scaling and added some historical perspective. I previously blogged about the “three fundamental shifts” that Patton believes will lead to a bright future for the semiconductor industry.

“We will keep scaling,” he said. “We have shown a tremendous ability to innovate and keep moving that technology forward.”

Patton noted that in the 1990s, life was actually pretty simple. “You brought in a new lithography tool, you scaled the horizontal dimensions, you scaled the vertical dimensions and you got a new technology out. It was better performance, the same power density, and you could do a lot more on the chip,” he said.

Around 2000, we hit the gate oxide limit. “Gate oxide got to be about three atomic layers. We could have said at that point ‘game over, scaling has ended.’ But guess what, we innovated. We came up with a pretty fundamental shift in ideas, which is: let’s change the fundamental properties of silicon. If we can strain the silicon, we can enhance the mobility. We can change the gate oxide. We can enhance the coupling between the gate and the channel. And that’s what we did over the last decade. We said let’s go from SRAM to a very high performance eDRAM (embedded DRAM) so we can put a lot more memory next to the processor, because we knew memory was a key gating factor for the processor speed. This enabled the personal computing era and smart consumer electronics,” Patton said.

In 2010, we were at another one of these inflection points. “It’s not surprising that the improvements in 20nm are less than people would like because we really reached the end of the planar device era. Again, we were saying ‘game over, we’re done scaling.’ But no, we continue to innovate. The next decade is really about 3D. 3D devices, finFETs, or 3D chip integration,” he added.

Patton said that design technology co-optimization will be a key piece of getting through the next decade. “That will probably take us to about 2020,” he said. At that point we’re going to “hit the atomic dimension limit and we’re going to have to do it all over again. Here, we’re going to get into nanotechnology. Nanowire devices, silicon nanowires, carbon nanotubes, photonics and multi-chip stacking to bring things together. That will enable wearable computing, everywhere connectivity and cognitive computing.”

Patton said the problem is not physics. “We’re going to have solved the physics problems,” he said. “The problem is financial.” Patton showed a chart (below) that depicted the history of our industry from 1980 to the present. “What drove the industry was smaller features, which enabled better performance and better cost per function. It enabled new types of applications, and that enabled larger markets. If you look in this time period, there’s about a six-order-of-magnitude improvement in cost per transistor, and that enabled a seven-order-of-magnitude increase in consumption of silicon transistors,” he said.

The challenge we’re facing right now is depicted below, showing the reduction in the compound growth rate of the cost per circuit. On the x-axis is linear scaling. “We’ve typically targeted about a 0.7X linear scaling, which means from an area perspective, you get about a 50% improvement. Note that the line doesn’t actually go through a 50% improvement, because with each new technology there is some increase in complexity. It might be more like a 30% improvement at the die level. If we’re really good and provide some enhancements in the technology, self-aligned processes, things like that, we may get it to 40%. So 30-40% is about the range we’ve been getting in terms of cost-per-die improvement as we scale.

“The challenge we’re facing now is twofold. Number one, we’re struggling to get that 0.7X linear scaling. It might be about 0.8X. And we’re adding a lot more complexity, especially when you’re adding double and triple patterning. The focus in innovation today has got to be heavily on ‘how do we drive cost?’ Not just how do you scale, because scaling would add a lot of extra cost at this point. How do we drive cost down? How do we keep adding value to the technology? The model is changing. Moore’s Law can still hold, but we have to focus on the cost equation. So there’s really two parts. Technology innovation, which is focusing on the patterning, focusing on the materials, the processing, and how do we drive that to take cost out of the technology scaling,” Patton said.
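Patton’s die-cost arithmetic is easy to check for yourself. Here is a minimal sketch, with wafer-cost increases that are assumed purely for illustration (they are not figures from Patton’s chart), of how a linear shrink plus added process complexity nets out at the die level:

```python
# Minimal sketch of the die-cost arithmetic behind Patton's argument.
# The wafer-cost increases below are illustrative assumptions,
# not numbers from Patton's chart.

def die_cost_ratio(linear_scale, wafer_cost_increase):
    """Relative cost of the same function after a shrink.

    linear_scale: 0.7 means a 0.7X linear shrink
    wafer_cost_increase: fractional rise in processed-wafer cost
        from added complexity (0.3 means +30%)
    """
    area_scale = linear_scale ** 2  # 0.7X linear -> ~0.49X area
    return area_scale * (1 + wafer_cost_increase)

for scale in (0.7, 0.8):
    for extra_cost in (0.0, 0.3):
        r = die_cost_ratio(scale, extra_cost)
        print(f"{scale}X shrink, +{extra_cost:.0%} wafer cost: "
              f"{1 - r:.0%} cost-per-function improvement")
```

With these assumed numbers, a 0.7X shrink with a 30% wafer-cost increase lands at roughly a 36% improvement, squarely in the 30-40% range Patton cites. But slip to 0.8X scaling with the same added complexity and the improvement drops to roughly 17%, which is exactly the squeeze he describes.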
