Consistency Concerns Loom Larger as Chips Get Smaller

Earlier this year, a new generation of very small transistors was released, reducing board space by 50% and raising the hopes of mobile device manufacturers as they build ever more compact smartphones, wearable watches, and other e-gear requiring smaller, thinner form factors. The question now for chip manufacturers is how much further chip geometries can shrink without sacrificing consistency of performance.

“Variability is one of the biggest challenges when CMOS devices are scaled to meet the demand for portable electronics with increased functionality,” said Chris Hobbs, CMOS scaling program manager at SEMATECH in a website post. “The problem is that device sizes have been downscaled to the point where the electrical properties of individual devices are very sensitive to small changes in their material properties. The classic example of this is random doping fluctuation, where the position of individual doping atoms can cause nearby transistors on a chip to behave differently, even though they were designed identically.”

There are numerous contributors to variability in chip manufacture. In fabrication itself, engineered designs differ from their physical renderings because of process limitations such as lithographic distortions, thickness variations resulting from chemical mechanical polishing (CMP), and unevenness in film deposition.

Smaller chips?

As chip and transistor geometries continue to get smaller, transistor electrical properties are affected by random sources of variation, such as the roughness of a transistor's edges or the granularity in the crystal of the metal electrode that turns a transistor on or off. This produces variability because one transistor may end up being slower while its neighbor becomes speedier but also leaks more current. In other cases, atomic dopants that are added to a silicon channel to speed up the switching of a transistor while also decreasing energy consumption are much harder to regulate at the new atomic scales at which transistors are being built.
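The dopant problem is easy to see with a toy model. The sketch below (an illustration of the statistics, not any fab's actual model) samples Poisson-distributed dopant counts, which is how randomly placed atoms in a fixed channel volume are commonly modeled; the relative spread grows as 1/√N, so a transistor small enough to hold only a handful of dopant atoms varies far more than its larger predecessors:

```python
import random
import statistics

def dopant_counts(mean_dopants, trials=2000, seed=1):
    """Sample the number of dopant atoms landing in a channel.

    Random dopant placement in a fixed volume is well modeled by a
    Poisson distribution, whose relative spread (stdev / mean) is
    roughly 1 / sqrt(mean) -- large when the mean count is small.
    """
    rng = random.Random(seed)

    def poisson(lam):
        # Count unit-rate exponential inter-arrival times that fit
        # into an interval of length lam (stdlib-only Poisson draw).
        t, k = 0.0, 0
        while True:
            t += rng.expovariate(1.0)
            if t > lam:
                return k
            k += 1

    return [poisson(mean_dopants) for _ in range(trials)]

# Shrinking the channel means fewer dopant atoms on average,
# and a proportionally much larger device-to-device spread.
for mean in (1000, 100, 10):
    counts = dopant_counts(mean)
    rel_spread = statistics.stdev(counts) / statistics.mean(counts)
    print(f"mean dopants {mean:5d}: relative spread {rel_spread:.2f}")
```

The printed spreads track 1/√N: cutting the average dopant count by 100x makes the relative device-to-device variation roughly 10x worse, which is why identically designed neighboring transistors can behave differently.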

“The culprit is scaling,” says Miguel Miranda, a staff engineer at Qualcomm. “Chips have improved because their transistors and connecting wires have kept getting smaller, but now they're so small that random differences in the placement of an atom can have a big impact on electrical properties. Some batches vary so much that more than half will run 30% slower than intended or consume 10 times as much power as they should when on standby.”

What can be done about the challenge of shrinking geometries in chip design and manufacture? Miranda says manufacturers of high-end microprocessors like IBM, and foundries like the Taiwan Semiconductor Manufacturing Company, are adopting new design techniques that use statistical methods to project the tradeoffs between how fast chips will run and how many good chips a given batch is likely to yield. In other cases, fabs and foundries are testing the limits of electrical property stability in chips by deliberately fabricating test chips that are significantly out of spec.
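A minimal Monte Carlo sketch shows the shape of such a speed-versus-yield projection. The delay distribution, path count, and pass criterion below are illustrative assumptions, not any manufacturer's actual flow: each simulated chip passes if its slowest critical path still meets the timing target.

```python
import random

def yield_at_target(target_delay, sigma, paths=200, chips=500, seed=7):
    """Monte Carlo yield projection.

    Each chip has `paths` critical paths whose delays vary around a
    nominal value of 1.0 with standard deviation `sigma`; the chip
    passes only if its slowest path meets `target_delay`.
    """
    rng = random.Random(seed)
    good = 0
    for _ in range(chips):
        worst = max(rng.gauss(1.0, sigma) for _ in range(paths))
        if worst <= target_delay:
            good += 1
    return good / chips

# Tightening the speed target raises performance but cuts yield,
# and more process variability (larger sigma) steepens the tradeoff.
for sigma in (0.03, 0.06):
    for target in (1.10, 1.20, 1.30):
        y = yield_at_target(target, sigma)
        print(f"sigma={sigma:.2f} target={target:.2f} yield={y:.1%}")
```

Doubling the variability forces a designer to either relax the speed target or accept a sharply lower yield, which is precisely the tradeoff the statistical design tools are built to quantify.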

Another approach is to allow defective chips to be integrated (and used) in smartphones and other devices by moving away from the 100% error-free computing paradigm that is presently an industry standard.

“We recommend getting past the paradigm of 100% reliability operation in wireless communications circuits,” said Andreas Burg, an assistant professor at EPFL (École Polytechnique Fédérale de Lausanne), where he is the head of the Telecommunications Circuits Lab (TCL). “By exploiting algorithms and system-level fault tolerance, manufacturers could accept chips with a certain number of defects to maintain a good yield with significantly reduced power. We could imagine manufacturing 'hybrid' cheaper mobile phones that can switch from one mode to another, depending on what is required. To go even further, cheaper devices could be constituted only by defective chips, under the condition that the client agrees for his phone to be slightly slower,” Burg said on the school’s site.
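The yield benefit of tolerating a few defects can be sketched with a simple binomial model. The block count and per-block defect rate below are hypothetical numbers chosen for illustration; the point is how quickly yield recovers when a die no longer has to be perfect:

```python
from math import comb

def yield_with_tolerance(blocks, p_defect, max_defects):
    """Binomial yield model: probability that a die has at most
    `max_defects` defective blocks, assuming independent defects."""
    return sum(
        comb(blocks, k) * p_defect**k * (1 - p_defect)**(blocks - k)
        for k in range(max_defects + 1)
    )

N, p = 100, 0.02  # hypothetical: 100 blocks, 2% block defect rate
for tol in (0, 1, 2, 4):
    y = yield_with_tolerance(N, p, tol)
    print(f"tolerate {tol} defects: yield {y:.1%}")
```

Under these assumed numbers, the strict zero-defect policy discards most dies, while tolerating even a handful of defective blocks, as Burg's fault-tolerant approach would, salvages the large majority of them.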

Changing a commitment to 100% reliability takes some getting used to in a high-precision business like semiconductors, and even the best statistical projections of optimal chip speed versus chip failure and waste rates won't entirely hit their targets. These realities may well pave the way for new technologies and approaches that come from entirely different directions, without self-imposed limitations that factor in current restrictions in materials and manufacturing.

4 comments on “Consistency Concerns Loom Larger as Chips Get Smaller”

  1. _hm
    July 19, 2014

    Yes, these are new challenges. One way to mitigate is evolve optical processing. Quantum computing is also a new trend.

  2. Anand
    July 30, 2014

    What you are saying is that malfunction risks increase with reduction of size. But we are forgetting that with better technology, chip architectures will be changing too. This reduction of size is good as long as mobile manufacturing is concerned, but what about those places where chip size is essential to have a desired power output?

  3. Anand
    July 30, 2014

    @Rich: Or we can just think new things to reduce costs while maintaining product quality. We need more ideas for this, and this can be facilitated by engineers. We also need more designs, which can be facilitated by electronic architects. We also cannot compromise of quality, for that we need Quality Testing people.

  4. Mary E. Shacklett
    August 4, 2014

    It's a good point, because all of the talk about mobile and shrinking footprints does not answer all application needs.

    Manufacturers and designers know this, too. 
