In Search of Mission-Critical Semiconductor Quality

Demand for mission-critical semiconductor quality in broad-based consumer products is taxing leading IDMs and fabless companies, and the answer lies in leveraging existing manufacturing data.

For the semiconductor manufacturing market sector, the need to provide high-quality products has always been of paramount concern. However, in recent years, the quality demands on broad-market semiconductors used in many “smart” consumer electronics and high-end automobiles have begun to approach the levels of mission-critical ICs found in the medical and aerospace market sectors.

As these consumer-focused products become more content-heavy to enable a vast array of applications and capabilities, the semiconductor technology incorporated into them is being held to increasingly higher reliability and quality metrics.

Many leading semiconductor companies are meeting these intensifying quality challenges by efficiently and effectively leveraging the vast amounts of data generated during manufacturing test operations. While this data has existed for quite some time, the ability to benefit from it in very measurable ways has only recently become feasible and practical, creating a “ripple effect” across the entire spectrum of a semiconductor company's overall manufacturing and business operations.

By examining key aspects of manufacturing test operations, it is possible to see how improvement in specific areas can create a more far-reaching “ripple effect.” For example, commercially-available technology now exists to implement superior outlier detection algorithms that can significantly improve the overall product quality in shipped parts. These improved results can be achieved by establishing quality indices that are based upon the results from multiple test operations such as e-test/WAT, wafer sort, final test, and system-level test.

Using production-proven simulation and analysis engines, semiconductor companies can implement real-time manufacturing test decisions based upon the monitoring and implementation of any combination of manufacturing test parameters. Once these indices have been established, the resulting detection algorithms can be deployed across the company's global supply chain operations. It is important to note that the foregoing improvements can be realized without disrupting or replacing existing manufacturing or test operations.
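To make the quality-index idea concrete, here is a minimal sketch of one common outlier-detection building block: flagging parts that pass the hard test limits but sit far from the rest of the population, using a robust (median/MAD-based) z-score. The function names, threshold, and data layout are illustrative assumptions, not any vendor's actual API.

```python
# Illustrative sketch: flag statistical outliers among passing parts using a
# robust z-score (median / MAD), a common basis for part-average-style rules.
from statistics import median

def robust_z_scores(measurements):
    """Return a robust z-score for each measurement (median/MAD based)."""
    med = median(measurements)
    mad = median(abs(x - med) for x in measurements)
    if mad == 0:
        return [0.0] * len(measurements)
    # 0.6745 scales MAD so the score is comparable to a standard z-score
    return [0.6745 * (x - med) / mad for x in measurements]

def flag_outliers(measurements, threshold=3.5):
    """Indices of parts whose measurement is a statistical outlier,
    even though it may sit inside the hard pass/fail limits."""
    return [i for i, z in enumerate(robust_z_scores(measurements))
            if abs(z) > threshold]

# Example: one part passes the spec limits but sits far from the population.
readings = [1.01, 0.99, 1.02, 1.00, 0.98, 1.01, 1.35, 1.00]
print(flag_outliers(readings))  # -> [6]
```

In practice a quality index would combine scores like this across e-test/WAT, wafer sort, final test, and system-level test rather than looking at a single parameter in isolation.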

Beyond test operations, successfully collecting, analyzing, and leveraging manufacturing data can significantly improve other essential business operations including materials management, quality control, resource management, product design, and product delivery.

In today's business climate, where there is continual pressure to achieve greater manufacturing throughput while also improving quality, it is essential for semiconductor and electronics companies to examine their existing operations to see how they can be optimized to reach the next level of product quality and reliability.

10 comments on “In Search of Mission-Critical Semiconductor Quality”

  1. _hm
    February 26, 2014

Mission-critical semiconductor quality in consumer electronics: give us a few examples.

    Also, it is important to understand that mission-critical quality has many more aspects to it. A few-cent transistor built to mission-critical quality will cost you a few tens of dollars.


  2. Hailey Lynne McKeefry
    February 26, 2014

Speaking from a consumer viewpoint, increased reliability and performance are certainly on my list of must-haves for products I buy. There's so much information available to consumers that people buying electronics are doing more research, and product differentiation is becoming more important than ever before. Certainly this is going to get pushed back up the supply chain to the component makers.

  3. Hailey Lynne McKeefry
    February 26, 2014

@David, what do you recommend as a starting point for looking for quality improvements? Where is the biggest bang for the buck? Where would returns be more limited? Are there areas that are being more universally ignored? Are there questions that OEMs should be asking their partners in order to measure all this?

  4. davidpark
    February 26, 2014

Hailey, there is a lot of “low-hanging fruit” when it comes to looking for quality improvements, and much of it centers around getting access to actionable data in a much more timely fashion. A simple one is being able to identify a “tester freeze” as quickly as possible. In this situation, parts appear to be tested and are passing, but in reality they are not being tested at all. Without rapid, automated access to manufacturing data, a situation like this could persist for a lengthy period of time, allowing defective parts that were never actually tested to enter the supply chain.

    There are enterprise-level solutions that can watch over thousands of testers simultaneously and identify problems like this within minutes, enabling the appropriate people in the supply chain to take action and correct the problem, significantly reducing test escapes and, as a result, improving the DPPM rate of the devices being produced as well as the product quality of the systems these ICs go into.
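The freeze symptom described above can be sketched in a few lines: watch each tester's result stream for an implausibly long run of byte-identical passing measurements, which is a sign the instrument has stopped actually measuring. The record format and run-length threshold here are assumptions for illustration, not a real tester interface.

```python
# Illustrative sketch: detect a "tester freeze" by watching for a long run
# of identical passing results from one tester's result stream.

def detect_freeze(results, max_identical_run=25):
    """results: list of (passed, measurement) tuples in test order.
    Returns True if the tester appears frozen: a long run of passing
    parts that all report the exact same measurement value."""
    run = 0
    prev = None
    for passed, value in results:
        if passed and value == prev:
            run += 1
            if run >= max_identical_run:
                return True
        else:
            run = 0
        prev = value
    return False

# A healthy tester shows natural variation; a frozen one repeats itself.
healthy = [(True, 1.0 + 0.001 * i) for i in range(100)]
frozen = [(True, 1.000)] * 100
print(detect_freeze(healthy), detect_freeze(frozen))  # -> False True
```

An enterprise deployment would run a check like this continuously across every tester's data feed and alert operations staff within minutes of a run starting.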

  5. davidpark
    February 26, 2014

    The point I was trying to make was that general market consumers are demanding and expecting very high quality in the electronic products they purchase and use.

If you get a pacemaker installed, you don't have an expectation of having to reboot it every week. You expect it to work flawlessly. Same with your car: if it is -10° and snowing outside and you slam on the brakes to avoid hitting something, you expect the car to stop.

    There is a growing expectation among consumers to have that level of quality and robustness to occur with every electronic device they purchase. To keep customers happy and their brand strength strong, OEMs are putting significant pressure on their IC suppliers to greatly reduce their DPPM rates to levels that were historically the realm of mission-critical applications like automotive ICs so that the overall quality of the products they sell is as high as possible.

    It isn't just the cost of the RMA that is the problem for some of these OEMs, it is the hit to their brand reputation and what they are able to charge customers for future products. That is why semi companies are relentlessly going after higher/mission-critical semiconductor quality for even their consumer-oriented ICs.


  6. prabhakar_deosthali
    February 27, 2014

In my opinion, mission-critical semiconductor quality is achievable! But unless there are means and processes to measure the cost of quality and trade it off against the cost of component recalls, field repairs, and warranty support caused by bad quality, it may be difficult for the technical team to convince marketing and management that quality costs!

    So the customer should be ready to pay for the quality, and for that we should know the exact cost of the added quality and reliability.

  7. ahdand
    February 27, 2014

@David: Exactly. You do need to keep the quality up as well, since quality plays a major role in retaining customers. Just because the load is huge, you cannot bypass certain items with low-quality testing. The job of a tester should be to test each and every aspect that can occur after the product is deployed.

  8. davidpark
    February 27, 2014

@prabhakar, yes, I agree, the cost of implementing quality does need to be considered. In the example I provided (tester freezes), customers do have to weigh the cost of the solution (the tools) that enables them to catch those issues, but that is a fairly simple case that customers make every day regarding the purchase of new tools.

    The more challenging cost tradeoff is when you get into more sophisticated quality management such as leveraging RMA data to improve future product quality.

    When RMAs come in, typically there is a team that is determining the cause of the RMA. Assuming it was an actual field failure and the cause is determined, the question is: How do you avoid shipping these failure-prone parts in the future and what would it cost to catch them? (the question you posed).

    One way that customers are addressing the problem AND understanding the cost associated with it is by leveraging the test information collected in their manufacturing process (test DNA).

Manufacturing intelligence solutions now exist that enable customers to resimulate historical test runs (for known RMA devices) with different test rules in place to see if they could have captured a device that is known to have failed in the field. Those same “virtual” test runs will also show what the net impact on overall yield would have been because you have “tightened the net” on suspected parts.

As you stated, this is the kind of information that is essential to manufacturing operations: being able to address a quality/reliability problem (bin highly suspect parts) and understand the financial impact of solving that problem (in this case, reduced overall yield) so that an informed decision can be made.
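The "virtual" re-run idea from this thread can be sketched simply: replay historical test records under a tightened limit and report both which previously shipped devices would now be binned and the yield cost of the tighter rule. The device IDs, limits, and record layout below are invented for illustration.

```python
# Illustrative sketch of a "virtual" test re-run: replay historical test data
# under a tightened upper limit to see (a) which previously passing devices
# would now be binned and (b) the yield cost of the tighter rule.

def resimulate(history, new_upper_limit):
    """history: list of (device_id, measurement, passed_originally).
    Returns (caught_ids, yield_before, yield_after)."""
    caught = [dev for dev, m, passed in history
              if passed and m > new_upper_limit]
    total = len(history)
    passed_before = sum(1 for _, _, p in history if p)
    passed_after = passed_before - len(caught)
    return caught, passed_before / total, passed_after / total

# Hypothetical scenario: device "D7" shipped and later came back as an RMA.
# Would a tighter limit (2.8 instead of 3.0) have caught it, at what cost?
history = [("D1", 2.1, True), ("D2", 2.4, True), ("D3", 3.2, False),
           ("D4", 2.6, True), ("D5", 2.2, True), ("D6", 2.9, True),
           ("D7", 2.95, True), ("D8", 2.5, True)]
caught, y_before, y_after = resimulate(history, new_upper_limit=2.8)
print(caught)  # -> ['D6', 'D7']
print(y_before, y_after)  # -> 0.875 0.625
```

The trade-off is exactly the one discussed above: the tighter rule catches the RMA device (D7) but also bins a good-looking neighbor (D6), so yield drops, and that cost can now be quantified before the rule is deployed.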

  9. Himanshugupta
    February 28, 2014

Thanks, David, for the insightful article. Here you cover both semiconductor manufacturing and how to solve the quality problem with data. I am basically in both fields, as I am a semiconductor physicist and data scientist. Can you share some more insights about the unique problems the semiconductor industry is facing and where it needs to tweak existing manufacturing solutions to cater to its needs? Also, do you know if anyone has used data to solve any of these problems? I know that Lean and Six Sigma are popular techniques for production quality and performance, but I am talking more about tool breakdowns, faulty parts, etc.

  10. davidpark
    February 28, 2014

    @Himanshugupta, thanks for your comments. Staying in the context of this article, one of the new challenges confronting the semi industry is the emergence of 2.5D/3D devices and the impact those devices are having on the manufacturing process. Managing yield and quality for new process nodes every 2 years is already a challenging task. Multi-chip packages compound those challenges with an increasing number of data sources (die test data, inspection data, SLT data etc.) that need to be synchronized and analyzed as quickly as possible for operations teams to be able to make rapid decisions that will affect process quality and yield.

Companies with the ability to quickly turn terabytes of raw manufacturing data into “actionable data” that enables real-time checks on the stacking process, or an understanding of the interdependencies among the dice in an MCP, will have a significant competitive advantage in the marketplace. And to answer your question, there are many companies that have already implemented enterprise-wide test management solutions to take full advantage of their manufacturing data, and they are all seeing very strong ROI in their manufacturing operations.

