Thanks, David, for the elaborate reply. I have seen engineers using statistical software such as SAS, JMP, or R to build predictive and forecasting models, but I think the predictive models can still be much improved by taking cues from other industries such as oil and gas, energy, and automotive.
I also agree that testing and diagnosing chips is a real challenge, and if this can be done in real time then we can take corrective action to save the whole batch from going bad. Most of the time, it is very time-consuming to find the bad patterns or buried layers responsible for the failure of a chip.
Semiconductor data is genuinely big data, and I think the industry will warm up to the potential of big data techniques and invest in the resources.
@Himanshugupta, great questions that you are posing. I will try to answer them as best I can.
We are also seeing that management is sometimes not interested in investing in new data analysis applications, but that is generally because they have had little exposure to the benefits that truly global data mining can provide, especially for semiconductor manufacturing and test. Like many big ideas that involve change, it takes a while for people to embrace it, especially when they think that what they have in place is "pretty good".
It does take time to educate and inform people about what is possible to improve in such a mature area as semiconductor manufacturing. But it is happening, slowly but surely. We come up against NIH (not invented here) every day with manufacturing organizations, but when you can demonstrate real ROI and show very clear, specific, actionable information that enables manufacturing operations to significantly improve yield, quality, and throughput, people start to listen.
Most of what is done in data analytics is valuable across multiple industries; the industry-specific part is the type of data that is collected and the rules that identify the problem and a course of action. One of the easiest ways that big data analytics can help semiconductor manufacturing is to quickly identify trends. For example, if you have a test site that was yielding at 95% but in the last hour it is suddenly only yielding 85%, you know something is wrong and needs to be addressed. Fixing this as soon as possible can significantly improve recoverable yield. If you cannot detect this for hours or days, you have lost real yield. If you can find it in minutes, you are now helping to improve your company's revenue.
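The yield-drop trend check described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not a real production monitor: the 5% drop threshold and the one-hour window of pass/fail results are my own assumptions for the example.

```python
# Hypothetical sketch: flag a sudden yield drop at a test site.
# The max_drop threshold and the windowing are illustrative assumptions.

def yield_rate(results):
    """Fraction of passing parts; results is a list of booleans."""
    return sum(results) / len(results) if results else 0.0

def detect_yield_drop(baseline, recent_window, max_drop=0.05):
    """Alert when the recent window's yield falls more than max_drop
    below the baseline yield. Returns (alert, recent_yield)."""
    recent = yield_rate(recent_window)
    return (baseline - recent) > max_drop, recent

# Example matching the text: baseline 95%, last hour only 85% -> alert.
last_hour = [True] * 85 + [False] * 15   # 85 of 100 parts pass
alert, recent = detect_yield_drop(0.95, last_hour)
print(alert, recent)  # True 0.85
```

In practice the same comparison would run continuously against streaming test results, so the alert fires in minutes rather than after a manual end-of-shift review.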
On the quality side, one common issue is a tester freeze (for various reasons), where you can be passing off parts as "good" when they haven't actually been tested at all. This can introduce a lot of suspect parts into your supply chain. Again, being able to mine your global manufacturing data in real time can help you see this problem as soon as possible, so that you can fix it and avoid quality issues with your end customers.
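One hedged way to catch the tester-freeze scenario above: a frozen tester tends to report the same measurement for every part, so an unusually long run of bit-identical parametric readings is suspicious. The run-length threshold here is an assumption for illustration; real screens would be tuned per test.

```python
# Hypothetical sketch: flag a possible tester freeze by looking for a long
# run of identical parametric readings. The threshold of 20 consecutive
# identical values is an illustrative assumption.

def suspect_freeze(measurements, max_identical_run=20):
    """Return True if any value repeats max_identical_run or more
    times consecutively in the measurement stream."""
    run = 1
    for prev, cur in zip(measurements, measurements[1:]):
        run = run + 1 if cur == prev else 1
        if run >= max_identical_run:
            return True
    return False
```

Healthy parametric data almost always shows small part-to-part variation, so a perfectly flat stretch is a cheap, effective tripwire even before deeper analysis runs.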
In both cases, the source of the data and the "rules" that find both of these examples are industry-specific. However, the underlying data collection, normalization, validation and data analysis that is done is industry-neutral.
For the specific challenges that you are dealing with, we can apply generic big data analytics to help address some specific issues. One thing that is looked at during NPI is scan chain failures. There are many yield analysis tools/companies out there, but one of their challenges is finding real failure data to analyze. They are looking for a needle in a haystack. One application of data analytics is to monitor bad bins. If the bad bin associated with scan failures reaches a certain threshold, you can create "rules" that direct an automatic retest of those parts to capture full scan chain diagnostics, and then automatically pass that information off to the yield engineers to diagnose. So now those people are getting real information they can troubleshoot. They can spend their time fixing problems, not looking for them. This is just one example of how companies can turn their data into actionable information and create a source of manufacturing intelligence that can help many different parts of their company.
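The bad-bin rule described above could be sketched roughly as follows. The bin number assigned to scan-chain failures and the 2% alert threshold are hypothetical placeholders; real bin maps and thresholds are product-specific.

```python
# Hypothetical sketch of a "rule" that watches a bad bin tied to scan-chain
# failures and, past a threshold, nominates parts for automatic retest with
# full scan diagnostics. Bin number and threshold are assumed placeholders.

from collections import Counter

SCAN_FAIL_BIN = 7    # assumed bin number for scan-chain failures
THRESHOLD = 0.02     # assumed alert level: 2% of parts landing in this bin

def parts_to_retest(bin_results, bad_bin=SCAN_FAIL_BIN, threshold=THRESHOLD):
    """bin_results: list of (part_id, bin) tuples for a lot or time window.
    Returns the part ids to retest with full scan-chain diagnostics,
    or an empty list if the rule does not fire."""
    counts = Counter(b for _, b in bin_results)
    rate = counts[bad_bin] / len(bin_results)
    if rate >= threshold:
        return [pid for pid, b in bin_results if b == bad_bin]
    return []
```

The output of a rule like this would be handed off to the yield engineers together with the captured diagnostics, which is the "real information they can troubleshoot" point above.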
@Rich, I don't want to turn this discussion into a promotional pitch for my company. I would be very happy to discuss this in more detail with you and @Himanshugupta at your convenience. You can reach me at email@example.com.
@Rich, the problems with big data that you have described are real. Most organizations do not know what to do with their data, and above all there is little understanding of how to deal with big data. However, some companies have used the power of data to understand their problems and increase revenues. This is still an evolving field, and it will take years before it becomes more formalized.
Thanks, David, for your insight. Actually, I am looking for ways to improve the semiconductor process using the data that the fab produces. I think specific teams (such as characterization) closely monitor the data to keep yields within acceptable limits, but I can imagine there are many areas where intelligent analysis can be done and problems can be identified before they actually occur. However, management does not seem to bother to invest in such software or algorithms, and it is left to the discretion of individual employees to improve the system.
Can the semiconductor industry learn from other manufacturing sectors, or are the problems of the semiconductor industry unique? Also, can you please elaborate on what big data techniques are actually available to solve semiconductor problems?
@Himanshugupta, yes, the semiconductor industry is maturing, but there is still much that can be learned, and optimized, through data analysis on the manufacturing side. While semiconductor data is not "big data" if you compare it to things like Google or healthcare records, it is still quite large. We are seeing 30+ TB of parametric data generated every month by our customers. Even at this volume, identifying actionable data in a timely fashion is very challenging.
It is one thing to find a problem, but it is an even bigger challenge to find that problem soon enough to take corrective action that is meaningful for your business. Even for a relatively mature business like semiconductors, there are still many ways in which manufacturing data can be rapidly transformed into intelligent information that can significantly improve many facets of a company's business.
Outlier detection, escape prevention, and RMA management are three areas where data analytics are now being used to a much greater extent to drive higher quality and reliability metrics for semiconductor devices. As device complexity continues to rise with smaller process geometries and the adoption of multi-chip packages, the amount of manufacturing data that is generated will continue to grow. Companies that develop core competencies in creating "manufacturing intelligence" out of their "big data" will have a significant advantage over their competitors.
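For the outlier detection mentioned above, one commonly used statistical approach is the modified z-score based on the median and MAD, which is robust when a few extreme parts would otherwise skew the mean and standard deviation. This is a generic statistical sketch, not any vendor's method; the 3.5 cutoff is a conventional rule-of-thumb assumption.

```python
# Hypothetical sketch: robust screening of one parametric test value across
# a wafer or lot. Median/MAD are used instead of mean/stddev so extreme
# parts cannot mask themselves. The 3.5 cutoff is a rule-of-thumb assumption.

import statistics

def parametric_outliers(values, cutoff=3.5):
    """Return indices of values whose modified z-score exceeds cutoff."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return []  # no spread to measure against
    return [i for i, v in enumerate(values)
            if abs(0.6745 * (v - med) / mad) > cutoff]

# Example: four typical readings and one extreme part.
print(parametric_outliers([1.0, 1.1, 0.9, 1.05, 5.0]))  # [4]
```

Parts flagged this way can then feed escape-prevention rules (hold or retest the outlier) rather than shipping a statistically suspicious device that nominally passed its limits.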
Like any other manufacturing industry, I think the semiconductor industry has become so mature that there is little or no unstructured data as far as big data is concerned. Most of the processes are mature, and semiconductor hardware is tested based on random sampling.
I think I missed the point about the semiconductor industry being a far-gone case for big data. Most semiconductor companies use data extensively to optimize their procedures and processes. Though I agree that they do not use big data as such, because much of the data is dependent and so need not be analyzed in great detail.
@Rich: Truly said. Higher streams of data mean higher storage complexity. But with recent advancements in cloud storage, it is now possible to store exabytes of data in the cloud at relatively low cost. Cloud systems can be managed easily with virtual servers throughout the world.
I think with the recent advancements in the field of IoT, the boundary between data and usable information is getting blurrier by the minute. There is simply too much confidential data in many companies' hands to track, and we as end users cannot do anything but hope for the best and pray that nobody uses the data in a bad way.
The security piece of big data is always going to be critical. Worse, when you bring several disparate streams of data together, the combined set can become even more valuable to hackers. On top of that, many organizations haven't yet learned how to manage, let alone use, the stacks of data sitting around. I suspect there will be some cautionary tales in the media (read: huge data snafus) before we get this completely nailed down.
EBN Dialogue / LIVE CHAT