There is a lot of industry buzz these days about big data and the ways it can be leveraged to improve a company's overall business intelligence. While the industry consensus is that big data is here to stay in the supply chain and the semiconductor industry, what is far less clear is how that data can and should be used.
Semiconductor manufacturing test is a prime example: enormous amounts of data are generated at multiple steps in manufacturing operations such as wafer sort, final test, and system-level test. These test operations are essential to ensure that defective parts are not shipped to end customers. The same test data is also used for other types of downstream analysis, but as the amount of parametric data continues to grow and device volumes increase, the ability of companies to effectively mine their data becomes a significant challenge.
For companies with high-volume products, the amount of data generated each month can easily run into the tens of terabytes. There is tremendous value in that data, but if you cannot make clear, timely decisions based on it, all it is doing is sitting around and taking up storage space.
Driving improved data analysis is critical to the forward-looking semiconductor marketplace. End-market customers are demanding more and faster “everything” from their semiconductor providers. One example of how truly usable data can drive real value is escape prevention. If you have a part that tests “good” but sits at the margin of what is defined as a good part, what do you do? Do you ship it or hold it back?
The ability to rapidly analyze the entire DNA history of a specific device, and make an informed decision about how best to position that part for a given end customer, is a powerful way to provide greater value and differentiation against your competitors. Moving forward, the capability to mine data across all aspects of manufacturing to proactively control the quality and performance of shipped products will become the norm, not the exception, for the semiconductor industry.
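The marginal-part decision described above can be made concrete with a guard-band screen in the spirit of Part Average Testing: a part that passes its spec limits but lands close to either limit is flagged for review rather than shipped automatically. The sketch below is purely illustrative; the function name, limits, and 5% guard fraction are assumptions, not any real test program.

```python
# Illustrative sketch: classify a passing-but-marginal part using a guard
# band inside the spec window. All names and limits are hypothetical.

def classify_part(measurement, lower_limit, upper_limit, guard_fraction=0.05):
    """Classify a parametric measurement as 'fail', 'marginal', or 'good'.

    A part is 'marginal' if it passes the spec limits but falls within
    guard_fraction of the spec window from either limit.
    """
    if not (lower_limit <= measurement <= upper_limit):
        return "fail"
    guard = (upper_limit - lower_limit) * guard_fraction
    if measurement < lower_limit + guard or measurement > upper_limit - guard:
        return "marginal"
    return "good"

# Example: a supply-current test with limits 10-50 mA and a 5% guard band.
print(classify_part(30.0, 10.0, 50.0))   # comfortably inside the window
print(classify_part(49.5, 10.0, 50.0))   # passes, but hugging the upper limit
print(classify_part(55.0, 10.0, 50.0))   # outside the limits entirely
```

In practice the "marginal" bucket is where the business decision lives: ship to a less demanding end customer, downgrade, or hold for further analysis.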
Most methods of managing test data are based on legacy, proprietary solutions. To successfully address the demands of the end markets they serve, semiconductor companies will need to look “outside the box” and explore new, commercially developed manufacturing intelligence solutions in lieu of existing, internally developed ones.
Commercial solutions bring together three key components that are not typically found in proprietary solutions: a comprehensive data infrastructure that can collect and validate manufacturing data from any point in the semiconductor supply chain; a powerful and automated data analysis engine that can rapidly mine the data for relevant information; and, finally, the means to perform cross-operational analytics that can leverage input from multiple data sources to make highly informed decisions at all stages of manufacturing operations.
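The third component, cross-operational analytics, amounts to joining records from different manufacturing steps on a common unit identity and asking questions that no single step can answer alone. The toy sketch below correlates wafer-sort measurements with final-test outcomes for the same units; the record layouts, field names, and values are illustrative assumptions, not any vendor's schema.

```python
# Illustrative sketch of cross-operational analytics: join wafer-sort and
# final-test results on unit ID. All IDs and values are hypothetical.

wafer_sort = {
    # unit_id -> leakage current measured at wafer sort (uA)
    "W1-D001": 1.2,
    "W1-D002": 4.8,
    "W1-D003": 0.9,
}

final_test = {
    # unit_id -> pass/fail outcome at final test
    "W1-D001": "pass",
    "W1-D002": "fail",
    "W1-D003": "pass",
}

# Join the two operations on unit ID, then ask a cross-operational question:
# did units that later failed final test already look different at wafer sort?
joined = {
    uid: (wafer_sort[uid], final_test[uid])
    for uid in wafer_sort.keys() & final_test.keys()
}

failed = [leak for leak, outcome in joined.values() if outcome == "fail"]
passed = [leak for leak, outcome in joined.values() if outcome == "pass"]

print("mean leakage, later failures:", sum(failed) / len(failed))
print("mean leakage, later passes:  ", sum(passed) / len(passed))
```

A real deployment would do this join across millions of units and many parametric signals, but the principle is the same: decisions at one stage are informed by data collected at every other stage.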
Forward-looking companies are already using commercial solutions in place of their internally developed ones to leverage their global data sources and drive significant improvements in operations, product planning, and distribution. They see big data not as a looming problem for their IT departments, but as a never-ending source of actionable information that can fuel their success in the marketplace.