BIG data is a collection of data sets so large and complex that they become difficult to process using on-hand database management tools. Get this… it is estimated that in 2012 the average mid-size company in the USA generated as much data in a year as the entire US Library of Congress. As a company, Walmart creates the equivalent of 50 million filing cabinets worth of data every hour. While these numbers seem incredible, the trend for most companies is an increasing volume of data generation and storage.
Product Quality Test Data generated by Automatic Test Equipment (ATE) in R&D, Manufacturing and Repair environments is no exception. The challenge with this enormous amount of Test Data is how to provide people with an effective way to make product quality decisions from it. Quality today means augmenting the power of traditional SPC (Statistical Process Control) techniques with activities such as ensuring complete visibility into yields, being able to link failures with repair information, and having a collaborative platform upon which all functions of an organization (R&D, Manufacturing, Supply Chain, etc.) can access this Test Data to quickly root-cause issues.
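As a minimal illustration of the traditional SPC techniques mentioned above (not any specific vendor's tooling), here is a sketch of a Shewhart individuals control chart check over a batch of test readings. The measurement values are hypothetical stand-ins for ATE data.

```python
# Minimal sketch of a Shewhart individuals control chart (traditional SPC).
# The measurement values below are hypothetical test readings, not real ATE data.
measurements = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 9.7, 10.3, 10.0, 9.9]

n = len(measurements)
mean = sum(measurements) / n

# Sample standard deviation (n - 1 in the denominator).
variance = sum((x - mean) ** 2 for x in measurements) / (n - 1)
sigma = variance ** 0.5

# Conventional 3-sigma control limits.
ucl = mean + 3 * sigma  # upper control limit
lcl = mean - 3 * sigma  # lower control limit

# Flag any reading outside the limits as a potential out-of-control signal.
out_of_control = [x for x in measurements if x > ucl or x < lcl]
print(f"UCL={ucl:.3f}, LCL={lcl:.3f}, out of control: {out_of_control}")
```

In practice, control limits would be established from a stable baseline run rather than the same sample being checked, and tools built on this idea scale the same computation across millions of test records.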
If you would like to know more about these trends and how some Fortune 1000 companies are improving their Product Quality, our VP of Sales & Marketing will be holding a seminar, "Using BIG Data to Improve Product Quality for Electronic Manufacturers," at the FREE Del Mar Electronics & Design Show.