Metadata: Characterize, Define, and Analyze

The Value in the Details: Best Practices for Manufacturing Metadata

Metadata is defined as the characteristics of a product's performance or of the process that produced it.

These metadata points, or test attributes, are useful indicators of quality, product performance, or process compliance. Unlike the information captured in a data model (which is, by definition, limited and static so that it can be stored and warehoused in a database), metadata has values that vary over time. In addition, analysis of the data will often reveal a need to capture additional metadata points.

As an example, a test engineer in NPI could be studying the performance of an RF product. By repeatedly running individual parametric measurements under different environmental or performance conditions, the engineer determines performance limits, test configuration, and design specifications. During this testing, the engineer may discover that new conditions affecting product or process performance need to be captured and analyzed, even though they were not originally written into the test code. As a result, metadata points must be extensible and flexible.
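
Below is a minimal sketch, in Python, of how such extensible metadata might be attached to a measurement as an open-ended key-value set; the class and field names are hypothetical and not drawn from any specific product schema. The point it illustrates is that a newly discovered condition can be recorded without changing the underlying data model.

```python
# A minimal sketch (hypothetical names): metadata stored as an open-ended
# key-value set rather than fixed schema columns, so attributes discovered
# later in characterization can be added without a data-model change.
from dataclasses import dataclass, field
from typing import Any, Dict


@dataclass
class Measurement:
    name: str
    value: float
    units: str
    # Open-ended metadata: any attribute the engineer decides to capture.
    metadata: Dict[str, Any] = field(default_factory=dict)


# The initial characterization run captures the conditions planned in the test code.
gain = Measurement(name="TX_Gain", value=23.4, units="dB",
                   metadata={"temperature_c": 25.0, "frequency_mhz": 2450.0})

# Later, the engineer finds that supply ripple also affects performance and
# simply records it as a new metadata point; no schema change is required.
gain.metadata["supply_ripple_mv"] = 12.0
```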

Engineers use metadata to characterize the performance of a product. The types of metadata captured at different stages of a product's life-cycle (R&D vs. NPI vs. Production vs. MRO/RMA) will necessarily differ, because the engineers responsible for product performance and data analysis have different needs at each stage.

Metadata can typically be grouped into a few categories (see the sketch following this list):

  • Descriptions of the source of the data
    • Tester location
    • Environmental conditions (temperature, humidity, etc.)
    • Hardware revision
    • Hardware ID
    • Software revision
  • Descriptions of how the data was collected
    • Tester calibration
    • RF testing: bandwidth
    • RF testing: frequency
  • Descriptions of how the data performed
    • Loop count of a measurement
    • XY coordinates of failures
  • Descriptions of what processes were done in response to data performance
    • Failure descriptions
    • Rework descriptions
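
As a rough illustration, here is how the metadata for a single test event might be grouped into the four categories above. The field names and values are hypothetical, chosen only to mirror the examples in the list.

```python
# A minimal sketch (hypothetical field names) grouping metadata for one test
# event into the four categories described above.
test_metadata = {
    # Descriptions of the source of the data
    "source": {
        "tester_location": "Line 3, Station A",
        "temperature_c": 23.5,
        "humidity_pct": 41.0,
        "hardware_revision": "B2",
        "hardware_id": "FIX-0071",
        "software_revision": "4.1.2",
    },
    # Descriptions of how the data was collected
    "collection": {
        "tester_calibration_date": "2024-03-01",
        "rf_bandwidth_mhz": 20.0,
        "rf_frequency_mhz": 5180.0,
    },
    # Descriptions of how the data performed
    "performance": {
        "measurement_loop_count": 5,
        "failure_xy_coordinates": [(12.4, 3.1)],
    },
    # Descriptions of what was done in response to data performance
    "response": {
        "failure_description": "Return loss out of spec at 5180 MHz",
        "rework_description": "Reflowed antenna matching network",
    },
}
```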

In the IntraStage data model, metadata can occur at any of three levels: Event, Test, and Measurement. In addition, since data can come from anywhere in the product's life-cycle, it is critical that metadata is mapped in such a way that an investigation in any part of the life-cycle can draw on data from any other available part of the life-cycle.
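
As an illustration only, the nested record below sketches how metadata might sit at each of the three levels; the field names are hypothetical and are not drawn from the actual IntraStage schema.

```python
# A minimal sketch (hypothetical field names) of metadata attached at each of
# the three levels named above: Event, Test, and Measurement.
event = {
    "unit_serial": "SN-1002",
    # Event-level metadata: source of the data
    "metadata": {"tester_location": "Penang Line 2", "lifecycle_stage": "Production"},
    "tests": [
        {
            "name": "RF_TX",
            # Test-level metadata: how the data was collected
            "metadata": {"rf_frequency_mhz": 2450.0,
                         "tester_calibration_date": "2024-03-01"},
            "measurements": [
                {"name": "TX_Gain", "value": 23.4, "units": "dB",
                 # Measurement-level metadata: how the data performed
                 "metadata": {"loop_count": 3}},
            ],
        },
    ],
}
```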

Related Topics, Examples, and Links

  • New Product Integration: Normalizing and characterizing RF data with hundreds of parameters. Learn how Motorola performs on-demand characterization of complex RF data (read the case study).
  • Turning Data Into Action: Learn how Medtronic integrated BlackBelt to gain a full picture of their KPIs.
  • Real-time visibility: Full-system monitoring of product quality and process efficiency (read the case study).