Metrology Solutions

Bedrock S.Q.C.
George Schuetz, Mahr Federal Inc.

  Statistical Quality Control (SQC) has been in use since the 1930s, when it was performed with paper, pencil, and maybe a slide rule.  As such, it was originally somewhat time-consuming, and it required a fairly high level of care and understanding on the part of its practitioners.  SQC got a big boost in the 1970s and '80s, when electronic gages, data loggers, and PCs began to proliferate.  Suddenly, untrained line inspectors could easily perform the necessary calculations without really understanding the process.  Fast forward to today, and SQC is practically free with integrated wireless calipers and digital indicators.

  Many instructors still believe, however, that inspectors and machine operators should learn to do SQC manually before they plug in their data loggers, on the general principle that people who understand what they're doing tend to do a better job.  Thus, we'll be looking at some of the basics of SQC; these rules apply to any dimensional gaging procedure in a high-volume production application.
 
  All operations that produce features to dimensional tolerances involve variation.  Variation cannot be eliminated, but it can be controlled so that it remains within acceptable limits.  SQC uses the laws of probability to reliably monitor and control a process.  By inspecting variation in a small sample of production, it is possible to draw inferences for the entire lot.

  Let's use the following simple example: the feature to be inspected is an OD on a rod, and the specification is 0.375" ±0.005".  For our sample, we select 35 parts at random from a certain segment of a production run.  (Let's not worry, for the moment, about how we arrive at that figure.  Suffice it to say that samples rarely need to be larger than 50 parts.)

  The first "statistical" procedure is to find the smallest and the largest measurements, then subtract one from the other.  This is the "range," which we might express as R = 0.008".  Next, we find the average of all the measurements (the sum of the measurements divided by the number of measurements in the sample), which we express as the "X-bar" value, as in X = 0.377".

  These very simple procedures provide important information.  First, they tell us whether all the parts in the sample are in tolerance, and how much of the tolerance range (0.010") the variation (0.008") consumes.  Second, they show the relationship between the average value and the specification.  The average of a sample may lie exactly on the part specification, but the range of the sample may be broader than the tolerance range, so that many parts lie beyond both the upper and lower tolerance limits.  On the other hand, the range of the sample may be smaller than the range of the tolerance limits, but the average may be skewed so far from the specification that the entire sample falls outside one of the tolerance limits.
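  Expressed numerically with the example's figures, those two checks look something like the short sketch below; the variable names and layout are illustrative only.

```python
# Spec from the example: nominal 0.375", tolerance +/-0.005".
nominal, tol = 0.375, 0.005
lower, upper = nominal - tol, nominal + tol   # 0.370" and 0.380"
band = upper - lower                          # tolerance range, 0.010"

r, x_bar = 0.008, 0.377                       # sample range and average from above

consumed = r / band                           # share of the tolerance used by variation
offset = x_bar - nominal                      # how far the average is skewed from nominal

print(f"Variation consumes {consumed:.0%} of the tolerance range")  # 80%
print(f"Average is offset {offset:+.3f} in from nominal")           # +0.002 in
```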

  The next step is to chart the data on a histogram.  A range of measurement values is divided into equally spaced categories, and each measurement is placed in the appropriate category.  If the distribution is "normal," the resulting Frequency Distribution Curve will have the familiar bell shape.  The "mode" - the category containing the largest number of data points - will be the same as the average in a normal distribution.  Distribution curves that are not bell-shaped may indicate a problem in the manufacturing process.  For example, if the curve shows a dip where the mode normally appears, it might indicate looseness in the setup or the machine tool.
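  A small sketch of that tallying step is shown below, assuming equal-width categories of 0.001"; the bin width, sample values, and function name are illustrative, not prescribed here.

```python
from collections import Counter

def frequency_distribution(measurements, bin_width=0.001):
    """Tally measurements into equally spaced categories (bins)."""
    return dict(sorted(Counter(
        round(m / bin_width) * bin_width for m in measurements
    ).items()))

# Hypothetical readings; a real chart would use the full sample.
measurements = [0.373, 0.375, 0.377, 0.379, 0.381, 0.376, 0.378, 0.374,
                0.377, 0.375, 0.378, 0.376, 0.379, 0.377, 0.375, 0.380, 0.377]

dist = frequency_distribution(measurements)
mode = max(dist, key=dist.get)   # category with the most data points

for value, count in dist.items():            # crude text histogram
    print(f"{value:.3f}  {'*' * count}")
print(f"Mode: {mode:.3f}")
```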

  The histograms show four examples of normal distribution.  In “A”, the range is too wide, indicating that a large percentage of production falls outside the tolerance limits.  In “B”, the range is equal to the tolerance limits: all the items in the sample pass inspection, but there is a statistical probability that some parts in the production run will exceed the limits.  In both cases, one would want to reduce the range of variation.

  The range of curve “C” is significantly smaller than the tolerance range and falls entirely within the specifications: very few bad parts are likely in the run.  In “D”, the range is acceptable, but because the average is displaced to one side, a significant number of sample parts fall outside the tolerance limits.  Some means must be found to shift the average while maintaining the range.
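  To tie the four cases together, the sketch below shows one way the range and average checks might be combined to label a sample as case A, B, C, or D.  The classify_sample function, its 90-percent threshold for a "full" range, and its wording are assumptions for illustration, not rules from the text.

```python
def classify_sample(measurements, nominal, tol, near_full=0.9):
    """Label a sample as one of the four histogram cases described above."""
    lower, upper = nominal - tol, nominal + tol
    band = upper - lower                              # tolerance range
    r = max(measurements) - min(measurements)         # sample range

    if r > band:
        return "A: variation wider than the tolerance range -- reduce the range"
    if min(measurements) < lower or max(measurements) > upper:
        return "D: range acceptable, but the average is skewed -- re-center the process"
    if r >= near_full * band:
        return "B: range fills the tolerance range -- some of the run will likely exceed limits"
    return "C: range well inside the tolerance range -- very few bad parts expected"

sample = [0.374, 0.376, 0.377, 0.378, 0.377, 0.379, 0.375, 0.376, 0.378, 0.377]
print(classify_sample(sample, nominal=0.375, tol=0.005))   # case C for this sample
```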