Is Calibration Necessary?
Robert A. Day
Port Hadlock, WA 98339-1658
Calibration is required by national and international standards, and most require that a system of periodic calibration and maintenance exist for any facility doing nondestructive testing. An example of these requirements is ASTM E-1212, Standard Practice for Establishment and Maintenance of Quality Control Systems for Nondestructive Testing Agencies [1]. So why would we even need to ask the question?
One answer is that many institutional and research organizations do not have periodic calibration and maintenance programs. They are not required to have such programs except for safety-related work [2]. They do, however, have NDT labs, and my purpose here is to examine the wisdom of neglecting periodic calibration and maintenance for periods of 15 or 20 years, as has occurred at some institutions.
I must carefully define my use of the word "calibration," because it has caused significant confusion due to the use of reference standard calibration in NDT. This is the common use of known materials to check the performance of the instrumentation prior to, and usually after, most NDT operations. I do not mean this form of calibration, but instead refer to a periodic and more thorough check on instrumentation performance, usually with the express purpose of assuring that the accuracy and precision specifications of the measuring instrument are met. I will largely restrict my discussion to ultrasonic flaw testing but will briefly touch on the impact on other NDT methods. In ultrasonic NDT, ASTM E-317, Standard Practice for Evaluating Performance Characteristics of Ultrasonic Pulse-Echo Testing Systems Without the Use of Electronic Measurement Instruments [1], would represent this type of "calibration." I also refer to calibration of gages used to maintain NDT equipment, such as voltmeters, oscilloscopes, and other laboratory test and measurement equipment.
Flaw detection equipment is often used with reference standards that assure minimum flaw detection sensitivities are met; however, there are two problems with this approach as actually practiced:
1. Flaws often do not behave like the reference reflectors used in the reference standard.
2. The variation in sizes of reference reflectors is not great enough to check vertical linearity, and usually no effort is made to check horizontal linearity during the reference standard "calibration."
The first issue has to do with the inherent uncertainties in flaw detection and sizing that are implicit in any measurement process. Failure to assure that instruments maintain consistent performance through time is going to increase the uncertainty of this process. One possible consequence of not maintaining ultrasonic equipment is that changes go undetected by the reference reflectors because those changes do not influence the signals from those reflectors. Changes in the frequency content of received signals are one example; changes in gain linearity are another.
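As a rough illustration of what a periodic vertical-linearity check adds beyond reference-standard work, the sketch below compares amplitudes measured at known attenuator steps against the ideal dB law. The function names, readings, and 5% tolerance are assumptions made for illustration, not values taken from ASTM E-317.

```python
# Hypothetical sketch of a vertical-linearity check in the spirit of
# ASTM E-317: signals attenuated by known dB steps should track the
# ideal dB-to-ratio law within a stated tolerance. All numbers here
# are illustrative, not from any real instrument or standard.

def db_to_ratio(db):
    """Convert a dB attenuation to the expected amplitude ratio."""
    return 10 ** (-db / 20.0)

def vertical_linearity_errors(readings, tolerance_pct=5.0):
    """readings: list of (attenuation_db, measured_pct_fsh) pairs,
    where the 0 dB reading defines full amplitude.
    Returns the pairs whose deviation from ideal exceeds tolerance."""
    baseline = dict(readings)[0.0]            # measured amplitude at 0 dB
    failures = []
    for db, measured in readings:
        expected = baseline * db_to_ratio(db)
        error_pct = abs(measured - expected)  # in % of full screen height
        if error_pct > tolerance_pct:
            failures.append((db, measured, round(expected, 1)))
    return failures

# Example: a receiver whose response sags at the bottom of its range.
readings = [(0.0, 100.0), (6.0, 50.0), (12.0, 26.0), (20.0, 18.0)]
print(vertical_linearity_errors(readings))  # [(20.0, 18.0, 10.0)]
```

Because the attenuator steps span the whole amplitude range, a drift confined to one part of the gain curve is caught even when the everyday reference reflector happens to sit in an unaffected region.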
Fig. 1: Illustration of one type of non-linearity.
The second issue has to do with the monotonicity of the ultrasonic receiver. Non-linearity is well known to exist in ultrasonic receivers, and reject is a form of "benign" non-linearity deliberately introduced into the receiver on many instruments. The introduction of non-linearity alone doesn't represent a serious threat to flaw detection, although it can affect flaw sizing as shown in Fig. 1. This condition is still monotonic: only one value of output signal corresponds to each value of input, or echo, amplitude. Figure 2 illustrates lack of monotonicity. Although this problem may seem unlikely, it has been observed in some well-maintained instruments [3]. The potential for this type of problem in equipment that has received no maintenance in 20 years is unknown.
Fig. 2: Illustration of a non-monotonic response.
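The distinction between the two figures can be expressed as a simple check: a response table is acceptable (if perhaps non-linear) so long as the displayed amplitude never decreases as the echo amplitude increases. The data and helper below are hypothetical, chosen only to mimic the two cases.

```python
# Hedged sketch: detecting non-monotonic receiver response from a table
# of (input_amplitude, displayed_amplitude) measurements. The data and
# function name are illustrative assumptions, not a standard API.

def is_monotonic(response):
    """response: list of (input, output) pairs.
    Returns True if output never decreases as input increases."""
    outputs = [out for _, out in sorted(response)]
    return all(a <= b for a, b in zip(outputs, outputs[1:]))

# A monotonic but non-linear receiver (the Fig. 1 case):
bent = [(10, 12), (20, 22), (40, 38), (80, 60)]
# A non-monotonic receiver (the Fig. 2 case): a larger echo reads smaller.
folded = [(10, 12), (20, 22), (40, 38), (80, 30)]

print(is_monotonic(bent))    # True
print(is_monotonic(folded))  # False
```

The `folded` table is exactly the hazard described next: the 80-unit echo displays below the 40-unit one, so a larger flaw reads as a smaller one.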
The consequence of such behavior is that a larger flaw can be detected as a smaller one even if reference standard calibration has been done religiously. This could in principle obviate safety margins, since the exact amount and nature of the non-monotonic behavior is unknown. Customary safety margins are based on stress and fatigue analysis that usually determines a conservative critical flaw size and then requires detection of all flaws larger than some size smaller than this. Regardless of the critical flaw size used or the safety margins employed, this behavior can and would obviate the safety of any ultrasonic method employed.
Are other methods of NDT subject to the same sort of problem? The answer is certainly yes, but to a greater or lesser degree depending on the method. Some effects are:
Radiographic methods usually image the object, and nonlinear behavior in amplifiers would not apply to film. Film can behave this way, but current safeguards on exposure and contrast prevent this from happening. Additionally, densitometers are calibrated against multi-density step wedges that are of known relative value throughout the range. Overall, uncalibrated measurement instruments seem to represent only a minor problem for radiography. Some problems exist with radiation energy and tube current: if these are not calibrated they will eventually become inaccurate and cause exposure inconsistencies. Few safety problems seem to exist that would not be obvious on the image. Non-imaging methods like DXT are subject to the effects of nonlinear gain, however, and could be suspect under these circumstances.
Dye penetrant and magnetic particle methods are largely unaffected by calibration issues, since they rely on basic principles of physics for most QA factors. Black light measurement for fluorescent inspection is one exception, since meters can drift and after sufficient time might indicate adequate illumination when there is none. Use of multiple meters may provide some protection.
Eddy current instruments use amplifiers whose linearity and monotonicity are of concern in much the same way as in ultrasonics. Eddy current instruments with amplifiers whose gain linearity changes significantly are not known to the author, but most manufacturers recommend calibration every year.
Mitigating circumstances exist for some applications that help improve the situation for ultrasonic, eddy current and other electronic NDT methods. Some of these are:
Use of large numbers of reference reflectors. An example is ultrasonic wall thickness measurement, where step wedges featuring more than five thicknesses are used. Although most organizations having such reference standards have only one, it often covers a range of thicknesses in several materials. The risk still remains that the instrument is nonlinear in some other part of its range and this goes undetected. The risk is reduced by such reference standards but not eliminated.
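A minimal sketch of how a multi-step wedge narrows, without eliminating, that risk: each certified step is compared against the gage reading, so a non-linearity inside the covered range is caught, while one outside it is not. The step values and tolerance here are invented for the example, not taken from any certified wedge.

```python
# Illustrative sketch: comparing thickness-gage readings against a
# multi-step reference wedge. A real procedure would use the wedge's
# certified thicknesses and the instrument's stated tolerance.

def wedge_check(true_mm, measured_mm, tolerance_mm=0.05):
    """Return (true, measured, error) for every step out of tolerance."""
    bad = []
    for true_t, meas_t in zip(true_mm, measured_mm):
        error = meas_t - true_t
        if abs(error) > tolerance_mm:
            bad.append((true_t, meas_t, round(error, 3)))
    return bad

# Five-step wedge: the gage reads well except near the top of the range,
# where a non-linearity that a single reference block would miss shows up.
true_mm     = [2.0, 4.0, 6.0, 8.0, 10.0]
measured_mm = [2.01, 3.99, 6.02, 8.03, 10.31]
print(wedge_check(true_mm, measured_mm))  # [(10.0, 10.31, 0.31)]
```

A wedge whose steps stop at 10 mm still says nothing about the instrument's behavior at 15 mm, which is exactly the residual risk the text describes.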
Imaging methods are less likely to have strong non-linearity go undetected by the operator when the organization has a long track record and the non-linearity occurs during imaging of known objects: strong non-linearity would be noticed by even the most naive of operators. Long periods of non-maintenance do, of course, present the possibility that the instrument was always nonlinear and no image anomalies were ever noticed. Again, risk is reduced but not eliminated.
I have discussed risk in a very abstract sense. A legitimate question is: what magnitude are these risks? I have no way to estimate them. Anecdotal evidence is that a small percentage of ultrasonic instruments become non-monotonic when excess gain and reject are used. The implication is that such behavior is probably possible due to drift in the components used in instruments today. These components can fail gracefully, i.e., without disabling the instrument, and create unexpected behavior. Manufacturers generally recommend annual maintenance along the lines of ASTM E-317 and have not designed the instruments to operate without such checks for 15 or 20 years. Test data to determine such long-term behavior is not generally available. The risk is basically unknown and not easily amenable to analysis.
Lastly, I would like to consider the effect on quality of work implied by not calibrating the NDT instruments or the instruments used to maintain them. Increasingly, NDT is called on to perform measurement rather than just sorting between good and bad. The consequence of calling a bad component good, i.e., accepting a flawed component, is serious, while rejecting a good component is only an economic loss. But if measurements are to be useful they must be used. An erroneous measurement will result in errors in its use and can degrade the quality of subsequent work. The practice of having few calibrated instruments in the NDT lab certainly represents a significantly higher risk that quantitative work will not succeed.
My conclusions are that failure to calibrate and maintain NDT equipment represents a safety risk and a risk to the integrity and credibility of those institutions that choose to ignore this requirement.
- 1. 1994 Annual Book of ASTM Standards, Section 3, Metal Test Methods and Analytical Procedures, Vol. 3.03, Nondestructive Testing.
- 2. DOE Order 5700.6C, Quality Assurance.
- 3. R. A. Day, Nonlinear Behavior of Ultrasonic Receivers, Second NBS Symposium on Ultrasonic Materials Characterization, Gaithersburg, Maryland, June 1980.
Date created: 10/27/96
Last modified: 3/5/05
Copyright © 1995, 2005 Second Sound