From my previous post you know that I have a Fluke 8050A digital multimeter (DMM). I also have a couple of other bench meters, an HP 34703A/34750A and a Philips PM2422A, as well as a BK Precision 2709B handheld DMM.
I have been giving some thought as to how I can keep these instruments adjusted to some kind of reasonable accuracy. Except for the BK Precision meter, all of these measurement tools are pretty old (the HP and the Philips are 40!). So, are they accurate? Comparing them with the BK (my newest) shows that the Fluke is in good agreement but the HP is “out”. So, do I calibrate everything to the BK? (I did just that with the Philips.) Better would be to have some kind of standard, so I built one. I’ll get into the details in a moment. But first some observations about accuracy and resolution.
The Fluke 8050A is a 20000 count meter. The count indicates the display resolution. Here, 20000 means that the meter can display any number from 0 to 19999. That number can indicate 0 to 1.9999 Volts or 0 to 199.99 Volts, or Ohms or milliAmps, whatever, depending on the meter setting. The HP 34703A/34750A meter has 200000 counts (pretty high resolution), the Philips PM2422A has 2000 counts (not that much resolution but it has a Nixie tube display which is cool) and the BK Precision 2709B has 6600 counts (yes, it can display from 0 to 6599). If the Fluke is in its 20 kOhm range, the least significant digit is showing 1 Ohm. The HP in the same range displays to a tenth of an Ohm, ten times more resolution.
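The relationship between counts and resolution can be sketched in a few lines of Python (the helper function is just illustrative; the counts and ranges are the ones quoted above):

```python
# Sketch: the value of one least-significant display count
# for a meter with a given count and full-scale range.

def resolution(counts, full_scale):
    """Value of one display count: full scale divided by the max count."""
    return full_scale / counts

# Fluke 8050A: 20000 counts on the 20 kOhm range -> 1 Ohm per count
print(resolution(20000, 20000))    # 1.0
# HP 34703A/34750A: 200000 counts on the same range -> 0.1 Ohm per count
print(resolution(200000, 20000))   # 0.1
```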
But resolution is not the same as accuracy. The Fluke meter has a stated accuracy (for the 20 V range) of +/- .03% + 2 counts. That means that exactly 10 volts could be displayed as anything from 9.995 [(10 – .03% = 9.997) – 2 counts] to 10.005 [(10 + .03% = 10.003) + 2 counts]. So that last digit is really kind of an “about” indicator. The HP has ten times the resolution so it is also ten times more accurate, right? Well, wrong. The stated accuracy of the HP in its 10 V range (0-19.9999 volts) is +/- .04% + 10 counts. That means that the HP could display exactly 10 volts as 9.9950 to 10.0050. The same as the Fluke. Actually, this level of accuracy is quite good. For the Philips it’s +/- .1% + 2 counts and for the BK Precision, +/- .5% + 2 counts. For most of my work, absolute accuracy is not that important. But if the meter is too far out of specification I might be putting more (or less) voltage into my circuit than I want. So, maybe absolute accuracy is not so important, but a reasonable level of accuracy is.
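The worst-case bounds above follow directly from the “+/- percent + counts” form of the spec. A small sketch of that arithmetic, using the figures quoted in this post:

```python
def display_bounds(true_value, pct, counts, count_value):
    """Worst-case displayed reading for a '+/- pct% + counts' spec.
    count_value is the value of one least-significant count on this range."""
    err = true_value * pct / 100 + counts * count_value
    return true_value - err, true_value + err

# Fluke 8050A, 20 V range: +/-0.03% + 2 counts, 1 mV per count
print(display_bounds(10.0, 0.03, 2, 0.001))    # roughly (9.995, 10.005)
# HP, 10 V range (reads to 19.9999): +/-0.04% + 10 counts, 0.1 mV per count
print(display_bounds(10.0, 0.04, 10, 0.0001))  # roughly (9.995, 10.005)
```

Both meters land on the same +/- 5 mV window at 10 V, which is the point: the HP’s extra digit buys resolution, not accuracy.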
Back to resolution. So, what good is that HP with more digits than its accuracy needs? Sometimes, a high resolution measurement can be pretty useful (even if the accuracy is poor). As an example, let’s say I need a voltage divider based on two matched resistors with a value of 10000 (10k) Ohms. In most cases the actual value is not that critical. If the resistors are 9000 or 11000 (+/- 10%), that would be fine. But they must be the same value or as close as I can get them. Then resolution is a nice thing to have. I may need to measure a lot of resistors to find two that match to the resolution of the 200000 count HP meter (which is a resolution of .1 Ohm for a 10k resistor!) but the matching would be very good.
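To see why matching matters more than absolute value here, note that a divider’s ratio depends only on the ratio of the two resistors. A quick sketch (the numbers are the ones used in the example above):

```python
def divider_ratio(r_top, r_bottom):
    """Output fraction of a two-resistor voltage divider."""
    return r_bottom / (r_top + r_bottom)

# Both resistors 10% high: the ratio is still exactly 0.5
print(divider_ratio(11000.0, 11000.0))  # 0.5
# Mismatched by one count of the 200000-count HP (0.1 Ohm at 10 k):
# the ratio shifts by only a few parts per million
print(divider_ratio(10000.1, 10000.0))
```

Even an inaccurate meter can sort resistors into matched pairs, as long as it resolves the differences between them repeatably.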
Another aspect of measurement is repeatability. This means that when I measure something and get a certain value, I should get the same value if I measure it again. Repeatability (and accuracy) depends on a lot of factors, such as good measurement practices and things like temperature, etc., but I won’t be getting into that here.
A voltage reference
A reasonable level of accuracy is important and that leads to the main point of this post. How can I adjust my meters so that I can have a reasonable level of confidence that the values they report are correct? I could send one of the meters to a calibration service and then use that to adjust the others, but that is expensive, costing more than I paid for the thing in the first place. And as I said, absolute accuracy is, at least for me, not that important. So, I decided to build a simple standard based on the Maxim MAX6350 precision voltage reference. This is a 5 Volt reference with an initial tolerance of .02%. This is only slightly better than the accuracy of the Fluke or HP, so not good enough to calibrate these meters to the accuracy they are capable of (a standard should be at least a factor of ten more accurate than the meter to be calibrated). But good enough for my purposes. The Maxim datasheet for the device shows a circuit using an op amp voltage inverter to provide + and – 5 Volt regulated outputs. That can also be used as a single 10 Volt regulated output accurate to pretty close to .02%. That is what I built. The power is supplied by two 9 Volt batteries. I used an OPA2134 dual op amp from my parts bin, one half for the voltage inverter and the other half as a comparator to warn when the (positive-side) battery voltage falls below 8 Volts (the minimum input voltage for the reference according to the datasheet).
One more point regarding accuracy versus resolution. Consider the adjustment of the voltage inverter. It doesn’t need an accurate meter at all. Just adjust the inverter output to be the same as the reference output to the best resolution possible.