Quote:
Originally Posted by G0HZU_JMR
I just don't see the thrill or relevance of this stuff. DMM technology was boringly brilliant over 30 years ago and few people need this level of performance even today (myself included).
"Boringly brilliant" and with a "level of performance" which may be superfluous as maybe, but all test kit needs to have its calibration checked from time to time.
That is relevant, surely?
In the attached article, the max. and min. voltages shown on those DMMs are 10.21 and 9.85. That's a difference of 0.36 V, and as a percentage of 10.00 V, 0.36 is 3.6%. To me, even with my humble needs, that is a significant error - and could easily be relevant in some measurements.
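For anyone who wants to check the numbers, here's a quick sketch of that spread calculation (the readings are the ones quoted from the article; nothing else is assumed):

```python
# Spread between the highest and lowest DMM readings from the article,
# expressed as a percentage of the nominal 10.00 V.
nominal = 10.00   # volts, the value being measured
v_max = 10.21     # highest reading shown
v_min = 9.85      # lowest reading shown

spread = v_max - v_min                # 0.36 V
percent = spread / nominal * 100      # 3.6 %

print(f"spread = {spread:.2f} V, i.e. {percent:.1f}% of {nominal:.2f} V")
```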
Alternatively, let me put it this way: suppose I need to adjust something to be 10.0 V. Generally, I'll readily settle for anything between 9.90 and 10.10, but 9.85 and 10.21? No: not if I'm using something expensive like a Fluke DMM (I own two), which have two decimal places (or more) of display.
Al.