Quote:
Originally Posted by Superscope
One option I was thinking of, would be a 0.1uA Meter and a Calibration Certificate, but what exactly does a Calibration Certificate tell you?
If it only tells you the Meter is within Spec, it doesn't really tell you anything.
A calibration is different things to different people or organisations!
It can start as a basic reference check at set points in each range, with a "look-up" table showing the errors at those points.
Or it could be a complete check and re-adjustment to bring the meter back within the manufacturer's specifications, which gives you a known degree of certainty.
If your calibration certificate says the meter is fully working to the manufacturer's specification, then you can calculate the uncertainty (as you have done, and as I did above), which gives you the range the actual voltage/current will lie within.
The better the meter (in both accuracy and precision), the tighter your measurement uncertainty will be.
So you need a meter with the resolution and accuracy to read to +/- 0.1uA. It would therefore need accuracy and precision roughly 10X better than this, to give you certainty that the indicated value is within +/- 0.05uA.
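To see how that uncertainty works out in practice, here is a rough sketch. DMM accuracy is typically quoted as +/-(% of reading + counts); the spec figures below are purely illustrative, not taken from any real meter's datasheet:

```python
# Hedged sketch: DMM specs are usually quoted as +/-(% of reading + counts).
# The figures here are hypothetical, chosen to resemble a 5.5-digit meter
# on a 200.000 uA range (1 count = 0.001 uA) -- check your own datasheet.

def uncertainty_uA(reading_uA, pct_of_reading, counts, resolution_uA):
    """Worst-case uncertainty for a +/-(% of reading + counts) spec."""
    return reading_uA * pct_of_reading / 100.0 + counts * resolution_uA

# Measuring 37.5 uA with an assumed spec of +/-(0.05% of reading + 5 counts):
u = uncertainty_uA(37.5, 0.05, 5, 0.001)
print(f"Worst-case uncertainty: +/- {u} uA")
print("Within the +/- 0.1 uA target?", u <= 0.1)
```

With these (assumed) figures the worst-case uncertainty is well inside the +/- 0.1uA target; a cheaper 3.5-digit meter with, say, +/-(1% + 2 counts) at 0.1uA resolution would not get close.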
So you are looking for a 5.5-digit (minimum) DMM calibrated to the full manufacturer's specification. That means new HP or Fluke quality meters in the £1000+ price range, plus re-calibration: how often?
When you get to this level of accuracy and precision, then you need to start considering other issues such as lead resistance, noise, and temperature!
So, as I suggested in my original post, you need to work out how much accuracy and what level of certainty you actually want, compared with how much you are willing to spend.
If you want to precisely and accurately measure 37.5uA to +/- 0.1uA, then it's going to cost!