Pressure calibration specifications - A rude awakening!
I initially thought interpreting electrical specifications was a little daunting, but once you get the basics, it’s really pretty straightforward. Specifications in dc/low frequency electrical calibration are based on a simple “y = mx + b” linear equation for both sourcing and measuring instruments. Manufacturers use terms like “% of reading + % of full scale” or “% of reading + units”, all built from two terms: a scale factor error (“m”) and an offset or floor (“b”). Really, pretty straightforward stuff once you figure it out. One caveat, though: always, always read the footnotes. You may be surprised that your 5 ppm + 3 uV specification only applies on certain days of the month, after performing “xyz” procedure, and when the instrument is sitting in a horizontal position. Of course I’m exaggerating, but you get the point.
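The “mx + b” form above can be sketched in a few lines of Python. The 5 ppm + 3 uV figures echo the illustrative example in the text; they are not any particular instrument’s spec.

```python
def electrical_spec(reading, pct_of_reading, floor):
    """Tolerance for a '% of reading + floor' spec (the 'mx + b' form).

    pct_of_reading: scale-factor term as a percentage (the 'm')
    floor: fixed offset in the same units as the reading (the 'b')
    """
    return abs(reading) * pct_of_reading / 100.0 + floor

# Hypothetical 10 V dc point, specified as 5 ppm (0.0005%) of reading + 3 uV:
tol = electrical_spec(10.0, 0.0005, 3e-6)  # 50 uV + 3 uV = 53 uV
```

Note that near zero the floor term dominates, which is exactly why a pure “% of reading” spec cannot work at or near zero.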
- % full scale
- % reading
- % span
- % autoranged span
- % Q-RPT span
- And the list goes on and on: % BSL, % terminal point accuracy, % span terminal point accuracy, % total error band
% Full Scale – Specification is a percentage of the maximum pressure. Example: +/- 0.05% FS on a device with a range of -10 to 10 psi. The value would be 0.05% X 10 psi = +/- 0.005 psi throughout its entire range.
% Reading – Specification is a percentage of the current pressure. Example: +/- 0.05% on a device with a range of -10 to 10 psi. If the pressure is 5 psi, then the value would be +/- 0.0025 psi. When the pressure is 10 psi, then the value would be +/- 0.005 psi. You do have to be careful when evaluating a % Reading spec for instruments whose range includes zero or points near zero, since pressure at and near zero must include a constant accuracy term (the offset or “b” in the electrical world).
% Span – Specification is a percentage of the overall span. Example: +/- 0.05% span on a device with a range of -10 to 10 psi. The value would be 0.05% X (10 – (-10)) = +/- 0.01 psi.
% autoranged span – While the instrument is vented, the user can specify an “autoranged span.” During the resulting test, the user would not be able to go to pressures outside of this span. The specification would be a percentage of this autoranged span. For example, you are using a controller with this type of specification and a full scale of 1000 psi in order to calibrate a device with a full scale of 500 psi. You can set the autoranged span to 500 psi instead of 1000 psi. Your specification would now be a percentage of 500 psi instead of 1000 psi, cutting it in half.
% Q-RPT Span - Q-RPT (Quartz Reference Pressure Transducer) refers to the physical sensor installed in the instrument. It’s the same as % span, but adds clarification when more than one Q-RPT is installed in the instrument, or to differentiate from the autoranged span.
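The three basic spec types above can be summarized in a short sketch, using the -10 to 10 psi, 0.05% examples from the definitions:

```python
def tol_full_scale(pct, full_scale):
    """% FS: percentage of the maximum pressure, constant over the range."""
    return pct / 100.0 * full_scale

def tol_reading(pct, reading):
    """% reading: percentage of the current pressure."""
    return pct / 100.0 * abs(reading)

def tol_span(pct, low, high):
    """% span: percentage of (high - low); also covers autoranged
    and Q-RPT span once you know which span the percentage applies to."""
    return pct / 100.0 * (high - low)

# The -10 to 10 psi examples above, all at 0.05%:
fs_tol   = tol_full_scale(0.05, 10)   # 0.005 psi anywhere in the range
rdg_tol  = tol_reading(0.05, 5)       # 0.0025 psi at 5 psi
span_tol = tol_span(0.05, -10, 10)    # 0.01 psi anywhere in the range
```

For the autoranging example, the same `tol_span` call with 0 to 500 psi instead of 0 to 1000 psi halves the tolerance, just as described above.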
What are your thoughts on pressure calibration specs? Is there a preferred method that works for you?
Comments
Great Article -
From a user’s point of view, the biggest pain is the specmanship that manufacturers use to try and out-spec each other. Which is best: 0.005% of RDG + 0.01% of FS, or 0.01% of RDG + 0.005% of FS? It becomes a numbers game to work out which has the best accuracy. Which is the dominant factor in each example?
Plus, let’s not forget about temperature and the effect it has on the final reading of your device.
A lot of devices are not temperature compensated, so the difference between your device’s calibration temperature and your actual ambient or room temperature can have a massive effect on your readings.
It’s normally listed as a percentage adder per deg C, i.e. for every one deg C away from 20 deg C you need to add a number, say 0.02% FS. Basically another inaccuracy, and if you’re working at 5 deg C one morning, your device could be twice as inaccurate as you think!
Very, very few manufacturers take this into consideration, so it’s up to us users to allow for it.
chris
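The trade-off chris raises can be checked numerically. The two competing specs and the 0.02% FS per deg C coefficient come from the comment; the 100 psi full scale is an assumed value for illustration.

```python
def pressure_spec(reading, full_scale, pct_rdg, pct_fs):
    """Tolerance for a combined '% of reading + % of full scale' spec."""
    return pct_rdg / 100.0 * abs(reading) + pct_fs / 100.0 * full_scale

FS = 100.0  # assumed full scale, psi

# Spec A: 0.005% RDG + 0.01% FS   Spec B: 0.01% RDG + 0.005% FS
# Low in the range the FS term dominates, so B wins;
# at full scale the two specs are identical.
a_low  = pressure_spec(10.0,  FS, 0.005, 0.01)   # 0.0105 psi
b_low  = pressure_spec(10.0,  FS, 0.01,  0.005)  # 0.006 psi
a_full = pressure_spec(100.0, FS, 0.005, 0.01)   # 0.015 psi
b_full = pressure_spec(100.0, FS, 0.01,  0.005)  # 0.015 psi

def temp_adder_pct_fs(coeff_pct_fs_per_degC, ambient_degC, ref_degC=20.0):
    """Extra uncertainty (% FS) for operating away from the reference temp."""
    return coeff_pct_fs_per_degC * abs(ambient_degC - ref_degC)

# 0.02% FS per deg C, working at 5 deg C: an extra 0.3% FS
extra = temp_adder_pct_fs(0.02, 5.0)
```

In this sketch the spec with the smaller full-scale term is never worse anywhere below full scale, which is one concrete way to settle the numbers game for a given working range.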