- The simplest form of calibration
- Involves measuring a single point against a known standard and correcting for the sensor's offset at that point
- The offset correction is then applied as a constant to all subsequent measurements returned by the instrument (see the sketch after this list)
- Useful if a system exhibits offset drift but retains good linearity (i.e. its gain is correct)
- Often known as zero-point calibration or “zeroing”
- Only ensures a single point agrees with the standard; it cannot correct gain (slope) errors
- May therefore give inaccurate measurements at the extremes of the measurement system's range
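A minimal sketch of the idea in Python, assuming a hypothetical `read_raw` sensor function; the function and variable names are illustrative, not from any specific library:

```python
def compute_offset(raw_at_standard: float, standard_value: float) -> float:
    """Offset is the difference between the raw reading taken at a
    known standard and the standard's true value."""
    return raw_at_standard - standard_value


def apply_zero_calibration(raw_reading: float, offset: float) -> float:
    """The same constant offset is subtracted from every measurement."""
    return raw_reading - offset


# Example: a sensor reads 1.30 kPa against a 1.00 kPa standard,
# so the constant offset is 0.30 kPa.
offset = compute_offset(raw_at_standard=1.30, standard_value=1.00)

# Every later reading gets the same constant correction.
corrected = apply_zero_calibration(2.50, offset)  # -> 2.20 kPa
```

Note that only the intercept is corrected: if the sensor's slope is also wrong, readings far from the calibration point will still deviate from the standard, which is why zeroing alone can be inaccurate at the extremes of the range.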