SOP of Calibration in Instrumentation

Instrumentation · SOP Series


Standard Operating Procedures for field engineers and instrument technicians

📅 February 2026 | 8–10 min read | 🛠 6 SOPs Covered | 🏷 ISA · IEC · ISO Standards

Every instrument in a plant — from a simple pressure gauge to a smart pH sensor — drifts over time. Heat, vibration, process chemicals, and age all shift the reading away from truth. Calibration is the process of comparing that reading against a known reference and correcting any deviation. Without it, you are flying blind.

This guide covers the complete Standard Operating Procedure (SOP) for the six most common calibration tasks in industrial instrumentation. Each SOP follows the same field-proven structure: purpose, required equipment, safety precautions, step-by-step procedure, acceptance criteria, and documentation.

▶ Table of Contents
  1. Pressure Transmitter Calibration SOP – 4–20 mA · HART · DP/GP/AP
  2. Temperature Transmitter Calibration SOP – RTD · TC · Head Mounted
  3. Control Valve Calibration SOP – Positioner · Stroke · Bench Set
  4. Pressure Gauge Calibration SOP – Bourdon Tube · Diaphragm
  5. Vacuum Gauge Calibration SOP – Pirani · McLeod · Compound
  6. pH Sensor Calibration SOP – Buffer · Slope · Offset
01
Pressure Measurement

Pressure Transmitter Calibration SOP

Purpose: To verify and adjust the output of a pressure transmitter so its 4–20 mA signal accurately represents the calibrated pressure range, ensuring reliable process monitoring and control.

Frequency: 6 months / as required
Standard: IEC 60770, ASME B40.100
Accuracy: ±0.1% of span
Output: 4–20 mA / HART

Required Equipment: Dead weight tester or calibrated pressure hand pump, precision digital multimeter (0.02% accuracy), HART communicator, reference pressure gauge (0.1% accuracy), isolation valve, bleed valve, calibration record sheet.

⚠️
Safety First: Isolate the transmitter from the process before starting. Issue a Permit to Work (PTW). Bleed all residual process pressure. Wear PPE appropriate to the process fluid — especially if the medium is toxic, flammable, or high-temperature.
ST-01
Isolation & Venting: Close the root valve, open the equalising valve (for DP transmitters), then slowly open the vent/drain valve to release process pressure. Confirm zero pressure with a reference gauge before touching any fitting.
ST-02
Connect Equipment: Connect the calibrated pressure source to the high-pressure (HP) port. Connect the multimeter across the 4–20 mA loop or use a loop calibrator. For HART instruments, connect the communicator in parallel with the loop.
ST-03
Zero Check: With zero pressure applied, confirm output reads 4.000 mA. If not, adjust the zero trim via the HART communicator or local zero screw. Record the as-found value before any adjustment.
ST-04
5-Point Upscale Calibration: Apply pressure at 0%, 25%, 50%, 75%, 100% of calibrated span. At each point, record the input pressure and the mA output. Allow 30 seconds for stabilisation at each point.
ST-05
5-Point Downscale Calibration: Reduce pressure back through 100%, 75%, 50%, 25%, 0%. Record outputs again. The difference between upscale and downscale readings determines hysteresis.
ST-06
Span Trim (if required): If the 100% point shows error beyond tolerance, perform span trim via HART communicator. Re-run the 5-point check after adjustment.
ST-07
Reinstate & Document: Remove calibration equipment, reinstall process connections, slowly open root valve, confirm transmitter reads correctly in the DCS/PLC. Fill in the calibration certificate with as-found and as-left values, date, technician name, and next calibration due date.
✓ Acceptance Criteria
  • Zero error: ≤ ±0.1% of span (±0.016 mA)
  • Span error: ≤ ±0.1% of span
  • Hysteresis + repeatability: ≤ ±0.1% of span
  • Output at 0% = 4.000 mA ± 0.016 mA
  • Output at 100% = 20.000 mA ± 0.016 mA
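As a worked illustration, the span-percentage arithmetic behind these limits can be scripted. This is a minimal sketch with hypothetical as-found readings, not part of the SOP itself:

```python
# Sketch: evaluate 5-point pressure transmitter data against the
# ±0.1% of span criterion. The upscale readings below are hypothetical.

SPAN_MA = 16.0  # 4–20 mA span
TOL_PCT = 0.1   # acceptance limit: ±0.1% of span

def expected_ma(pct_of_span: float) -> float:
    """Ideal output (mA) for a given % of calibrated pressure span."""
    return 4.0 + SPAN_MA * pct_of_span / 100.0

def error_pct_of_span(measured_ma: float, pct_of_span: float) -> float:
    """Deviation from ideal output, expressed as % of the 16 mA span."""
    return (measured_ma - expected_ma(pct_of_span)) / SPAN_MA * 100.0

# Hypothetical as-found upscale readings (% of span -> measured mA)
upscale = {0: 4.003, 25: 8.010, 50: 12.006, 75: 16.012, 100: 20.008}

for pct, ma in upscale.items():
    err = error_pct_of_span(ma, pct)
    status = "PASS" if abs(err) <= TOL_PCT else "FAIL"
    print(f"{pct:>3}%: {ma:.3f} mA  error {err:+.3f}% of span  {status}")
```

Note how the ±0.016 mA figure in the criteria list is simply 0.1% of the 16 mA span.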
02
Temperature Measurement

Temperature Transmitter Calibration SOP

Purpose: To verify and correct the output of a temperature transmitter (connected to RTD or thermocouple) so its 4–20 mA signal correctly represents the calibrated temperature span.

Frequency: 6–12 months
Standard: IEC 60751, ASTM E230
Accuracy: ±0.1°C (RTD) / ±0.5°C (TC)
Method: Dry Block / Bath / Simulator

Required Equipment: Temperature calibrator (dry block or liquid bath with ±0.05°C accuracy), resistance/mV simulator or HART communicator, precision thermometer (reference standard), loop calibrator/multimeter, connection leads.

⚠️
Safety: If calibrating in-situ in a hot process, use insulated gloves. For transmitters on steam or heat transfer lines, fully isolate before removing the sensor. Allow the thermowell to cool before sensor removal.
ST-01
Method Selection: Choose Simulation Method (inject known resistance/mV directly into transmitter input terminals — faster, no process disturbance) or Comparison Method (insert sensor in dry block beside a reference standard — tests full system). Simulation is preferred for routine calibration.
ST-02
Simulation Setup (RTD): Disconnect the RTD from the transmitter input terminals. Connect a precision decade resistance box or RTD simulator. Set the simulator to the resistance value corresponding to the Lower Range Value (LRV) temperature per the PT100 table.
ST-03
Simulation Setup (Thermocouple): Disconnect TC leads and connect an mV calibrator. Set millivolt output corresponding to LRV temperature using the appropriate TC reference table (e.g., Type K: 0°C = 0.000 mV). Ensure cold junction compensation (CJC) is disabled or accounted for.
ST-04
5-Point Calibration: Apply simulated inputs at 0%, 25%, 50%, 75%, 100% of the temperature span. At each point record the simulated input temperature, DCS indicated temperature, and the mA output. Allow 15–20 seconds settling time per point.
ST-05
Zero & Span Trim: If errors exceed tolerance, use the HART communicator to perform Digital Zero Trim and Analog Output Trim. Re-run the 5-point check to confirm corrections. Never trim without recording as-found values first.
ST-06
CJC Verification (TC only): Measure the temperature at the transmitter terminals. Compare with the CJC reading displayed on the HART communicator. Error should be less than ±1°C. Replace the transmitter if CJC circuit shows large deviation.
ST-07
Reconnect & Verify: Reconnect the sensor to the transmitter. Confirm the DCS reading is stable and plausible for the current process temperature. Complete the calibration record with as-found and as-left data.
✓ Acceptance Criteria
  • RTD-based transmitter: Total error ≤ ±0.5°C or ±0.25% of span
  • TC-based transmitter: Total error ≤ ±1.0°C or ±0.5% of span
  • mA output error: ≤ ±0.1% of 16 mA span (±0.016 mA)
  • CJC accuracy (TC): ≤ ±1°C at ambient temperature
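For the simulation setup in ST-02, the decade-box setpoints can be computed from the IEC 60751 polynomial rather than read off a printed PT100 table. A minimal sketch (coefficients per IEC 60751; the 0–200 °C example span is hypothetical):

```python
# Sketch: Pt100 resistance for a given simulation temperature,
# using the standard IEC 60751 coefficients.

A = 3.9083e-3
B = -5.775e-7
C = -4.183e-12
R0 = 100.0  # Pt100 nominal resistance at 0 °C

def pt100_resistance(t_c: float) -> float:
    """Resistance in ohms at temperature t_c (°C), per IEC 60751.
    The C term applies only below 0 °C."""
    if t_c >= 0:
        return R0 * (1 + A * t_c + B * t_c**2)
    return R0 * (1 + A * t_c + B * t_c**2 + C * (t_c - 100) * t_c**3)

# 5-point decade-box setpoints for a hypothetical 0–200 °C span
for pct in (0, 25, 50, 75, 100):
    t = 200.0 * pct / 100.0
    print(f"{pct:>3}% ({t:6.1f} °C): {pt100_resistance(t):8.3f} Ω")
```

The 100 °C result (≈138.51 Ω) can be cross-checked against any published PT100 table before dialling it into the decade box.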
03
Final Control Element

Control Valve Calibration SOP

Purpose: To calibrate the valve positioner so that the valve stem travel accurately follows the control signal (4–20 mA or 3–15 psi), and to verify bench set, stroke, and fail-safe position.

Frequency: Annually / after maintenance
Standard: ISA-75.01, IEC 60534
Signal: 4–20 mA / 3–15 psi
Supply Air: Typically 1.4–7 bar (20–100 psi)

Required Equipment: Loop calibrator (4–20 mA source), air supply regulator, position indicator/scale (or LVDT readout), laptop with valve diagnostic software (if smart positioner), stroke ruler, multimeter.

⚠️
Safety: Control valve calibration must be done with the valve taken out of service. Confirm the bypass valve is open and the line is under manual control. Never stroke a valve under live process conditions without authorisation. Isolate supply air before disconnecting any tubing.
ST-01
Bench Set Verification: Apply bench set air pressure range (e.g., 0.2–1.0 bar) directly to the actuator diaphragm using a hand pump. Valve should start to move at the lower bench set value and reach full stroke at the upper. Record actual values — they should match the nameplate within ±0.05 bar.
ST-02
Fail-Safe Check: Remove all air pressure and control signal. Confirm the valve moves to its fail-safe position — Fail Open (FO) or Fail Closed (FC) as tagged on the valve body. This is critical for safety — do not proceed if fail-safe is incorrect.
ST-03
Zero (4 mA → 0% travel): Apply 4.000 mA to the positioner input. The valve should be at 0% open (fully closed for air-to-open, fully open for air-to-close). Adjust the positioner zero until stem position is exactly at the travel stop. Record as-found deviation.
ST-04
Span (20 mA → 100% travel): Apply 20.000 mA. Valve should be at 100% travel. Adjust positioner span/gain if required. Re-check zero after span adjustment — they interact in most positioners.
ST-05
5-Point Stroke Test: Apply 4, 8, 12, 16, 20 mA (0%, 25%, 50%, 75%, 100%). Measure actual stem position (mm or %) at each point using the position indicator. Repeat in reverse. Record hysteresis and dead band values.
ST-06
Smart Positioner Auto-Tune (if applicable): For HART/FF positioners (e.g., Emerson DVC, ABB TZID), run the auto-calibration routine from the communicator or laptop. This automatically sets travel limits, characterisation, and tuning parameters. Verify results against nameplate data.
ST-07
Packing & Stem Seal Check: While valve is stroked, observe the stem packing for leaks. Check for stiction (jerky movement) which indicates packing too tight or actuator friction. Report any abnormalities before returning to service.
✓ Acceptance Criteria
  • Linearity: Actual travel within ±2% of theoretical at all 5 points
  • Hysteresis: ≤ 2% of full travel
  • Dead band: ≤ 1% of full travel
  • Fail-safe position: As per P&ID and valve tag — must be correct
  • Bench set: Within ±0.05 bar of nameplate values
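The linearity and hysteresis figures above come straight from the 5-point stroke data of ST-05. A minimal sketch of the arithmetic, using hypothetical as-found travel readings:

```python
# Sketch: linearity error and hysteresis from a 5-point stroke test.
# Travel values (%) at each mA point are hypothetical as-found data.

signals   = [4.0, 8.0, 12.0, 16.0, 20.0]    # mA input
upscale   = [0.0, 24.6, 49.8, 75.1, 99.8]   # % travel, increasing signal
downscale = [0.3, 25.9, 51.0, 76.2, 100.0]  # % travel, decreasing signal

def ideal_travel(ma: float) -> float:
    """Theoretical travel for a linear 4–20 mA -> 0–100% characteristic."""
    return (ma - 4.0) / 16.0 * 100.0

# Worst-case deviation from the theoretical characteristic (limit: 2%)
linearity = max(abs(u - ideal_travel(s)) for s, u in zip(signals, upscale))
# Worst-case upscale/downscale difference at the same signal (limit: 2%)
hysteresis = max(abs(u - d) for u, d in zip(upscale, downscale))

print(f"Worst linearity error: {linearity:.2f}%")
print(f"Worst hysteresis:      {hysteresis:.2f}%")
```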
04
Local Indication

Pressure Gauge Calibration SOP

Purpose: To verify that a bourdon tube or diaphragm pressure gauge reads within its stated accuracy class across its full scale, and to identify gauges that need replacement or adjustment.

Frequency: Annually / after impact
Standard: ASME B40.100, EN 837
Accuracy Class: Class 1.0 / 0.5 / 0.25
Method: Dead Weight / Pump

Required Equipment: Dead weight tester or calibrated pneumatic/hydraulic hand pump, reference standard gauge (4× accuracy of test gauge), calibration sticker kit, spanner/wrench, thread sealant (PTFE tape).

ℹ️
Selection Rule: The reference standard must have an accuracy at least 4 times better than the gauge under test. For a Class 1.0 gauge (±1%), use a reference of ±0.25% or better. This is called the 4:1 Test Uncertainty Ratio (TUR) per ISO/IEC 17025.
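The 4:1 rule reduces to a one-line ratio check. A trivial sketch:

```python
# Sketch: Test Uncertainty Ratio (TUR) check for reference selection.
# The reference must be at least 4x better than the gauge under test.

def tur(gauge_tol_pct: float, reference_tol_pct: float) -> float:
    """Ratio of gauge tolerance to reference tolerance."""
    return gauge_tol_pct / reference_tol_pct

# Class 1.0 gauge (±1%) against a ±0.25% reference standard
print(f"TUR = {tur(1.0, 0.25):.1f}:1 (need >= 4:1)")
```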
ST-01
Visual Inspection: Before connecting, inspect the gauge for cracked glass/window, bent pointer, corroded fitting, and damaged dial face. A gauge with physical damage should be condemned — do not calibrate, replace instead. Record the gauge tag number, range, and accuracy class.
ST-02
Zero Check at Atmosphere: With no pressure applied and the gauge vented, confirm the pointer rests exactly on zero. Note any zero offset (e.g., pointer sits at +0.5% full scale). Some gauges allow zero adjustment via a small screw on the face.
ST-03
5-Point Upscale Test: Apply reference pressure at 20%, 40%, 60%, 80%, 100% of full scale. At each point, tap the gauge face lightly to overcome friction (this is permitted per ASME B40.100), then read and record the indicated value versus the reference.
ST-04
5-Point Downscale Test: Reduce pressure through the same 5 points. Record readings. Compare upscale vs downscale to calculate hysteresis. Calculate error as a percentage of full scale: Error% = (Indicated − Reference) / Full Scale × 100.
ST-05
Pointer Adjustment (if available): Many gauges allow span adjustment by changing the link position on the Bourdon tube. This is a workshop procedure — field adjustment is typically only the zero screw. Gauges with non-adjustable errors beyond tolerance must be replaced.
ST-06
Labelling & Return to Service: Affix a calibration sticker showing: date calibrated, next due date, technician initials, and PASS/FAIL status. If passed, reinstall with correct thread sealant. Update the gauge register/asset management system.
✓ Acceptance Criteria
  • Class 1.0: Maximum error ≤ ±1.0% of full scale at all points
  • Class 0.5: Maximum error ≤ ±0.5% of full scale
  • Class 0.25: Maximum error ≤ ±0.25% of full scale
  • Hysteresis: Included within the above class tolerance
  • Zero return: Pointer must return to zero ±½ the class tolerance after pressure release
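The Error% formula from ST-04 and the class limits above combine into a quick pass/fail pass over the test points. A sketch with a hypothetical 0–10 bar Class 1.0 gauge:

```python
# Sketch: pressure gauge error as % of full scale, checked against the
# accuracy class. All readings below are hypothetical.

FULL_SCALE = 10.0  # bar (hypothetical gauge range)
CLASS_TOL = 1.0    # Class 1.0 -> ±1.0% of full scale

def error_pct_fs(indicated: float, reference: float) -> float:
    """Error% = (Indicated - Reference) / Full Scale x 100, per ST-04."""
    return (indicated - reference) / FULL_SCALE * 100.0

# (reference bar, indicated bar) at 20/40/60/80/100% of full scale
points = [(2.0, 2.03), (4.0, 4.05), (6.0, 6.04), (8.0, 8.07), (10.0, 10.06)]

for ref, ind in points:
    err = error_pct_fs(ind, ref)
    status = "PASS" if abs(err) <= CLASS_TOL else "FAIL"
    print(f"ref {ref:5.2f} bar  ind {ind:5.2f} bar  error {err:+.2f}% FS  {status}")
```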
05
Vacuum Measurement

Vacuum Gauge Calibration SOP

Purpose: To verify the accuracy of a vacuum gauge (compound gauge or absolute vacuum gauge) across its measurement range, ensuring reliable measurement in vacuum distillation, evaporation, and pharmaceutical drying applications.

Frequency: 6–12 months
Standard: ISO 3529, ASME B40.100
Range Types: Compound / Full Vacuum
Units: mbar, mmHg, kPa abs, inHg

Required Equipment: Calibrated vacuum pump with controlled bleeding valve, reference vacuum standard (precision absolute pressure transducer or McLeod gauge), vacuum manifold with T-piece, flexible vacuum tubing, multimeter (for electronic gauges).

ℹ️
Key Concept: A compound gauge reads both pressure (above atmosphere) and vacuum (below atmosphere) — range is typically –1 bar to +X bar. A vacuum gauge reads only below atmosphere. The reference standard must cover the full vacuum range being tested. For deep vacuum (below 1 mbar), a Pirani or capacitance manometer reference is required.
ST-01
Atmospheric Reference: Vent the gauge to atmosphere and confirm the pointer reads zero (compound gauge) or reads current atmospheric pressure (absolute gauge). Record the barometric pressure using a barometer — this is your reference for atmospheric point calculation.
ST-02
System Leak Check: Connect the gauge under test and the reference transducer to the vacuum manifold. Pull vacuum to the maximum test point, then isolate the pump. Monitor pressure for 2 minutes — a rise of less than 1% of full scale indicates acceptable system tightness. Repair leaks before calibrating.
ST-03
5-Point Vacuum Calibration: Using the controlled bleed valve, set vacuum to 20%, 40%, 60%, 80%, 100% of the gauge’s vacuum range. At each point, compare the gauge reading to the reference transducer reading. Allow 30 seconds for stabilisation — vacuum systems equilibrate slowly.
ST-04
Downscale Test: Slowly bleed air back into the system through the same 5 points. Record the gauge reading vs reference at each point. Calculate hysteresis — vacuum gauges are prone to higher hysteresis than pressure gauges due to diaphragm/Bourdon tube flexibility.
ST-05
Zero Return Check: Fully vent to atmosphere. Confirm the pointer returns to zero (or atmospheric pressure for absolute gauges). A gauge that does not return to zero after vacuum has a damaged sensing element and must be replaced.
ST-06
Label & Record: Affix calibration sticker with pass/fail, date, and next calibration date. For electronic vacuum transmitters, update the calibration record in the asset management system and confirm the loop reading is correct on the DCS.
✓ Acceptance Criteria
  • Compound gauge (vacuum side): ≤ ±1.0% of full scale range
  • Full vacuum gauge: ≤ ±2% of full scale (due to non-linearity at deep vacuum)
  • Zero return after full vacuum: Within ±1 graduation
  • Hysteresis: Included within stated accuracy class
  • Leak rate during leak test: < 1% of full scale per 2 minutes
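The leak-rate criterion from ST-02 is a simple percentage-of-full-scale comparison. A minimal sketch with hypothetical manifold readings:

```python
# Sketch: leak check acceptance per ST-02 — the pressure rise over the
# 2-minute isolation period must stay below 1% of full scale.
# Range and readings below are hypothetical.

FULL_SCALE_MBAR = 1000.0  # hypothetical full-vacuum gauge range
LIMIT_PCT = 1.0           # <=1% of full scale per 2 minutes

def leak_check(p_start_mbar: float, p_end_mbar: float) -> bool:
    """True if the pressure rise over the test period is acceptable."""
    rise_pct = abs(p_end_mbar - p_start_mbar) / FULL_SCALE_MBAR * 100.0
    return rise_pct <= LIMIT_PCT

print(leak_check(5.0, 12.0))  # 0.7% rise -> tight enough to calibrate
print(leak_check(5.0, 20.0))  # 1.5% rise -> repair leaks first
```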
06
Analytical Measurement

pH Sensor Calibration SOP

Purpose: To calibrate a glass-membrane pH electrode system against certified buffer solutions, establishing the correct offset (zero point, pH 7.00) and slope (electrode sensitivity, mV/pH) for accurate inline pH measurement.

Frequency: Daily / weekly / per batch
Standard: ASTM E70, ISO 10523
Method: 2-point / 3-point buffer
Accuracy: ±0.02 pH (lab) / ±0.1 pH (field)

Required Equipment: Certified pH buffer solutions (pH 4.01, pH 7.00, pH 10.01 — NIST traceable), deionised/distilled rinse water, lint-free wipes or KCl soaking solution, pH transmitter/analyser, thermometer (for temperature correction), calibration log sheet.

⚠️
Buffer Handling: Always use fresh buffer from a sealed bottle. Buffers contaminated by dipping a used electrode or exposed to CO₂ from air will give false calibration. Never pour used buffer back into the stock bottle. Check buffer expiry date — expired buffers cause systematic errors that are impossible to detect without a reference electrode.
ST-01
Electrode Inspection & Conditioning: Remove the electrode from the process or storage solution. Inspect the glass bulb for cracks, coating, or fouling. Rinse with deionised water and gently blot dry (never rub — this creates static charges). If electrode was in storage solution, let it equilibrate in pH 7 buffer for 5 minutes before calibrating.
ST-02
Temperature Measurement: Measure and record the temperature of the buffer solutions. pH values are temperature-dependent — the certified buffer value on the bottle is given at 25°C. Use the temperature correction table on the buffer bottle or enable automatic temperature compensation (ATC) on the transmitter using an integrated Pt100.
ST-03
First Buffer — Offset Calibration (pH 7.00): Immerse the electrode in the pH 7.00 buffer. Wait until the reading stabilises (typically 30–60 seconds, longer for aged electrodes). Press “CAL” or “SET BUFFER 1” on the transmitter. The offset (zero point, also called asymmetry potential) is now set. A healthy electrode should show a raw mV reading within ±30 mV of 0 mV at pH 7.
ST-04
Rinse Between Buffers: Remove the electrode from the pH 7 buffer. Rinse thoroughly with deionised water (use a wash bottle). Gently blot dry with a lint-free wipe. This step is critical — carry-over from one buffer to another contaminates the second buffer and gives a false slope.
ST-05
Second Buffer — Slope Calibration (pH 4.01 or 10.01): Immerse in the second buffer. Choose pH 4.01 for acidic process ranges, pH 10.01 for alkaline ranges (use a buffer closest to your process). Wait for stabilisation and press “SET BUFFER 2”. The transmitter calculates slope in mV/pH. A new electrode gives 95–105% slope (ideal: 59.16 mV/pH at 25°C per Nernst equation). An electrode with slope below 90% should be replaced.
ST-06
Optional Third Buffer Verification: For high-accuracy applications (pharmaceutical, food, water treatment), verify with a third buffer (pH 10.01 if it was not used as the second calibration point, otherwise pH 4.01). Do not use it as a calibration point — it is only a check. If the reading deviates more than ±0.1 pH from the certified value, the electrode linearity is compromised.
ST-07
Reinstall & Confirm: Rinse with deionised water, reinstall the electrode in the process holder or flow cell. Allow 5–10 minutes for the electrode to equilibrate to the process temperature. Confirm the DCS/SCADA reading is stable and reasonable. Record offset value (mV), slope (%), temperature, buffer lot numbers, and calibration date on the calibration certificate.
✓ Acceptance Criteria
  • Electrode offset at pH 7: Raw mV reading within ±30 mV of 0 mV
  • Electrode slope: 95–105% of Nernst ideal (56.2–62.1 mV/pH at 25°C)
  • Third buffer verification (if done): Reading within ±0.1 pH of certified value
  • Electrode impedance (if smart transmitter): 50–500 MΩ (outside = replace electrode)
  • Response time to stable reading: Less than 60 seconds (longer = ageing electrode)
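The offset and slope checks above follow directly from the two-buffer mV readings. A minimal sketch against the Nernst ideal of 59.16 mV/pH at 25 °C; the mV values are hypothetical:

```python
# Sketch: electrode offset and slope from a two-buffer calibration,
# compared to the Nernst ideal. All mV readings are hypothetical.

NERNST_25C = 59.16  # ideal slope, mV/pH at 25 °C

def slope_percent(mv_7: float, mv_2nd: float, ph_2nd: float) -> float:
    """Slope as % of Nernst ideal, from pH 7.00 and a second buffer."""
    measured_slope = (mv_7 - mv_2nd) / (ph_2nd - 7.0)
    return abs(measured_slope) / NERNST_25C * 100.0

offset_mv = 8.0   # raw reading in pH 7.00 buffer (limit: ±30 mV)
mv_in_4 = 178.0   # raw reading in pH 4.01 buffer
slope = slope_percent(offset_mv, mv_in_4, 4.01)

print(f"Offset: {offset_mv:+.1f} mV (accept within ±30 mV)")
print(f"Slope:  {slope:.1f}% of Nernst (accept 95–105%)")
```

An electrode with a healthy offset but a slope drifting toward 90% is ageing: plan its replacement rather than re-calibrating more often.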
📋
Documentation is Non-Negotiable. Every calibration must generate a signed certificate with: instrument tag, location, make/model/serial number, calibration range, as-found data, as-left data, reference equipment (with certificate numbers and traceability), technician name, date, and next calibration due date. Retain records for a minimum of 5 years or as required by site regulations. ATEX/SIL instruments require calibration records to be kept for the life of the device.