How to Check the Readings on Your Infrared Thermometer

5 min read

The third article of our infrared thermometer series explains how to properly validate the accuracy of an infrared device in the field. Our first article looked at emissivity and how to get an accurate reading, while the second focused on how to clean and store an infrared thermometer. If you haven't already, we also recommend reading these articles to fully understand how infrared works before attempting a calibration.

Calibration vs. validation

The process of calibrating a thermometer can only be done in a controlled laboratory environment. Validation, in which an instrument's reading is compared against a calibrated reference to check its accuracy, is what is described here. If an instrument's reading is found to be inaccurate when validated against a calibrated thermometer, it must be sent to a laboratory for repair or re-calibration.

Why validating a temperature on an IR instrument is different from calibrating a penetration probe

Infrared thermometers measure only surface temperatures and should therefore be used only as a quick guide. This is because measurement accuracy is affected by many factors and variables, such as surface emissivity, material type, transparency, color and reflectivity (read our complete guide to getting accurate infrared readings here). An infrared thermometer must be validated against a “master” thermometer calibrated in the laboratory against a known temperature source. The best way to control the emissivity and temperature of a surface, and so obtain a true reading from an infrared thermometer, is to use a solid black body. This minimizes most external factors and prevents the temperature from changing too quickly.

Emissivity

As we saw in our previous blog post on the accuracy and limitations of infrared, emissivity plays a huge role when calibrating IR thermometers.

Depending on what you point your infrared thermometer at, the amount of infrared energy emitted will vary. Emissivity is a measure of a material's ability to emit infrared energy, measured on a scale of approximately 0.01 to 1.00. Generally, the closer a material's emissivity is to 1.00, the more likely that material is to absorb reflected or ambient infrared energy and emit only its own infrared radiation. Click here to learn more about emissivity.

What equipment is needed to validate the accuracy of an infrared instrument?

At thermometre.fr we are able to provide a traceable calibration certificate on all infrared thermometers.

To verify the accuracy of an infrared thermometer in the field, a thermometer comparator and a high-precision calibrated “master” thermometer, such as a reference thermometer, are necessary. The thermometer comparator consists of an aluminum cup with a solid matte black base. The base incorporates two holes for taking the internal temperature of the base with a "master" thermometer. An infrared thermometer can then be held over the mouth of the cup to take the temperature of the surface of the base.

How to validate a temperature on an infrared instrument?

Make sure the comparator and infrared thermometer are clean and free of any debris or substances that could affect the reading (read our complete guide on cleaning and storing your IR device here ).

Place the thermometer comparator on a flat surface.

Insert the reference thermometer probe into one of the test holes in the base and allow it to stabilize. The time this takes depends on the response time of the probe.

If the infrared device has adjustable emissivity, make sure it is set to 0.95, the correct setting for the matte black surface of the thermometer comparator.

Point the thermometer at the bottom of the comparator and take a reading. The instrument should read within 1°C of the reference thermometer at an ambient temperature of 22°C, depending on the accuracy of the thermometer.
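The pass/fail check in the final step can be sketched in a few lines. This is a hypothetical helper, not part of any instrument's software; the ±1°C tolerance is the figure quoted above, and your device's datasheet may specify a different value.

```python
# Hypothetical validation check: does the IR reading agree with the
# reference thermometer to within the stated tolerance (+/-1 degC here)?

def is_within_tolerance(ir_reading_c: float,
                        reference_c: float,
                        tolerance_c: float = 1.0) -> bool:
    """Return True if the IR reading matches the reference thermometer."""
    return abs(ir_reading_c - reference_c) <= tolerance_c

# Example: reference reads 22.0 degC in the comparator base
print(is_within_tolerance(22.6, 22.0))  # True: within +/-1 degC
print(is_within_tolerance(24.2, 22.0))  # False: send for re-calibration
```

If the check fails, the instrument should be returned to a laboratory, as described in the calibration vs. validation section above.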

At what temperature can an infrared instrument be validated?

The accuracy of an infrared thermometer can be checked using a comparator at any stable temperature. However, to reduce the possibility of a temperature difference between the interior surface and the base test hole, it is most accurate at an ambient room temperature of 22°C.

Thermal stability

Using an infrared thermometer in hot or cold temperatures will increase the possibility of thermal instability.

For every 1°C of ambient temperature above or below 22°C, an adjustment factor must be added to the stated accuracy of the instrument to account for thermal instability. This is typically 0.05°C for RayTemp thermometers; other infrared thermometers may have a different value. Here is a table showing the values to consider when using a RayTemp 2 thermometer in cold or hot environments.

[Table: thermal-stability adjustment values for the RayTemp 2 in cold and hot environments]

*Accuracy and thermal stability of other instruments may vary.
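The adjustment described above can be expressed as a simple calculation. This is a sketch under stated assumptions: the 0.05°C-per-degree factor is the RayTemp figure quoted in the article, while the base accuracy of ±1.0°C used in the example is an assumed value, not a published specification.

```python
# Sketch of the thermal-stability adjustment: for every 1 degC of
# ambient temperature away from 22 degC, widen the instrument's stated
# accuracy by an adjustment factor (0.05 degC per degC for RayTemp
# thermometers, per the article; other instruments may differ).

def adjusted_accuracy(base_accuracy_c: float,
                      ambient_c: float,
                      factor_c_per_deg: float = 0.05,
                      reference_ambient_c: float = 22.0) -> float:
    """Accuracy band widened for thermal instability at ambient_c."""
    deviation = abs(ambient_c - reference_ambient_c)
    return base_accuracy_c + factor_c_per_deg * deviation

# Example with an assumed base accuracy of +/-1.0 degC:
print(adjusted_accuracy(1.0, 22.0))  # 1.0: no adjustment at 22 degC
print(adjusted_accuracy(1.0, 2.0))   # 2.0: 20 degC below ambient
```

At 2°C, for instance, the tolerance for the validation check would roughly double, which is why validating at around 22°C gives the tightest result.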

 

Do's and don'ts

Do: Calibrate at an ambient temperature of around 22°C if possible.

Don't: Change the temperature surrounding the comparator before validation, or the surface temperature may differ from the internal temperature.

Do: Be aware of external factors that can prevent a correct IR reading of the comparator, such as humidity, frost and debris.

Don't: Position the infrared thermometer too far away, or at an angle, when taking the temperature from the comparator, as this may give an inaccurate reading.

Do: Perform measurements as quickly as possible to prevent the surface temperature from changing.

Don't: Forget that thermometers need time to acclimatize to a different environment.
