
How Do I Interpret Chamber Set Point Tolerances in Standards?

Posted by Bill Tobin on Nov 14, 2022 9:47:00 AM

Weathering and corrosion test standards usually specify chamber set point conditions with a plus/minus value. For example, ISO 4892-2 contains the following test parameter:

Black Standard Temperature: 65 ± 3°C

Importantly, most weathering standards further specify what this requirement actually means, and in most cases it is defined as allowable Operational Fluctuation. This means that the tester’s irradiance, temperature, or humidity control system must be capable of maintaining stable conditions at the specified set point. Typically, each test parameter is measured and controlled by a single sensor at a fixed location in the test chamber. For the ISO 4892-2 example above, once chamber conditions have stabilized, the temperature measured by this single control sensor should never rise above 68°C or fall below 62°C.
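
To make this concrete, here is a minimal sketch (not taken from ISO 4892-2 or any tester's software) of how a logged, post-stabilization temperature series from the single control sensor could be checked against the 65 ± 3°C operational fluctuation band. The function name, variable names, and sample readings are illustrative assumptions.

```python
# Hypothetical check of logged control-sensor temperatures against an
# operational fluctuation band of set point ± tolerance (here 65 ± 3 °C).

SET_POINT_C = 65.0
TOLERANCE_C = 3.0

def check_operational_fluctuation(readings_c, set_point=SET_POINT_C, tol=TOLERANCE_C):
    """Return (index, value) pairs that fall outside set_point ± tol.

    readings_c is assumed to be temperatures from the single control sensor,
    logged after the chamber has stabilized at the set point.
    """
    low, high = set_point - tol, set_point + tol
    return [(i, t) for i, t in enumerate(readings_c) if t < low or t > high]

# Hypothetical logged values: 66.8 °C is acceptable, 68.4 °C is a violation.
log = [64.9, 65.2, 66.8, 68.4, 65.0]
print(check_operational_fluctuation(log))  # [(3, 68.4)]
```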

Unfortunately, these tolerances are frequently misinterpreted as uniformity or programming tolerances for parameters like chamber temperature, relative humidity, or irradiance. Two important points will help you avoid misinterpreting the intent of one of these set points:

  1. This does not indicate a uniformity requirement. When uniformity requirements do exist, they are much wider than operational fluctuation requirements.
  2. This does not allow the user to program the machine to any value within the tolerance range. The tester should be programmed at the stated value, as the sketch below illustrates.
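
The following sketch illustrates point 2. It is a hypothetical helper, not part of any standard or tester software: it rejects any programmed value other than the stated set point, even one that lies inside the tolerance band.

```python
# Hypothetical guard: the programmed set point must equal the stated value,
# not merely fall somewhere inside the tolerance band.

def validate_programmed_set_point(programmed_c, stated_c=65.0):
    """Reject any programmed value that differs from the stated set point."""
    if programmed_c != stated_c:
        raise ValueError(
            f"Program the tester at {stated_c} °C exactly; "
            f"{programmed_c} °C is not acceptable even if it is within ±3 °C."
        )

validate_programmed_set_point(65.0)    # OK
# validate_programmed_set_point(67.0)  # raises ValueError
```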

Visit the Q-Lab Weathering & Corrosion Blog for more posts like this.

Topics: FAQ, Standards