Question
I'm working with an electronic temperature logger that is being affected by heat generated internally.
How does one come up with a calibration equation to calculate a more accurate reading of ambient temperature based on what the temperature sensor reads, taking into account its own power consumption?
Details:
After a few hours, once in equilibrium, the sensor reports values that are 1 °C higher than the ambient room temperature (22 °C) measured by a calibrated device. The sensor is accurate to 0.1 °C at reporting the temperature of the device itself (which, due to heat generated by the electronics, has gotten warmer).
The device consumes ~0.1 W of power, weighs about 200 g, and has an average specific heat capacity of 1.0 J/(g·°C) (a weighted mix of glass, ABS, FR-4, and copper). Dimensions are 1" × 3" × 4".
What I've got so far is this heating calculation: 200 g × 1 °C × 1.0 J/(g·°C) = 200 J of energy, and 200 J / 0.1 W = 2000 s ≈ 33 minutes to heat up 1 °C.
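(As a quick sanity check of that arithmetic, here is a minimal sketch, assuming all the dissipated power goes into heating the device with no losses; the variable names are just illustrative.)

```python
# Heat-up estimate using the numbers from the question.
mass_g = 200.0   # device mass
c_spec = 1.0     # specific heat, J/(g*degC), weighted average of materials
delta_t = 1.0    # temperature rise of interest, degC
power_w = 0.1    # internal power dissipation

energy_j = mass_g * c_spec * delta_t  # 200 J to warm the device by 1 degC
time_s = energy_j / power_w           # 2000 s, ignoring any heat loss
print(f"~{time_s / 60:.0f} minutes to heat up 1 degC")  # ~33 minutes
```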
I'm assuming what we need is something like: actual temperature = sensor value − heat generated + heat dissipated. That would require measuring the k in Newton's law of cooling; then what?
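(One way this could shake out at steady state: once the device is in equilibrium, power in equals power out, so P = k·(T_device − T_ambient) and the correction reduces to T_ambient = T_sensor − P/k. Below is a hedged sketch using only the question's own numbers to estimate k; in practice k would be measured against a calibrated reference, and it will vary with airflow and mounting.)

```python
# Steady-state correction from Newton's law of cooling.
# At equilibrium: P = k * (T_device - T_ambient)
# so:             T_ambient = T_sensor - P / k
power_w = 0.1                   # internal dissipation, from the question
offset_c = 1.0                  # observed equilibrium offset vs. reference
k_w_per_c = power_w / offset_c  # ~0.1 W/degC, assumed constant here

def ambient_from_sensor(t_sensor_c: float, p_w: float = power_w) -> float:
    """Estimate ambient temperature from the (self-heated) sensor reading."""
    return t_sensor_c - p_w / k_w_per_c

print(ambient_from_sensor(23.0))  # -> 22.0, matching the example above
```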
I'd really appreciate your help here.
Explanation / Answer
Typically you would attempt to measure rather than calculate the effect.
Perhaps by using a second, calibrated device with a long probe that provides an independent measurement of the temperature. You do this in situ if possible, or in some reasonable test stand (which might be as simple as a disposable cooler filled with your working fluid).
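(If you go the measurement route, one hypothetical way to turn the paired readings into a correction is a simple least-squares line fit; the sample data below is made up purely for illustration.)

```python
# Hypothetical sketch: derive a correction empirically by logging paired
# readings (logger vs. calibrated reference probe) and fitting a line.
import numpy as np

sensor_c    = np.array([19.1, 21.0, 23.0, 25.2, 27.1])  # logger readings
reference_c = np.array([18.0, 20.0, 22.0, 24.1, 26.0])  # calibrated probe

# Least-squares fit: reference ~= slope * sensor + intercept
slope, intercept = np.polyfit(sensor_c, reference_c, 1)

def corrected(t_sensor_c: float) -> float:
    """Apply the fitted calibration to a raw logger reading."""
    return slope * t_sensor_c + intercept

print(corrected(23.0))  # close to 22.0 for this made-up data
```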
The only real alternative is to read the data sheet: either for the whole device (if it is an off-the-shelf instrument) or for the particular chip (if it is something you manufactured to spec).
As a desperate fall-back position you might be able to find a rule of thumb for devices in the same class, but those are unlikely to be centrally tabulated. Asking around is my best suggestion.