Tale of the Thermometer
The ability to measure temperature became important in many industries in the 1800s. But how could one measure it? Scientists did not yet understand what caused heat, and the idea that temperature was a measure of the speed at which atoms move was not universally accepted.
Early thermometers used the thermal expansion of a liquid held in a narrow tube to measure temperature, writes Jeremy Webb in his book, Nothing: Surprising Insights Everywhere from Zero to Oblivion. The level of the liquid was marked at two “fixed temperatures”, such as the freezing and boiling points of water, and everything in between was marked off as intermediate values.
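To make that marking procedure concrete, here is a minimal sketch in Python. The column heights and fixed-point values are invented for illustration, and the linear interpolation between the two marks is precisely the assumption the quote below calls into question.

```python
# A minimal sketch of two-point calibration. The heights and fixed-point
# temperatures are made-up numbers, used only to illustrate the procedure.

def calibrate(height_at_freezing, height_at_boiling,
              t_freezing=0.0, t_boiling=100.0):
    """Return a function that converts a liquid-column height to a temperature."""
    def read_temperature(height):
        # Linear interpolation: assumes the liquid expands an equal
        # amount for every unit rise in temperature.
        fraction = (height - height_at_freezing) / (height_at_boiling - height_at_freezing)
        return t_freezing + fraction * (t_boiling - t_freezing)
    return read_temperature

# Example: a column that sits at 20 mm in freezing water and 220 mm in boiling water.
thermometer = calibrate(height_at_freezing=20.0, height_at_boiling=220.0)
print(thermometer(120.0))  # 50.0 degrees -- but only if the expansion really is linear
```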
But this created a catch-22!
“The scale-marking process assumes that the liquid expands an equal amount for every unit rise in temperature. But this assumption cannot be verified unless one measures the thermal expansion of the liquid, and to do that one requires ... a thermometer.”
It was in the 1840s that a French scientist, Henri Victor Regnault, found a way out. He showed that an 'air thermometer', which measures changes in the pressure of dry air in a sealed container, was superior in both reproducibility and inter-comparability. How did he prove that?
“Different designs of air thermometer calibrated at the freezing and boiling points of water gave consistent estimates of temperatures.”
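In modern terms (this reasoning is a gloss, not part of the quoted passage): dry air at moderate pressure behaves almost like an ideal gas, so at fixed volume its pressure rises in proportion to absolute temperature, and a two-point calibration of pressure is therefore nearly exact no matter how the instrument is built. A small sketch with invented pressure readings shows why differently built air thermometers agree:

```python
# A sketch of air-thermometer agreement, assuming near-ideal-gas behaviour:
# at constant volume, pressure is proportional to absolute temperature.
# All pressure readings below are invented for illustration.

def air_thermometer(p_freezing, p_boiling, t_freezing=273.15, t_boiling=373.15):
    """Map a measured pressure to a temperature in kelvin, assuming P grows linearly with T."""
    def read_temperature(pressure):
        fraction = (pressure - p_freezing) / (p_boiling - p_freezing)
        return t_freezing + fraction * (t_boiling - t_freezing)
    return read_temperature

# Two hypothetical instruments of different size, each calibrated at the
# freezing and boiling points of water, agree on an intermediate reading:
small_bulb = air_thermometer(p_freezing=100.0, p_boiling=136.6)
large_bulb = air_thermometer(p_freezing=50.0, p_boiling=68.3)
print(small_bulb(118.3), large_bulb(59.15))  # both read about 323 K
```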
So yes, the air thermometer was a “true” measure of temperature. But it was hard to use, so it was instead put to work calibrating other, more practical thermometers.
Ironically, people could now measure temperature while still not knowing exactly what they were measuring!