# 11.1.1 18th Century Temperature Scales

The job of a thermometer is to measure temperature. Strictly speaking, though, a thermometer only measures its own temperature. Consequently, the thermometer must first achieve thermal equilibrium[1] with the object whose temperature we’re trying to measure.

For example, to measure the temperature of a beaker of water using a mercury-in-glass thermometer, we must first allow the mercury to attain thermal equilibrium with the water (by immersing the bulb of the thermometer in the beaker of water). Next we inspect the volume of the mercury. Since mercury expands and contracts measurably with temperature, its volume makes a good thermometric property. All that’s needed now is a scheme to convert the volume to a numerical value representing the temperature. This is where temperature scales come in.
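The conversion scheme described above amounts to linear interpolation between two calibration readings. Here is a minimal sketch in Python; the volume values are hypothetical, chosen purely for illustration:

```python
def calibrate(v_freeze, v_boil, t_freeze=0.0, t_boil=100.0):
    """Return a function mapping mercury volume to temperature,
    assuming the volume varies linearly with temperature
    (calibrated here against the centigrade fixed points)."""
    def to_temperature(v):
        # Linear interpolation between the two calibration readings.
        return t_freeze + (v - v_freeze) * (t_boil - t_freeze) / (v_boil - v_freeze)
    return to_temperature

# Hypothetical readings: bulb volume (in arbitrary units) at the
# freezing and boiling points of water.
thermometer = calibrate(v_freeze=100.0, v_boil=101.8)
print(thermometer(100.9))  # a volume midway between the two readings → 50.0
```

Any thermometric property that varies monotonically with temperature (volume, pressure, electrical resistance) could be calibrated the same way; only the two reference readings change.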

All temperature scales require some reference temperatures for calibration. The freezing and boiling points of water were the obvious choices since they are easily reproducible temperatures. Below are the prominent scales developed in the 18th century.

| Scale | Year | Freezing point of water | Boiling point of water |
|---|---|---|---|
| Fahrenheit (°F) | 1724 | 32°F | 212°F |
| Réaumur (°Ré) | 1730 | 0°Ré | 80°Ré |
| Centigrade (°C) | 1742 | 0°C | 100°C |

You may recognize that the centigrade scale looks similar to the Celsius scale that we use today. However, the centigrade scale is in no way superior to the other scales in this table. Just like all the other scales of that age, it is an empirical scale, based on arbitrarily chosen reference points, namely the freezing and boiling points of water[2]. It is also a relative scale. If you think about it, 0°C does not represent the minimum temperature[3], and 40°C is not twice as hot as 20°C.
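One way to see that ratios on a relative scale are not meaningful: convert the same pair of temperatures to another scale and the ratio changes. A quick check using the standard Celsius-to-Fahrenheit conversion:

```python
def c_to_f(c):
    # Standard Celsius-to-Fahrenheit conversion.
    return c * 9 / 5 + 32

# "40°C is twice as hot as 20°C"? The ratio depends on the scale chosen,
# so the claim is not physically meaningful.
print(40 / 20)                   # 2.0 on the centigrade scale
print(c_to_f(40) / c_to_f(20))   # 104/68 ≈ 1.53 on the Fahrenheit scale
```

A ratio of temperatures is only meaningful on an absolute scale such as the kelvin scale, where zero coincides with the coldest possible temperature.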


[1] “Thermal equilibrium” basically means “same temperature”.

[2] You will soon learn about the thermodynamic scale, which is based on theoretical principles of thermodynamics.

[3] This is in contrast to the absolute scale, where the absolute zero corresponds to the coldest possible temperature.