The thermometer has its origins in the sixteenth century with Galileo, but it did not become recognisable as a modern instrument until it was refined first by Santorio and then by Fahrenheit in the centuries that followed.
The Greeks made simple thermoscope-like devices in the first century BC, but there was no way to quantify heat: hot and cold were still treated, in the Aristotelian tradition, as fundamental qualities of the universe.
Then in 1596, as part of the growing effort to understand the natural world in scientific terms, Galileo Galilei invented the first thermoscope. A thermometer measures, but a thermoscope merely indicates; in this case it indicated a difference in temperature. It had no scale: it simply showed whether the temperature was higher, lower or the same, and, significantly, it did not allow those differences to be recorded.
In 1612 Santorio Santorio applied a numerical scale to the thermoscope, turning it into a thermometer. Although its accuracy was poor, it is generally regarded as the first true thermometer, and Santorio used it in his medical work to measure body temperature.
By 1654 the instrument had advanced further when Ferdinand II, Grand Duke of Tuscany, filled a sealed glass tube with alcohol, which expanded and contracted with the temperature, producing something much closer to the thermometer we know today. It was, however, still far from accurate.
One of the main problems was that each scientist used his own scale to measure temperature, so there was no universal system of measurement.
That changed in 1714, when Gabriel Fahrenheit made the most accurate thermometer yet by using mercury instead of alcohol. Mercury expands far more predictably, and this, combined with improved glassworking techniques, made the instrument much more accurate and reliable.
He also lent his name to a new, finer scale of temperature measurement, which became widely adopted and reduced the confusion caused by scientists each using their own scales. The Fahrenheit scale is still widely used today.
But there was still argument over a form of measurement that was simple as well as accurate and informative.
This took a step forward in 1731, when René Antoine Ferchault de Réaumur proposed a scale on which the freezing point of water was 0 degrees and the boiling point 80 degrees.
But in 1742 the Swedish astronomer Anders Celsius proposed a scale with 0 degrees as the boiling point of water and 100 degrees as the freezing point. This was revised the following year, when Jean-Pierre Christin inverted Celsius's scale to produce the centigrade scale, which was adopted as the standard scale by international agreement in 1948.
In the middle of the nineteenth century (1848), the Scottish physicist Sir William Thomson, later Lord Kelvin (Baron Kelvin of Largs), proposed a new temperature scale with an absolute zero, the lowest possible temperature, at which molecular motion effectively ceases. One kelvin is the same size as one degree Celsius. This scale is still used on some thermometers today, particularly in scientific work.
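To make the relationship between these scales concrete, here is a minimal sketch in Python (the function names are my own, not part of the history above) that converts a Celsius reading into the Réaumur, Fahrenheit and Kelvin scales just described, using the standard relations F = C × 9/5 + 32, Ré = C × 4/5 and K = C + 273.15.

# Illustrative conversions between the temperature scales discussed above.
# Function names are arbitrary; the relations themselves are the standard ones.

ABSOLUTE_ZERO_C = -273.15  # absolute zero expressed in degrees Celsius

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9 / 5 + 32          # water freezes at 32 degF, boils at 212 degF

def celsius_to_reaumur(c: float) -> float:
    return c * 4 / 5               # water freezes at 0 degRe, boils at 80 degRe

def celsius_to_kelvin(c: float) -> float:
    return c - ABSOLUTE_ZERO_C     # one kelvin is the same size as one degC

if __name__ == "__main__":
    for c in (0.0, 37.0, 100.0):
        print(f"{c:6.1f} C = {celsius_to_fahrenheit(c):6.1f} F"
              f" = {celsius_to_reaumur(c):5.1f} Re"
              f" = {celsius_to_kelvin(c):7.2f} K")

Running it for 0, 37 and 100 degrees Celsius shows the familiar fixed points: 0 °C is 32 °F, 0 °Ré and 273.15 K, while 100 °C is 212 °F, 80 °Ré and 373.15 K.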
Modern thermometers are made in several ways, with alcohol and mercury thermometers among the most accurate for measuring air temperature. Another design is the expansion thermometer, in which two different metals that expand and contract at different rates are fused together and coiled like a spring. The coil tightens or unwinds as the temperature changes, and a needle connected to it indicates the air temperature.
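As a rough illustration of why such a coil moves, the sketch below (my own example, with approximate textbook expansion coefficients assumed for brass and steel) shows how unevenly two fused strips lengthen when warmed; it is this mismatch that bends the coil and drives the needle.

# Rough illustration of differential expansion in a bimetallic strip.
# The expansion coefficients are approximate values assumed for this example.

ALPHA_BRASS = 19e-6   # per degC, approximate linear expansion coefficient
ALPHA_STEEL = 12e-6   # per degC, approximate linear expansion coefficient

def length_change(length_m: float, alpha: float, delta_t: float) -> float:
    """Change in length of a strip: dL = alpha * L * dT."""
    return alpha * length_m * delta_t

if __name__ == "__main__":
    strip_length = 0.10   # a 10 cm strip
    delta_t = 20.0        # warmed by 20 degrees Celsius
    brass = length_change(strip_length, ALPHA_BRASS, delta_t)
    steel = length_change(strip_length, ALPHA_STEEL, delta_t)
    # The brass side grows more than the steel side, so the fused strip
    # curves towards the steel; in a coiled thermometer this movement
    # rotates the needle.
    print(f"Brass expands by {brass * 1e6:.1f} um, steel by {steel * 1e6:.1f} um,"
          f" a mismatch of {(brass - steel) * 1e6:.1f} um.")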