The Story of Fahrenheit (and Why Only the US Kept It)
May 11, 2026
When Daniel Gabriel Fahrenheit published his thermometric scale in 1724, it was a genuine scientific achievement. His mercury thermometers were the most accurate in the world, and his scale — whatever its quirks — spread across Europe and its colonies. Today, his name lives on in one country: the United States.
The Inventor
Fahrenheit was born in Gdańsk (then Danzig) in 1686 into a prosperous merchant family. After his parents died of mushroom poisoning on the same day in 1701, he apprenticed as a merchant but quickly gravitated toward science. He spent much of his adult life in the Netherlands, where he manufactured meteorological instruments.
His breakthrough was the mercury thermometer. Previous thermometers used alcohol or water — substances that froze in winter or behaved inconsistently. Mercury's high boiling point, low freezing point, and uniform thermal expansion made it ideal. Fahrenheit's thermometers were so reliable that the Royal Society of London elected him a Fellow in 1724.
The Scale
Fahrenheit needed calibration points — fixed temperatures where he could mark his scale. He chose three:
- 0°F: The temperature of a brine solution (water, ice, and ammonium chloride). This was one of the coldest things he could reliably produce in a lab.
- 32°F: The freezing point of pure water.
- 96°F: Approximately human body temperature (originally calibrated to his wife's armpit temperature, if the accounts are accurate).
The 32-96 gap has 64 degrees — a power of 2, which made subdividing the scale convenient before decimal arithmetic was standard.
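To see why a power-of-2 span helped, here's a quick sketch (an illustration, not from the original account): repeatedly halving the 32–96 interval yields whole-degree marks at every level, down to single degrees.

```python
# Bisect the 32-96 span repeatedly: because the gap is 64 = 2**6,
# every mark stays a whole number through six successive halvings.
def bisection_marks(low=32, high=96, levels=6):
    span = high - low
    for k in range(1, levels + 1):
        step = span / 2**k  # 32, 16, 8, 4, 2, 1
        marks = [low + i * step for i in range(2**k + 1)]
        assert all(m == int(m) for m in marks)  # all whole degrees
    return step  # smallest step reached

print(bisection_marks())  # 1.0 -- single-degree marks after six halvings
```

A craftsman marking a tube with dividers could reach one-degree graduations by bisection alone, with no decimal arithmetic.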
Later refinements fixed the boiling point of water at 212°F, giving a span of 180 degrees between freezing and boiling. This made the 96°F body temperature slightly off; it was adjusted to 98.6°F when the scale was standardized.
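The arithmetic behind these refinements — 180 Fahrenheit degrees spanning the same range as 100 Celsius degrees — gives the familiar conversion formulas. A minimal sketch:

```python
def f_to_c(f):
    """Fahrenheit -> Celsius: 180 F degrees cover the same span as 100 C degrees."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Celsius -> Fahrenheit: the inverse mapping."""
    return c * 9 / 5 + 32

print(f_to_c(212))             # 100.0 -- boiling point of water
print(round(f_to_c(98.6), 1))  # 37.0  -- the standardized body temperature
```

Note that 98.6°F is exactly 37°C — the value came from converting the metric body-temperature standard back into Fahrenheit.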
The Celsius Competition
In 1742, Anders Celsius proposed a centigrade scale anchored to water's freezing and boiling points (his original version actually ran in reverse, with 0° at boiling and 100° at freezing; the familiar orientation was adopted shortly after his death). Decimal, rational, and memorable, it was the obvious choice for the emerging metric tradition in France. By the early 19th century, most of continental Europe had adopted Celsius.
Britain continued with Fahrenheit well into the 20th century. The metrication drive of the 1960s and 1970s formally shifted the UK to Celsius for official purposes. The British still say "it's hot, about 30 degrees" and know they mean Celsius — but an older generation grew up with Fahrenheit.
Why the US Kept It
The United States had standardized on Fahrenheit by the time Celsius gained dominance. Weather forecasting, medicine, cooking, and everyday conversation were all calibrated to Fahrenheit. Thermostats, oven dials, weather reports, body temperatures — the unit was embedded in infrastructure and intuition.
The 1975 Metric Conversion Act encouraged voluntary metrication but provided no mandate. While the scientific community and pharmaceutical industry moved to metric, everyday temperature did not.
There's also a case for Fahrenheit in human-scale precision: one Fahrenheit degree spans only 5/9 of a Celsius degree (≈ 0.56°C), so a 69°F day and a 70°F day feel meaningfully different, yet both round to 21°C. For weather, this finer granularity matters.
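The granularity claim can be checked directly (a quick illustration using the standard conversion formula, not code from the article):

```python
def f_to_c(f):
    """Standard Fahrenheit -> Celsius conversion."""
    return (f - 32) * 5 / 9

# One Fahrenheit degree is 5/9 of a Celsius degree, so adjacent
# Fahrenheit readings can collapse to the same whole-degree Celsius value:
print(round(f_to_c(69)))  # 21
print(round(f_to_c(70)))  # 21 -- a perceptibly warmer day, same rounded Celsius
```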
A Scale That Almost Survived
The Fahrenheit scale is a historical artifact that has remained useful enough that the one country still using it has never had sufficient motivation to replace it. It's not the convention of science, not rational enough for education, but perfectly calibrated for American weather.
The man who made the best thermometers in the world in 1724 would probably find the global proliferation of Celsius sensible and the American devotion to his scale pleasingly stubborn.
Try the Temperature Converter
Convert between units instantly with our free online tool.
Open converter →