Length
Why Do We Have 12 Inches in a Foot?
May 11, 2026
Twelve is an unusual choice for a base. We have ten fingers, which is why most human counting systems are decimal. Yet a foot has 12 inches, not 10. A dozen is 12, not 10. An hour has 60 minutes (5 × 12), not 100. Where did 12 come from?
The Anatomy of a Foot
The most direct answer to why a foot has 12 inches is almost comically literal: count the segments of your four fingers (excluding the thumb). Each finger has three segments, marked off by its joints, and four fingers give 12 segments in all. Ancient traders used the thumb as a pointer, ticking off segments to reach 12 on one hand before needing to reset. This made 12 a natural unit for counting and dividing.
But the story of the inch and foot is older and more complex than finger-counting alone.
The Roman Inheritance
The Latin word uncia (one-twelfth) gives us both "inch" and "ounce." In the Roman system, the foot (pes) was divided into 12 unciae. The Roman foot was approximately 296 mm — close to but not identical to the modern international foot of 304.8 mm.
When Rome expanded across Europe, Roman units embedded themselves into local measurement traditions. The foot and its 12-inch subdivision survived the fall of the empire.
Why 12 Is Better Than 10
Here's the practical case for 12: it divides more evenly than 10.
10 divides evenly into halves (5) and fifths (2). That's it, without getting into decimals.
12 divides evenly into halves (6), thirds (4), quarters (3), sixths (2), and twelfths (1). For a world before decimal arithmetic was widespread, this was enormously useful. A foot could be cleanly divided into half a foot (6 inches), a third of a foot (4 inches), or a quarter (3 inches) without fractions.
For carpenters, coopers, and traders working with physical materials, this meant simpler and more accurate splitting. A third of a 12-inch foot is exactly 4 inches; a third of a hypothetical 10-inch foot would be 3.33… inches, impossible to mark cleanly on a rule. No decimals, no approximation.
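The divisibility argument above can be checked in a few lines of Python. This is an illustrative sketch (the function name `even_divisions` is mine, not a standard):

```python
def even_divisions(n):
    """Return the part sizes into which n splits evenly,
    excluding n itself: a part size d means n divides into
    n/d equal parts of d units each."""
    return [d for d in range(1, n) if n % d == 0]

# A 10-unit length splits evenly only into parts of 1, 2, or 5 units.
print(even_divisions(10))  # [1, 2, 5]

# A 12-unit length splits evenly into parts of 1, 2, 3, 4, or 6 units:
# twelfths, sixths, quarters, thirds, and halves.
print(even_divisions(12))  # [1, 2, 3, 4, 6]

# The third is where base 10 breaks down.
print(12 / 3)  # 4.0 -- a whole number of inches
print(10 / 3)  # 3.333... -- unmarkable on a pre-decimal rule
```

With one fewer unit than 12, the number 10 gives up thirds, quarters, and sixths, which were exactly the fractions pre-decimal trades used most.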
The English Foot's Long Journey
The inch was sometimes defined as the length of three barleycorns placed end to end — a standard mentioned in English statutes as far back as the 14th century. The barleycorn (one-third of an inch) appears throughout historical English measurement.
Different regional "feet" existed across Europe and varied from about 28 cm to 34 cm. The English foot gradually drifted to its current size through legal statute and trade standardization.
The modern definition — exactly 0.3048 meters — was set in 1959 by the International Yard and Pound Agreement between the US, UK, Canada, Australia, New Zealand, and South Africa. The "foot" became a fixed metric multiple, the 12-inch division preserved by tradition rather than necessity.
A System Optimized for Pre-Decimal Arithmetic
The imperial system's seemingly strange numbers — 12 inches per foot, 3 feet per yard, 5,280 feet per mile — were not arbitrary. They evolved to minimize fractions in a world where division was done by hand.
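One rough way to see this is to count divisors: the more divisors a unit ratio has, the more ways a length can be split without fractions. A quick sketch (the comparison set is my choice, not from any standard):

```python
def divisor_count(n):
    """Count the divisors of n, i.e. the number of distinct
    even splits (including the trivial 1-part and n-part splits)."""
    return sum(1 for d in range(1, n + 1) if n % d == 0)

# Imperial ratios versus round decimal numbers of similar size.
for n in (10, 12, 1000, 5280):
    print(n, divisor_count(n))
# 10   ->  4 divisors
# 12   ->  6 divisors
# 1000 -> 16 divisors
# 5280 -> 48 divisors
```

Twelve has more divisors than ten despite being barely larger, and 5,280 (2^5 × 3 × 5 × 11) splits evenly far more ways than 1,000 (2^3 × 5^3), which is consistent with the claim that these ratios favored hand division.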
The metric system, designed in the age of pencil and paper with decimal arithmetic, chose 10 and powers of 10. But for a civilization built on physical craftsmanship before calculators, 12 was the smarter choice.
Both systems solved the same problem for their era. One is now universal; the other persists in the places that built their world around it.