Fahrenheit is better than Celsius. The proponents of Celsius take the "scientific simplicity" of Celsius to be its defining virtue -- water is the most important substance with which we interact, and it is a solid, liquid, and gas at temperatures that actually occur on Earth; therefore, we should demarcate these phase changes with the nice round numbers of 0 and 100. There's an intuitive plausibility here, but it breaks down under scrutiny.
First of all, "nice, round numbers" don't mean as much in temperature as they do in other areas of science. You never multiply temperatures, or even really add them, so the location of "zero" isn't making any mathematical operations easier, or anything like that. Realistically, absolute zero has the best claim to owning zero, but we don't use Kelvin on a day-to-day basis for a reason.
The fact is that with temperature, nice round numbers do mean something to ordinary people, who are looking to describe the world around them. This is where the failing of Celsius is most acute. The vast, vast majority of temperature readings are done in exactly two kinds of situations -- describing ambient air temperature, and setting cooking temperatures in ovens. On both of these important fronts, Celsius' nice round numbers fall flat. One hundred degrees Celsius is far hotter than ambient air temperatures anywhere humans live, and colder than the majority of temperatures actually used in ovens to cook food. There is scarcely a thermometer on earth that ever uses that nice, round 100 on the Celsius side.
Nice round numbers don't exist for scientists (they're used to doing math); they exist for regular people. I contend that the numbers between 0 and 100 should be regularly and efficiently used by ordinary people to aid in communication. But in Celsius, you essentially never use any of the degrees between 40 (among the hottest of days) and 110 (the minimum cooking temperature on most ovens) in daily life. This is just a waste of perfectly good numbers.
Fahrenheit succeeds across the board here. At any given moment, I would guess that 95% of the habitable places on earth are between 0 and 100 degrees Fahrenheit. The most extreme readings might dip 40 degrees below that range, or climb 30 above it, and these are acceptable precisely because they are extremes. Furthermore, the smaller degree size means that terms like "in the 30s" or "in the 70s" can have real descriptive power, quickly conveying what others want to know within a reasonable band.
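If you want a quick sanity check on those numbers, here's a minimal sketch. It assumes nothing beyond the standard conversion C = (F - 32) * 5/9; the f_to_c helper is just a name I've picked for illustration. It converts the endpoints of the 0-to-100 Fahrenheit band, plus the "40 below" and "30 above" extremes.

```python
# Minimal sketch: convert a few Fahrenheit readings to Celsius
# using the standard formula C = (F - 32) * 5/9.

def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# Endpoints of the 0-100 F band, plus the extremes mentioned above.
for f in (0, 100, -40, 130):
    print(f"{f} F = {f_to_c(f):.1f} C")

# Output:
#   0 F = -17.8 C
#   100 F = 37.8 C
#   -40 F = -40.0 C
#   130 F = 54.4 C
```

So the whole Fahrenheit band of everyday weather maps onto the awkward span of roughly -18 to 38 in Celsius, which is exactly the point.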
Cooking temperatures are high -- they tend to run from 250 to about 500 degrees Fahrenheit, numbers which don't have any particular meaning attached to them, but then, neither do their Celsius equivalents. Celsius adherents have convinced themselves that their system is more "meaningful" because 0 and 100 have concrete, real-world definitions. The problem is, these definitions make no one's life easier, and come off as more of a cute gimmick than a useful system. Zero and one hundred might not have "definitions" in Fahrenheit, but they show a sensitivity to actual use that more scientists and engineers ought to appreciate.
Also, this marks the beginning of a new month of my blog. I've officially made it a month of daily posting. Eleven more to go.