Thermometry and the role of fever in medicine

Clinical thermometry is another example of medicine’s transition from art to science.

Issue: May 10, 2009

The connection between body heat — or fever — and disease was made millennia ago, but the clinical utility of body temperature was not fully understood until the 1800s.

Although it would take many more years for physicians to understand that a raised body temperature reflected increased antibody production in the face of disease, early experiments with temperature readings helped the profession begin to track the progression of diseases, even if doctors could do little to treat them.

Early beliefs about fever

For many centuries, fever was thought to be the disease itself rather than an indication of underlying disease or infection. If you had one, it was bad news.

A patent for a “clinical” thermometer designed to withstand heating beyond the highest point marked on its temperature scale without breaking, allowing the instrument to be boiled for sterilization.

Source: United States Patent and Trademark Office; Google Patent Search

Fever was detected by laying the hand on the patient’s flesh. Inflamed parts of the body were hot to the touch, often indicating infection. This was recognized as early as the time of Hippocrates, who insisted that physicians examine temperature, “recognize its signs and use agents to elevate the temperature when depressed and lower it when raised.” Fever, combined with an accelerated pulse, was used as an indicator of the success of many early medical treatments, such as bloodletting. When blood was let, the patient cooled and the pulse slowed, indicating to the physician that the fever was subsiding.

With Galileo’s invention of an open-air thermometer in the late 1500s, however, efforts began to adapt the instrument to measure the temperature of the human body.

About the year 1612, Sanctorio Sanctorius invented the first crude version of the thermometer as we think of it today, describing the invention in “De Statica Medicina” in 1614. Sanctorio was probably the first physician to try to draw conclusions about disease from thermometer readings. His early versions of the instrument were unwieldy and often required a long time to produce an accurate reading.

A breakthrough came in 1714 when Gabriel Fahrenheit invented the mercury thermometer. Mercury, he found, expanded and contracted more rapidly than water, allowing physicians to obtain a patient’s temperature faster.

Rise in medicine

The instrument began to gain popularity in medicine with the help of Hermann Boerhaave, or more specifically, his two students: Gerard L.B. Van Swieten, founder of the Viennese School of Medicine, and Anton De Haen. De Haen, an instructor at the Vienna Hospital, integrated the thermometer into his bedside routine. He taught his students that the thermometer was a far more accurate way to determine fever than the hand, and he made several observations about thermometry, including the increase in temperature in the elderly, the difference between a patient’s perceived temperature and actual temperature, and the change in temperature as a sign of healing.

Several decades later, Antoine Cesar Becquerel and Gilbert Breschet were the first to determine that 37°C, or 98.6°F, was the body temperature of a healthy adult.
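The two figures are the same reading expressed on different scales; for reference, the standard Celsius-to-Fahrenheit conversion recovers the familiar value:

$$
^{\circ}\mathrm{F} = \tfrac{9}{5}\,^{\circ}\mathrm{C} + 32,
\qquad
\tfrac{9}{5} \times 37 + 32 = 66.6 + 32 = 98.6\,^{\circ}\mathrm{F}.
$$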

However, the early definitive work on the clinical utility of body temperature was published by Carl Wunderlich in 1868. “Das Verhalten der Eigenwärme in Krankheiten” (“The Behavior of Body Heat in Diseases”) was the culmination of 15 years of observation of temperatures in hospital wards. Starting in about 1851, Wunderlich began to take patients' temperatures at least twice a day, and as many as four to six times a day when a patient had a fever. With these data, taken from approximately 25,000 patients, he was able to define and chart traits that tracked the progression of diseases, proving that “disease obeyed fixed laws.”

As the popularity of thermometry grew among physicians, so did the desire to make the practice more useful in everyday doctoring. At the time, the most popular place for determining temperature was a patient’s armpit. Other areas, such as the groin, rectum, urethra or vagina, were considered too intimate, and the mouth too germ-ridden. It was not until the end of the 1800s, with the recognition of the importance of alcohol and other agents as disinfectants, that oral thermometry grew in popularity.

Important steps

Thomas Clifford Allbutt is credited with redesigning the thermometer in 1866, turning an often foot-long instrument that took 20 minutes to register a reading into a more portable, six-inch model that needed only five minutes.

The evolution of clinical thermometry in the United States is credited to Edouard Seguin and his son, Edward. In 1866, the junior Seguin and William H. Draper began to use the practice regularly at New York Hospital. Using thermometry, Seguin coined the term “vital signs” for temperature, pulse and respiration. He and Draper charted the progress of fevers alongside patient vital signs and distinguished the symptoms and signs of diseases such as typhus and typhoid. Draper tracked these observations in a chart, which he attached to hospital beds, a practice that would spread to most hospitals in the country within a few decades.

By 1870, Seguin was promoting the use of thermometry in homes as well as hospitals. The instrument was described as a tool for mothers to provide useful information to physicians and to escape the clutches of medical quackery.

However, despite physicians’ ever-increasing knowledge of fever, medicine provided few remedies for it. It was not until 1975 that the medical community understood that fever served a function: the increase in body temperature associated with a fever resulted in an increase in antibody production, which helped to fight infection. In fact, in 1993 the World Health Organization recommended against the use of fever-reducing drugs in children in developing countries. But this recommendation may be harder to accept in developed countries, where medicines are much more readily available and the comfort of the patient is often a high priority.

Since its invention, the thermometer has changed and adapted as science and technology have advanced. Although its appearance may have changed, the basic function of the instrument has not. – by Leah Lawrence

For more information:

  • Blumenthal I. J R Soc Med. 1997;90:391-394.
  • Haller JS. West J Med. 1985;142:108-116.
  • Pearce JMS. Q J Med. 2002;95:251-252.