Today’s battle between science and willful ignorance is not new. Scientists have always had to struggle against entrenched, non-empirical beliefs, as well as resistance to change. One battle that reason eventually won was the war on medical infections. Here’s how it unfolded:
In 1840s Vienna, a young physician named Ignaz Semmelweis made a world-changing observation. Aware that childbed (“puerperal”) fever was killing as many as 25 percent of women in the days following childbirth, Semmelweis found it curious that the deaths occurred almost exclusively among women who delivered their infants in hospitals. Those who delivered at home, and those who self-delivered in alleys and streets, rarely contracted the fatal condition.
Further investigation led Semmelweis to see a link between the standard medical-school curriculum and maternal deaths. Each day, medical students and professors dissected several cadavers, often between patient rounds. Although the concept of germs was as yet undiscovered, “Semmelweis concluded that the transmission of what he called ‘invisible cadaver particles’ was the cause of childbed fever,” says Sherwin B. Nuland, MD, in Doctors: The Biography of Medicine. “The transmitting source of the ‘cadaver particles’ was to be found in the hands of the students and attending physicians.”
Fearing the terrible truth
Semmelweis then instituted the simple measure of washing hands in a chlorine solution. His hygienic practices dramatically reduced the number of deaths caused by childbed fever. But his theory met strong resistance from established physicians, who were offended by the abrasive upstart and slow to acknowledge the terrible truth that their own entrenched procedures might have caused so many deaths, writes Nuland. Semmelweis did not publish his results for another 15 years.
In the 1860s, British physician Joseph Lister broadened the battle to include post-surgical infections, which were almost universal and frequently fatal. Lister was an early adopter of Louis Pasteur’s recently postulated germ theory. Learning that a nearby city had eliminated a stench in its sewers by pouring carbolic acid down the drains, he reasoned that the chemical had killed microorganisms like those identified by Pasteur. Applying this conclusion to surgery, he then devised a wound dressing soaked in carbolic acid. Later, he added a sprayer to drench the entire surgical area in a mist of carbolic acid solution.
Soon, Lister began applying it to his hands and instruments. Lister’s first report on this treatment was published in The Lancet in 1867, the year now regarded as the birth date of antisepsis.
“Listerism” changes everything
As “Listerism” gradually gained acceptance, its principles of cleanliness caught on. The surgeon’s traditional black frock coat, elegantly tailored, stained with blood and rarely laundered, began to be covered by a rubber apron designed to protect the coat from disinfectant.
Then in the 1870s and 1880s, surgeons began to move beyond Lister’s antiseptic approach. Through new aseptic procedures, surgeons attempted to exclude infectious bacteria by boiling or heating instruments, sutures, towels and sponges. Increasingly convinced that their own clothing might be a source of infection, some surgeons began to wear loose-fitting gowns over their street clothes. Some, but not all, anesthetists, assistants and spectators also replaced their street coats with special sterilized cotton or linen coats. An elaborate ritual of hand washing became the norm, and in 1893, William Halsted became the first surgeon to wear sterile rubber gloves while performing an operation.
By 1910, surgical caps and gloves had become widely accepted. Few surgeons, however, welcomed the advent of face masks. “Early masks, which became quickly saturated with moisture from breathing and talking, irritated surgeons, especially those with beards,” writes James M. Edmonson in Surgical Garb. “For these reasons many surgeons preferred to enforce a rule of silence during operations, rather than don a cumbersome and largely ineffective mask.”
Aseptic techniques even influenced operating room décor. By the 1920s, white gowns, starched white linens, and the sparkling white operating room symbolized the modern concept of healing.
No one, however, has yet found a way to completely eliminate hospital-based (“nosocomial”) infections. In the 1950s and ’60s, a nationwide epidemic of staphylococcal infections spurred a movement toward systematic infection control in hospitals. As a result, infection control has emerged as a medical specialty, and several national organizations dedicate themselves to standard-setting, education and enforcement.
Today, statistics published by the Centers for Disease Control and Prevention reveal that nosocomial infections affect about 1.7 million patients each year and contribute to 99,000 deaths.
Clearly, more than a century after the early concepts of infection control began to unfold, many challenges remain. We await the Semmelweises and Listers of our own time. And let’s hope their new ideas meet a warmer reception.
[Adapted, updated and reprinted by permission of the American Hospital Association, who published the original article, by Gloria Shur Bilchik, in 100 Faces of Healthcare.]