Spreading Slow Ideas
Why do some innovations spread so swiftly and others so slowly? Consider the very different trajectories of surgical anesthesia and antiseptics, both of which were discovered in the nineteenth century. The first public demonstration of anesthesia was in 1846. The Boston surgeon Henry Jacob Bigelow was approached by a local dentist named William Morton, who insisted that he had found a gas that could render patients insensible to the pain of surgery. That was a dramatic claim. In those days, even a minor tooth extraction was excruciating. Without effective pain control, surgeons learned to work with slashing speed. Attendants pinned patients down as they screamed and thrashed, until they fainted from the agony. Nothing ever tried had made much difference. Nonetheless, Bigelow agreed to let Morton demonstrate his claim.
On October 16, 1846, at Massachusetts General Hospital, Morton administered his gas through an inhaler in the mouth of a young man undergoing the excision of a tumor in his jaw. The patient only muttered to himself in a semi-conscious state during the procedure. The following day, the gas left a woman, undergoing surgery to cut a large tumor from her upper arm, completely silent and motionless. When she woke, she said she had experienced nothing at all.
Four weeks later, on November 18th, Bigelow published his report on the discovery of “insensibility produced by inhalation” in the Boston Medical and Surgical Journal. Morton would not divulge the composition of the gas, which he called Letheon, because he had applied for a patent. But Bigelow reported that he smelled ether in it (ether was used as an ingredient in certain medical preparations), and that seems to have been enough. The idea spread like a contagion, travelling through letters, meetings, and periodicals. By mid-December, surgeons were administering ether to patients in Paris and London. By February, anesthesia had been used in almost all the capitals of Europe, and by June in most regions of the world.
There were forces of resistance, to be sure. Some people criticized anesthesia as a “needless luxury”; clergymen deplored its use to reduce pain during childbirth as a frustration of the Almighty’s designs. James Miller, a nineteenth-century Scottish surgeon who chronicled the advent of anesthesia, observed the opposition of elderly surgeons: “They closed their ears, shut their eyes, and folded their hands. . . . They had quite made up their minds that pain was a necessary evil, and must be endured.” Yet soon even the obstructors, “with a run, mounted behind—hurrahing and shouting with the best.” Within seven years, virtually every hospital in America and Britain had adopted the new discovery.
Sepsis—infection—was the other great scourge of surgery. It was the single biggest killer of surgical patients, claiming as many as half of those who underwent major operations, such as a repair of an open fracture or the amputation of a limb. Infection was so prevalent that suppuration—the discharge of pus from a surgical wound—was thought to be a necessary part of healing.
In the eighteen-sixties, the Edinburgh surgeon Joseph Lister read a paper by Louis Pasteur laying out his evidence that spoiling and fermentation were the consequence of microorganisms. Lister became convinced that the same process accounted for wound sepsis. Pasteur had observed that, besides filtration and the application of heat, exposure to certain chemicals could eliminate germs. Lister had read about the city of Carlisle’s success in using a small amount of carbolic acid to eliminate the odor of sewage, and reasoned that it was destroying germs. Maybe it could do the same in surgery.
During the next few years, he perfected ways to use carbolic acid for cleansing hands and wounds and destroying any germs that might enter the operating field. The result was strikingly lower rates of sepsis and death. You would have thought that, when he published his observations in a groundbreaking series of reports in The Lancet, in 1867, his antiseptic method would have spread as rapidly as anesthesia.
Far from it. The surgeon J. M. T. Finney recalled that, when he was a trainee at Massachusetts General Hospital two decades later, hand washing was still perfunctory. Surgeons soaked their instruments in carbolic acid, but they continued to operate in black frock coats stiffened with the blood and viscera of previous operations—the badge of a busy practice. Instead of using fresh gauze as sponges, they reused sea sponges without sterilizing them. It was a generation before Lister’s recommendations became routine and the next steps were taken toward the modern standard of asepsis—that is, entirely excluding germs from the surgical field, using heat-sterilized instruments and surgical teams clad in sterile gowns and gloves.
In our era of electronic communications, we’ve come to expect that important innovations will spread quickly. Plenty do: think of in-vitro fertilization, genomics, and communications technologies themselves. But there’s an equally long list of vital innovations that have failed to catch on. The puzzle is why.
Did the spread of anesthesia and antisepsis differ for economic reasons? Actually, the incentives for both ran in the right direction. If painless surgery attracted paying patients, so would a noticeably lower death rate. Besides, live patients were more likely to make good on their surgery bill. Maybe ideas that violate prior beliefs are harder to embrace. To nineteenth-century surgeons, germ theory seemed as illogical as, say, Darwin’s theory that human beings evolved from primates. Then again, so did the idea that you could inhale a gas and enter a pain-free state of suspended animation. Proponents of anesthesia overcame belief by encouraging surgeons to try ether on a patient and witness the results for themselves—to take a test drive. When Lister tried this strategy, however, he made little progress.
The technical complexity might have been part of the difficulty. Giving Lister’s methods “a try” required painstaking attention to detail. Surgeons had to be scrupulous about soaking their hands, their instruments, and even their catgut sutures in antiseptic solution. Lister also set up a device that continuously sprayed a mist of antiseptic over the surgical field.
But anesthesia was no easier. Obtaining ether and constructing the inhaler could be difficult. You had to make sure that the device delivered an adequate dosage, and the mechanism required constant tinkering. Yet most surgeons stuck with it—or else they switched to chloroform, which was found to be an even more powerful anesthetic, but posed its own problems. (An imprecise dosage killed people.) Faced with the complexities, they didn’t give up; instead, they formed an entire new medical specialty—anesthesiology.
So what were the key differences? First, one combatted a visible and immediate problem (pain); the other combatted an invisible problem (germs) whose effects wouldn’t be manifest until well after the operation. Second, although both made life better for patients, only one made life better for doctors. Anesthesia changed surgery from a brutal, time-pressured assault on a shrieking patient to a quiet, considered procedure. Listerism, by contrast, required the operator to work in a shower of carbolic acid. Even low dilutions burned the surgeons’ hands. You can imagine why Lister’s crusade might have been a tough sell.
This has been the pattern of many important but stalled ideas. They attack problems that are big but, to most people, invisible; and making them work can be tedious, if not outright painful. The global destruction wrought by a warming climate, the health damage from our over-sugared modern diet, the economic and social disaster of our trillion dollars in unpaid student debt—these things worsen imperceptibly every day. Meanwhile, the carbolic-acid remedies to them, all requiring individual sacrifice of one kind or another, struggle to get anywhere.
The global problem of death in childbirth is a pressing example. Every year, three hundred thousand mothers and more than six million children die around the time of birth, largely in poorer countries. Most of these deaths are due to events that occur during or shortly after delivery. A mother may hemorrhage. She or her baby may suffer an infection. Many babies can’t take their first breath without assistance, and newborns, especially those born small, have trouble regulating their body temperature after birth. Simple, lifesaving solutions have been known for decades. They just haven’t spread.
Many solutions aren’t ones you can try at home, and that’s part of the problem. Increasingly, however, women around the world are giving birth in hospitals. In India, a government program offers mothers up to fourteen hundred rupees—more than what most Indians live on for a month—when they deliver in a hospital, and now, in many areas, the majority of births are in facilities. Death rates in India have fallen, but they’re still ten times greater than in high-income countries like our own.
Not long ago, I visited a few community hospitals in north India, where just one-third of mothers received the medication recommended to prevent hemorrhage; less than ten per cent of the newborns were given adequate warming; and only four per cent of birth attendants washed their hands for vaginal examination and delivery. In an average childbirth, clinicians followed only about ten of twenty-nine basic recommended practices.
Here we are in the first part of the twenty-first century, and we’re still trying to figure out how to get ideas from the first part of the twentieth century to take root. In the hopes of spreading safer childbirth practices, several colleagues and I have teamed up with the Indian government, the World Health Organization, the Gates Foundation, and Population Services International to create something called the BetterBirth Project. We’re working in Uttar Pradesh, which is among India’s poorest states. One afternoon in January, our team travelled a couple of hours from the state’s capital, Lucknow, with its bleating cars and ramshackle shops, to a rural hospital surrounded by lush farmland and thatched-hut villages. Although the sun was high and the sky was clear, the temperature was near freezing. The hospital was a one-story concrete building painted goldenrod yellow. (Our research agreement required that I keep it unnamed.) The entrance was on a dirt road lined with rows of motorbikes, the primary means of long-distance transportation. If an ambulance or an auto-rickshaw couldn’t be found, women in labor sat sidesaddle on the back of a bike.
The hospital delivers three thousand newborns a year, a typical volume in India but one that would put it in the top fifth of American hospitals. Yet it had few of the amenities that you’d associate with a modern hospital. I met the physician in charge, a smart and capable internist in his early thirties who had trained in the capital. He was clean-shaven and buzz-cut, with an Argyle sweater, track shoes, and a habitual half smile. He told me, apologetically, that the hospital staff had no ability to do blood tests, to give blood transfusions, or to perform emergency obstetrics procedures such as Cesarean sections. There was no electricity during the day. There was certainly no heating, even though the temperature was barely forty degrees that day, and no air-conditioning, even though summer temperatures routinely reach a hundred degrees. There were two blood-pressure cuffs for the entire facility. The nurse’s office in my neighborhood elementary school was better equipped.
The hospital was severely understaffed, too. The doctor said that half of the staff positions were vacant. To help with child deliveries for a local population of a quarter of a million people, the hospital had two nurses and one obstetrician, who happened to be his wife. The nurses, who had six months of childbirth training, did most of the deliveries, swapping shifts year-round. The obstetrician covered the outpatient clinic, and helped with complicated births whenever she was required, day or night. During holidays or sickness, the two nurses covered for each other, but, if no one was available, laboring women were either sent to another hospital, miles away, or left in the hands of an untrained assistant forced to step in.