  In the 1820s, one of Boston’s leading merchants masterminded an opium-smuggling operation off the Cantonese coast, spawning millions for Boston Brahmins with the names of Cabot, Delano (as in FDR), and Forbes. This money would go on to build many of the nation’s first railroads, mines, and factories.

  Around that time, a twenty-one-year-old German apothecary urged caution when he published the first major opium breakthrough. Friedrich Sertürner had isolated the active ingredient inside the poppy, an alkaloid he named morphium after the Greek god of dreams, Morpheus. Sertürner quickly understood that morphine was exponentially more powerful than processed opium, noting that its side effects often progressed from euphoria to depression and nausea. He had not at all liked what the compound did to his dogs: It made them pass out drooling, only to awaken in an edgy and aggressive state, with fevers and diarrhea—the same state of withdrawal the opium-addicted in China had long referred to as “yen.” (What modern-day addicted users call dopesick or fiending, William S. Burroughs referred to as junk sick, gaping, or yenning.) “I consider it my duty to attract attention to the terrible effects of this new substance in order that calamity may be averted,” Sertürner wrote, prophetically, in 1810.

  But his medical descendants were not so conscientious. Dr. Alexander Wood, the Scottish inventor of the hypodermic needle, hailed his 1853 creation by swearing that, whereas smoking or swallowing morphine caused addiction, shooting it up would not. No one mentioned Sertürner’s warning decades before. It was easier to be swayed by Wood’s shiny new thing.

  So when doctors departed from the homes of the injured Civil War veterans they were treating, it became standard practice to leave behind both morphine and hypodermic needles, with instructions to use as needed. An estimated hundred thousand veterans became addicted, many identifiable not by shirt smudges of orange and green but by the leather bags they carried, containing needles and morphine tablets, dangling from cords around their necks. The addiction was particularly severe among white Southerners in small cities and towns, where heartbroken wives, fathers, and mothers turned to drugs to cope with devastating war fatalities and the economic uncertainty brought on by slavery’s end.

  “Since the close of the war, men once wealthy, but impoverished by the rebellion, have taken to eating and drinking opium to drown their sorrows,” lamented an opium dealer in New York.

  By the 1870s, injecting morphine was so popular among the upper classes in Europe and the United States that doctors used it for a variety of ailments, from menstrual pain to inflammation of the eyes. The almost total lack of regulatory oversight created a kind of Wild West for patent medicines, with morphine and opium pills available at the nearest drugstore counter, no prescription necessary. As long as a doctor initially OK’d the practice, even injected morphine was utterly accepted. Daily users were not socially stigmatized, because reliance on the drug was iatrogenic, the product of a doctor’s own prescription.

  Morphine did generate public debate, if tepid, among a few alarm-sounding doctors. In 1884, the Virginia General Assembly considered placing regulations on over-the-counter versions of opium and morphine, a move the local newspaper denounced as “class legislation.” In response, Richmond doctor W. G. Rogers wrote an empathetic, impassioned letter urging the newspaper to reconsider its stance:

  I know persons who have been opium-eaters for some years who now daily consume enough of this poison in the form of morphine to kill a half dozen robust men not used to the poison. I have heard them, with tears in their eyes, say that they wished it had never been prescribed for them, and…many of them [have] inserted into the flesh frequently during each day, in spite of the painful abscesses it often causes, until in some instances the whole surface of the body seems to be tattooed. I have heard one exclaim with sorrow that there was no longer a place to put it. Whilst they know it is killing them, more or less rapidly, the fascination and power of the drug [are] irresistible, and it is a rare exception if they ever cease to take it as long as it can be obtained until they have poisoned themselves to death.

  Should not this, then, be prevented, though the profits of [the drug-sellers] be diminished?

  The legislature declined to approve the bill, considering it government overreach, which allowed the tentacles of morphinism to dig in deeper. Fourteen years later, Bayer chemist Heinrich Dreser stumbled on a treasure in the pharmaceutical archives: the work of a British chemist who in 1874 had made a little-remarked-on discovery while researching nonaddictive alternatives for morphine.

  Diacetylmorphine—aka heroin—was more than twice as powerful as morphine, which was already ten times stronger than opium. At a time when pneumonia and tuberculosis were the leading causes of death and antibiotics didn’t yet exist, Dreser believed he had unearthed the recipe for an elixir that would suppress coughing as effectively as codeine, an opium derivative, but without codeine’s well-known addictive qualities.

  He ordered one of his lab assistants to synthesize the drug. From its first clinical testing in 1897—initially on rabbits and frogs, then on himself and employees of the Bayer dye factory—Dreser understood that the new drug’s commercial potential was huge.

  If they could pitch heroin as a new and nonaddictive substitute for morphine, Dreser and Bayer would both strike it rich. Presenting the drug to the German medical academy the following year, Dreser praised heroin’s sedative and respiration-depressing effects in treating asthma, bronchitis, and tuberculosis. It was a safe family drug, he explained, suitable for baby colic, colds, influenza, joint pain, and other ailments. It not only helped clear a cough, it also seemed to strengthen respiration—and it was a sure cure, Bayer claimed, for alcoholism and morphine abuse.

  Bayer’s company doctor chimed in, assuring his fellow physicians: “I have treated many patients for weeks with heroin, without one observation that it may lead to dependency.” Free samples were mailed to American and European physicians by the thousands, along with testimonials that “addiction can scarce be possible.”

  By 1899, Bayer was cranking out a ton of heroin a year and selling it in twenty-three countries. In the United States, cough drops and even baby-soothing syrups were laced with heroin, ballyhooed at a time when typical opioid consumers were by now not only war veterans but also middle-aged barbers and teachers, shopkeepers and housewives. Many were mostly functioning, doctor-approved users, able to hide their habits—as long as their supply remained steady, and as long as they didn’t overdo it.

  At the dawn of the twentieth century, the pendulum began to swing the other way as a few prominent doctors started to call out their overprescribing peers. Addressing the New York Academy of Medicine in 1895, a Brooklyn doctor warned colleagues that leaving morphine and syringes behind with patients, with instructions to use whenever they felt pain, was “almost criminal,” given that some were becoming hooked after only three or four doses. “Many cases of the morphine habit could have been avoided had the family physician not given the drug in the first place,” he said. By 1900, more than 250,000 Americans were addicted to opium-derived painkillers.

  And yet heroin’s earliest years were mostly full of praise, as medical journals heralded Bayer’s new cough suppressant, considering it distinctly superior to and apart from morphine, some promoting it as a morphine-replacement drug. Though a few researchers warned about possible addiction—“the toxic properties of the drug are not thoroughly known,” one noted in 1900—for eight years you could buy heroin at any American drugstore or by mail order.

  In 1906, the American Medical Association finally sounded a sterner alarm: “The habit is readily formed and leads to the most deplorable results.” Heroin-related admissions to hospitals in New York and Philadelphia were rising by the 1910s and 1920s, and it was dawning on officials that addiction was skyrocketing among both the injured and recreational users (then called “vicious,” meaning their use rose from the world of vice). Soldier’s disease, in the words of New York City’s commissioner of health, had now become “the American Disease.”

  The Harrison Narcotics Act of 1914 severely restricted the sale and possession of heroin and other narcotic drugs, and by 1924 the manufacture of heroin was outlawed, twenty-six years after Bayer’s pill came to market. By the thirties, typical heroin users were working-class, and many of them were children of immigrants, along with a growing number of jazz musicians and other creative types, all now reliant on criminal drug networks to feed their vicious habit—and keep their dopesickness at bay. The addicted were now termed “junkies,” inner-city users who supported their habit by collecting and selling scrap metal. The “respectable” upper- and middle-class opium and morphine addicts having died out, the remaining addicted were reclassified as criminals, not patients.

  Gone and buried were the doctor-addicted opioid users once common, especially in small towns—think of Harper Lee’s morphine-addicted eccentric, Mrs. Dubose, from To Kill a Mockingbird, or the morphine-addicted mother who inspired Eugene O’Neill’s Long Day’s Journey into Night. Think of the “Des Moines woman [who] gave her husband morphine to cure him of chewing tobacco,” as one newspaper chortled. “It cured him, but she is doing her own spring planting.”

  Think of the time in 1914, decades before the term “neonatal abstinence syndrome” was coined (to describe the withdrawal of a baby born drug-dependent), when a Washington official wrote that it was “almost unbelievable that anyone for the sake of a few dollars would concoct for infant use a pernicious mixture containing…morphine, codeine, opium, cannabis indica, and heroin, which are widely advertised and which are accompanied by the assertion that they ‘contain nothing injurious to the youngest babe.’”

  Below the story, on the same newspaper page, appeared an ad for an opium “sanitorium,” a sprawling Victorian home in Richmond in which Dr. H. L. Devine promised that he could cure opium addiction in ten days to three weeks.

  But the yellowed newspaper warnings would become moot, like so many historical footnotes—destined to repeat themselves as soon as they receded from living memory.

  Despite all the technical, medical, and political sophistication developed over the past century, despite the regulatory initiatives and the so-called War on Drugs, few people batted an eye in the late 1990s as a new wave of opioid addiction crept onto the prescription pads of America’s doctors, then morphed into an all-out epidemic of OxyContin’s chemical cousin: Heinrich Dreser’s drug.

  No one saw the train wreck coming—not the epidemiologists, not the criminologists, not even the scholars who for decades had dissected the historical arc of Papaver somniferum, the opium poppy.

  Like Alexander Wood promoting his syringes, and Dreser with his sleepy frogs, Purdue Pharma’s David Haddox touted OxyContin for all kinds of chronic pain, not just cancer, and claimed it was safe and reliable, with addiction rates of less than 1 percent. Haddox heralded that statistic to the new army of pharmaceutical sales reps Purdue Pharma hired. They fanned out to evangelize to doctors and dentists in all fifty states with this message: Prescribing OxyContin for pain was the moral, responsible, and compassionate thing to do—and not just for dying people with stage-four cancer but also for folks with moderate back injuries, wisdom-tooth surgery, bronchitis, and temporomandibular joint disorder, or TMJ.

  The 1996 introduction of OxyContin coincided with the moment in medical history when doctors, hospitals, and accreditation boards were adopting the notion of pain as “the fifth vital sign,” developing new standards for pain assessment and treatment that gave pain equal status with blood pressure, heart rate, respiratory rate, and temperature. The seismic shift toward thinking of patients as health care consumers was already under way, as patients now rated their health care experiences in formal surveys, Press Ganey the largest among them, and doctors and hospitals alike competed to see who could engender the highest scores, incentivizing nurses and doctors to treat pain liberally or risk losing reimbursements. In 1999, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO), the nonprofit health care and hospital accreditation body, took the idea a step further, approving new mandatory standards for the assessment and treatment of pain.

  The next year, Purdue’s bean counters gushed about the prospects: “This presents Purdue with the opportunity to provide true value-added services as the ‘pain experts’ in this key area,” read the company’s budget plan. “We have an opportunity to be seen as a leader in helping hospitals meet the JCAHO requirements in this area through the development of pain assessment and pain management materials geared to the hospital setting.”

  To underscore such opportunities, the company planned to pass out $300,000 worth of OxyContin-branded scroll pens, $225,000 worth of OxyContin resource binders, and $290,000 worth of “Pain: The Fifth Vital Sign” wall charts and clipboards. With any luck, every nurse and doctor would soon be wandering the hospital halls, their name badges dangling from a Purdue-branded lanyard.

  A 2000 New York Times article reflected the new and widespread view among the vast majority of health care experts that pain had been grossly undertreated for too long. It featured the story of an older woman in a nursing home who’d been left to writhe in pain, given only Tylenol for the relief of her severe osteoporosis and pulmonary disease. The story demonstrated the growing concern that pain was woefully mismanaged due to outdated notions about addiction: “Many health care workers still erroneously believe that adequate pain relief can leave patients addicted to the drugs.”

  But what exactly was adequate pain relief? No one addressed that point, and no one could define it. No one questioned whether pain, invisible to the human eye, could actually be measured simply by asking the patient for his or her subjective opinion. Quantifying pain made it easy to standardize procedures, but experts would later concede that it was objective only in appearance—transition labor and a stubbed toe could both measure as a ten, depending on a person’s tolerance. And reliance on pain scales not only failed to correlate with improved patient outcomes; it also increased opioid prescribing and opioid abuse.

  “Every single physician I knew at the time was told to be much more serious about making pain a priority,” said Dr. John Burton, the head of emergency medicine for Carilion Clinic, the largest medical provider in western Virginia. “All it did was drive up our opioid prescribing without really understanding the consequences of what we were doing.

  “I can remember telling my residents, ‘A patient can’t get hooked on fourteen days’ worth of [opioid] pills.’ And I was absolutely wrong.”

  The Press Ganey survey upped the pressure, recalled an emergency-room doctor who practiced in St. Louis. “We quickly found that drug-seeking patients or others sending off vibes we didn’t like would give us bad reviews,” remembered Dr. David Davis. “When you’re really busy and interrupted all the time with seriously sick patients, it’s so easy to give them an IV dose of Dilaudid or morphine, and kinda kick the can down the road.

  “I did it myself, though I knew it was not the right thing to do. It was pushed on us big time, the idea that they can’t become addicted if you’re using opioids to treat legitimate pain. The advent of the pain score, we now think, got patients used to the idea that zero pain was the goal, whereas now doctors focus more on function if the pain score is three or four.”

  In the New Zealand hospitals where Davis had worked earlier in his career, physical therapy, anti-inflammatories, biofeedback, and acupuncture were often the first-line measures; American insurance companies in the age of managed care, by contrast, were more likely to cover opioid pills, which were not only cheaper but also considered a much quicker fix.

  Little did Davis or the other ER docs understand that the routine practice of sending patients home with a two-week supply of oxycodone or hydrocodone would culminate by the year 2017 in a financial toll of $1 trillion as measured in lost productivity and increased health care, social services, education, and law enforcement costs.

  Throughout OxyContin’s earliest years, only a few voices of dissent rose to remind doctors that, historically, there had been risks associated with prescribing narcotics, and even those warnings were timid. Dartmouth medical school substance abuse researcher Dr. Seddon R. Savage argued that addiction risks for pain patients on narcotics tended to increase the longer the patients used the drugs. “It is tempting to dismiss all concerns regarding therapeutic opioid use as irrelevant,” she wrote in a physician journal in 1996. “That would clearly be a mistake.” A colleague argued in the same paper that there simply wasn’t enough good data available to make a case for or against liberal opiate prescribing.

  The first real dissent would come soon, though, in the unlikely form of a country doctor and one thoroughly pissed-off Catholic-nun-turned-drug counselor. Though Dr. Art Van Zee and his colleague Sister Beth Davies would sound the epidemic’s first sentinel alarm from Appalachia, they were greeted with the same indifference as the Richmond doctor who demanded prompt action to curb the rampant use of opioids in 1884, and the inventor of morphine, who strongly urged caution in 1810. Their outsider status disguised both the depth and the relevance of their knowledge.