  In my memory, the polio scare (as in my parents’ phrasing: “There’s a polio scare—let’s just hope it doesn’t lead to an epidemic!”) came regularly each year after school was out in June, and when it did, beaches and swimming pools were closed, I was warned to stay out of crowds, to wash my hands frequently, to be careful about the water I drank, and never to swallow water from a lake or stream. This happened both in Brooklyn and in upstate New York, where my mother, brother, and I either went to sleepaway summer camps (my mother, in exchange for tuition for me and Robert, worked as camp nurse) or rented rooms in a large old farmhouse where there was a communal kitchen, and where my mother’s brother and four sisters, with their children, also rented rooms.

  I recall, too, an earlier time—I was perhaps nine or ten years old—when, in my elementary school, P.S. 246, we lined up in the auditorium, the boys in white shirts and red ties, the girls in white blouses and dark skirts, to be given vaccinations. Class by class, we filed down to the front of the auditorium, and one by one we rolled up our sleeves, stepped forward, and received our shots. This occurred a few years after World War II, and we were told that by receiving these injections without complaint or tears we were heroes too: young Americans mobilizing against treacherous enemies—disease, disability, and epidemic—in order to keep our nation healthy, strong, and free.

  I remember watching my friend Ronald Granberg—a tall, broad-shouldered, red-headed boy chosen to lead the Color Guard and carry the American flag down the center aisle at the start of assembly each Friday morning—get his shot, take a drink from the water fountain to the right of the stage, and faint straightaway into the spigot, chipping a sizeable triangle from his right front tooth. Twenty years later, our teacher, Mrs. Demetri (who lived around the corner from me, and gave me oil painting lessons in her apartment at night), told me she met Ronald in the street one day when he was a grown man, and that they recognized each other immediately. “Open your mouth, Ronald—” Mrs. Demetri told me she commanded him first thing “—and show me your tooth!”

  My friends and I grew up and came of age in a time when, as David Weatherall writes, “it appeared that medical science was capable of almost anything”—in a time when the diseases that throughout our parents’ and grandparents’ lifetimes had been the chief instruments of infant and childhood death, and of crippling lifelong disabilities, were disappearing.

  In these pre-AIDS years, Jerry explains, citing the success, among other things, of the worldwide program to eliminate smallpox, the medical community seemed to believe that infectious disease was, by and large, a thing of the past.

  “When I was doing my internship, I was one of the few young doctors choosing to go into infectious disease,” he says. “I did it because I wanted to work in areas of our country and the Third World—poor areas—where there was still work to do that might make a real difference, and where these diseases were still taking an enormous toll. For the most part, however, I was anomalous in my choice of specialty. Back then, infectious disease was certainly not considered a promising specialty for medical students and young doctors, either clinically or in terms of research.”

  Optimism about the “conquest” of disease—not only the infectious diseases, but all diseases—was widespread. The surgeon general of the United States, William H. Stewart, was frequently quoted as having declared, in 1967, that “it was time to close the book on infectious disease.” The sentiment, Jerry confirms, was widely accepted as a truism (even though the surgeon general, it turns out, never said it!), and it has continued, despite the AIDS pandemic and the emergence—and reemergence—of other infectious diseases, to prevail.*

  In a more recent example, we have Dr. William B. Schwartz, in Life Without Disease: The Pursuit of Medical Utopia (1998), asserting that if “developments in research maintain their current pace, it seems likely that a combination of improved attention to dietary and environmental factors along with advances in gene therapy and protein-targeted drugs will have virtually eliminated most major classes of disease” (italics added).* More: a molecular understanding of the process of aging, he predicts, may lead to ways of controlling the process so that “by 2050, aging may in fact prove to be simply another disease to be treated.”

  “The virtual disappearance overnight of scourges like smallpox, diphtheria, poliomyelitis, and other infectious killers, at least from the more advanced countries,” Weatherall writes about the post-World War II period, “led to the expectation that spectacular progress of this kind would continue.”*

  “But this did not happen,” he explains. “The diseases that took their place—heart attacks, strokes, cancer, rheumatism, and psychiatric disorders—turned out to be much more intractable.”

  The more we were able to eliminate the infectious diseases that led to premature death, that is, the more chronic and degenerative diseases such as cancer and heart disease replaced them as our leading causes of sickness and death. In the 1880 federal census, for example, neither cancer nor heart disease—our major killers a hundred years later—was listed among the ten leading causes of death.

  Throughout the nineteenth century, gastrointestinal diseases, especially among infants and children (manifested largely as diarrheal diseases), were the leading causes of death. By the end of the nineteenth century, in large part because of public health and public works projects (clean water, sewage, sanitation), deaths from gastrointestinal diseases had declined, and tuberculosis and respiratory disorders (influenza, pneumonia) emerged as the major causes of death.

  In 1900, neoplasms (cancer) accounted for less than 4 percent of all deaths and ranked sixth as a cause of mortality, while diseases of the heart accounted for slightly more than 6 percent and ranked fourth.* Eleven years later, in 1911—the year of my mother’s birth (she was one of eight children, two of whom died in infancy)—when respiratory diseases and tuberculosis were still the primary causes of death, heart disease and cancer accounted for nearly 17 percent of total mortality.

  From 1911 through 1935, mortality from tuberculosis declined steadily, and influenza and pneumonia became, and remained, the two leading causes of death, taking their highest toll among people forty-five years and older, while the figure for heart disease and cancer, combined, rose to 30.4 percent.

  By 1998, however, cancer and heart disease had replaced pneumonia and influenza as our leading causes of death, diseases of the heart accounting for 31 percent and malignant neoplasms for 23.2 percent. Of the fifteen leading causes of death, only pneumonia and influenza (3.6 percent, combined) now fell directly into the infectious group, and they took their greatest toll largely among individuals afflicted with a variety of other health problems, many of them deriving from what epidemiologists call “insult accumulation”—the long-term effects of organ damage caused by the childhood illnesses these individuals had survived.*

  But we should note that diagnostic categories and criteria were, then as now—especially with respect to heart disease—ever changing. “We didn’t even know what a heart attack was until some time in the early years of the twentieth century,” Rich says. “It hadn’t really been invented yet—not until James Herrick discovered and wrote about it, and it took a while for the medical community to believe him.”

  Until 1912, when Herrick published a five-and-a-half-page paper in the Journal of the American Medical Association, “Clinical Features of Sudden Obstruction of the Coronary Arteries,” the conventional wisdom was that heart attacks were undiagnosable, fatal events that could only be identified on autopsy. Although Herrick did not claim he was discovering anything new, his conclusions represented a paradigm shift—a radically new way of thinking about old problems that called conventional beliefs into question.

  By comparing the symptoms of living patients with those of patients who, on autopsy, were found to have had blocked arteries, Herrick demonstrated that coronary artery disease was recognizable in living patients. At the same time, he offered evidence suggesting that a totally blocked major coronary artery, as in my case, need not cause death, or even a heart attack. He concluded that heart attacks were most likely caused by blood clots in the coronary arteries, and that some heart attacks were survivable.

  “Unsurprisingly,” Stephen Klaidman writes, “no one believed him.* The old paradigm was not ready to topple. Herrick said that when he delivered the paper, ‘It fell like a dud.’”

  Six years later, in 1918, Herrick provided additional evidence to support his theory, including comparative animal and electrocardiograph tracings that identified the existence of blocked coronary arteries, and this time, Klaidman writes, “the livelier minds in the medical profession finally began to take notice.”

  Although Herrick’s theory remained the conventional wisdom from 1920 to 1960, at which time it began to be questioned, it was not until 1980 that another American physician, Marcus DeWood, using a technique unavailable to Herrick—selective coronary angiography—proved that it was, in fact, blood clots within the coronary arteries, and not the slow accretion of atherosclerotic plaque, that caused most heart attacks. Thus was Herrick’s theory, nearly seventy years after he first proposed it, fully confirmed.

  Thus, too, Rich contends, do we see how slowly and indirectly we often arrive, in medicine, at the knowledge that allows physicians to be useful to their patients.

  “And the most important element in our ability to be useful,” Rich says, “and to continue to test old and new hypotheses, and so discover those things that, as with Herrick, allow us to be increasingly useful, remains what it has been since I began as a medical student: listening.

  “Listening to the patient has been, is, and will continue to be, I believe, the hallmark of medical diagnosis, the most fundamental element in the practice of good medicine. Wasn’t it Osler who said, ‘Listen to the patient—and the patient will give you the diagnosis’? Well, he was right. For it is the careful taking of a history—and the active listening and observing that accompanies this—that enables doctors such as Herrick to see what’s really there and what others, alas, too often do not see.

  “This,” Rich says, “is what I continue to believe is and should be at the true heart of medicine—the time-honored art of medicine—and, alas, it is fast disappearing.”

  In the years before Rich and I were born, and before cancer and heart disease had become our major killers—in the years when infectious and respiratory diseases were still the primary causes of death, and when doctors often had few resources at their disposal other than listening and consoling—the deaths of infants and children were grimly commonplace, and rates of infant and child mortality substantially, grievously higher than they are now.

  In 1900, of the fifteen leading causes of death, infectious diseases accounted for 56 percent of the total.* When total mortality from all causes was taken into account, the three cardiovascular-renal conditions—heart disease, cerebral hemorrhage, and chronic nephritis—came to only 18.4 percent.

  Between 1900 and 1904—the year my father was born—death rates per thousand for white males and females under the age of one were 154.7 and 124.8, respectively. (Comparable rates during these years for nonwhite Americans—mostly blacks—were more than twice as high.) The mortality rates for white males and females between the ages of one and four during these same years were 17.2 and 15.9, and for nonwhites 40.3 and 30.6.* However, by 1940—two years after I was born—the infant mortality rate had fallen by nearly 75 percent, while in the one-to-four-year age group the figures had fallen even more dramatically (to 3.1 per thousand for males and to 2.7 for females).* Moreover, infectious disease had become a minor cause of mortality.* Whereas mortality rates for measles, whooping cough, and scarlet fever, for example, were 13.3, 12.2, and 9.6 per hundred thousand in 1900, in 1940 they were, respectively, 0.5, 2.2, and 0.5.

  During the first half of the twentieth century, average life expectancy for Americans rose by more than 40 percent, from 47.3 in 1900 to 68.2 in 1950 (comparable figures for blacks were 33.0 and 60.7). In the second half of the century, life expectancy continued to rise, and infant and child mortality rates continued to decline, but to a much lesser extent: from 1950 to 1998, life expectancy rose by only slightly more than 10 percent—from 68.2 to 76.5 for the total population, and from 60.7 to 71.1 for blacks, while infant mortality declined from 29.2 in 1950 to 7.2 in 1998. And while, in 1900, more than 3 out of every 100 children died between their first and twentieth birthdays, today fewer than 2 in 1,000 do. Moreover, the American Academy of Pediatrics reports, “nearly 85% of this decline took place before World War II, a period when few antibiotics or modern vaccines and medications were available.”* (Note, though, the unexpected finding that, based on 1998 figures, the United States had the slowest rate of improvement in life expectancy of any industrialized nation.)

  Just as Rich catalogues the remarkable advances he has seen in the treatment of heart disease since 1959, when he began his medical studies—the advent of monitors that can detect potentially lethal heart arrhythmias, of the cardiac care unit, of medications that break up clots and prevent atherosclerosis, of pacemakers, ventricular assist devices, electronic defibrillators, and of various new surgical procedures (bypasses, transplants, angioplasties, stenting)—so my other friends list the new means they have at their disposal for treating disease and the symptoms of disease: drugs and regimens that control high blood pressure, effective analgesic medications for the management of rheumatic disorders, remarkable diagnostic aids such as MRIs and CAT-scans, powerful medications that can put diseases such as AIDS, depression, schizophrenia, Huntington’s chorea, multiple sclerosis, and various cancers into short- and long-term remission.

  Not only can we now prolong life in ways that were previously not possible, but we have, especially in the last quarter century, developed effective ways to enhance the day-to-day quality of the lives being prolonged. Twenty years ago, as Rich and Dr. Hashim acknowledge, little could have been done for me. I would most probably have died, or if not, might well have been seriously disabled for the rest of my life.

  But the optimism bred a half century ago by the elimination of many childhood diseases, and by the gains we have made since then, has also, in the practice of medicine, become responsible for dangerous illusions, false hopes, and wasteful policies.

  The belief, for example, that all conditions are amenable to “cure”—the various “wars” against diseases that attempt to persuade us that we can “battle” and “conquer” diseases the way we battle and conquer wartime enemies—by “mobilizing” resources, and “attacking” alien invaders (bacteria, viruses)—tends to distort our medical and human priorities, and to show little insight into how the biological world actually works, and how scientific advances come into being.* It also elevates the seeming science of medicine above the art of medicine both by greatly exaggerating the power of technology (often mistaken for and confused with “science”) to improve and save lives, and by falsely dichotomizing the science of medicine and the art of medicine.

  One effect of this is that we often begin and end by treating patients not as people—individual human beings with unique histories and identities—but as interchangeable humanoid vessels in which various diseases, along with treatments and cures for diseases, will interact in predictable, uniform ways. Such beliefs are championed by drug companies, medical groups, and hospitals in public relations and advertising campaigns that continually deluge the public with claims made no less dubious and misleading by their familiarity and vagueness.

  “Discover the Only Cholesterol Medicine Proven to Do All This,” states a February 12, 2001, full-page ad in the New York Times for Pravachol. There follows a checklist contending that Pravachol will lower “bad” cholesterol, raise “good” cholesterol, “extend life by reducing the risk of a heart attack,” and also reduce the risk of first and second heart attacks, strokes, atherosclerosis, bypass surgery, and angioplasty. At the top of the page, this suggestion: “Clip this ad and bring it to your doctor.” (The United States remains the only industrialized nation that allows prescription drugs to be advertised directly to the public.)*

  In widely dispersed print and television ads for Zocor, Dan Reeves, an NFL football coach, confides that “suddenly, lowering my high cholesterol became even more important than football.”* After undergoing emergency bypass surgery, Reeves reports he “had a full recovery, and was even able to coach [his] team in the biggest game of the season four weeks later.” Having learned to “take better care of [himself]” he advises the following: “When diet and exercise are not enough, ZOCOR can help people with high cholesterol and heart disease live a longer life by reducing the risk of a heart attack” (italics added).

  Columbia Presbyterian and New York Weill Cornell Cancer Centers claim, in typically militaristic language, that they have been “at the forefront of the fight against cancer” and are now “working together to defeat this relentless disease.”* In “one of the boldest initiatives ever undertaken,” they offer “new hope that the fight will be won” because at these cancer centers “experts” are helping to “uncover genes that cause cancer—essential to conquering the disease.”

  And America’s Pharmaceutical Companies, the public relations firm that represents the drug industry (“leading the way in the search for cures”), proclaims that “pharmaceutical company researchers are working hard to discover breakthroughs that will help to make many illnesses and diseases a thing of the past and bring more patients new hope for a better tomorrow.”

  Phil is blunt concerning such seemingly unexceptional claims and the false hopes and illusions they inspire, as well as the fact that patients, with increasing frequency, are coming to their doctors and demanding the medications they have read and heard about: mostly what have become known as lifestyle medications (Viagra, Prozac, Paxil, Rogaine) and the statins (Lipitor, Mevacor, Pravachol, Zocor), whose ads repeatedly suggest, in addition to banalities about “new hope,” “new cures,” and “better tomorrows,” what has not been proven: that these drugs will “extend life” and enable us to “live longer.”*