
Thursday, April 30, 2015

Too much medical care: do we know it when we see it?

The year after I moved to Washington, DC, I visited an ophthalmologist for a routine vision examination and prescription for new glasses. Since undergoing two surgical procedures to correct a "lazy eye" as a child, I hadn't had any issues with my eyesight. Part of my examination included measurement of intraocular pressures, a test used to screen for glaucoma. Although my work for the U.S. Preventive Services Task Force was in the future, I already understood the lack of evidence to support performing this test in a young adult at low risk. Not wanting to be a difficult patient, though, I went along with it.

My intraocular pressures were completely normal. However, the ophthalmologist saw something else on her examination that she interpreted as a possible early sign of glaucoma, and recommended that I undergo more elaborate testing at a subsequent appointment, which I did a couple of weeks later. The next visit included taking many photographs of my eyes as I tracked objects across a computer screen, as well as additional measurements of my intraocular pressures. These tests weren't painful or very uncomfortable, but they made me anxious. Glaucoma can lead to blindness. Was it possible that I was affected, even though no one in my family had ever been diagnosed with this condition? Fortunately, the second ophthalmologist who reviewed my results reassured me that the tests were normal, and admitted that they had probably been overkill in the first place. "Dr. X [the first ophthalmologist] is a specialist in glaucoma," he said, by way of explanation. "Sometimes we tend to look a little too hard for the things we've been trained to see." (I appreciated his candor, and he has been my eye doctor ever since.)

I was reminded of this personal medical episode while reading a commentary on low-value medical care in JAMA Internal Medicine by Craig Umscheid, a physician who underwent a brain MRI after questionable findings on a routine vision examination suggested the remote possibility of multiple sclerosis, despite the absence of symptoms. Although Dr. Umscheid recognized that this expensive and anxiety-inducing test was low-value, if not worthless, he went along with it anyway. "Despite my own medical and epidemiologic training," he wrote, "it was difficult to resist his [ophthalmologist's] advice. As my physician, his decision making was important to me. I trusted his instincts and experience."

If physicians such as Dr. Umscheid and I didn't object to receiving what we recognized as too much medical care when we saw it, it should not be a surprise that, according to one study, many inappropriate tests and treatments are being provided more often, not less often. In 2009, 5.7% of men age 75 and older received prostate cancer screening, compared to 3.5% in 1999; 38% of adults received a complete blood count at a general medical examination, compared to 22% in 1999; and 40% of adults were prescribed an antibiotic for an upper respiratory infection, compared to 38% in 1999. (If you usually have complete blood counts done at your physicals or swear by the Z-PAK to cure your common cold, we can discuss offline why both of these are bad ideas.)

One of the obstacles to reducing unnecessary medical care (also termed "overuse") is that outside of a limited set of tests and procedures, physicians and policymakers may disagree about when care is going too far. The American Board of Internal Medicine Foundation's Choosing Wisely initiative is a good start, but these lists consist of low-hanging fruit accompanied by caveats such as "low risk," "low clinical suspicion," and "non-specific pain." To a clinician who feels, for whatever reason, that a certain non-recommended test or treatment is needed for his patient, these qualifications amount to get-out-of-jail-free cards. It's easy to say that payers should just stop paying for inappropriate and potentially harmful medical care, but as an analysis from the Robert Wood Johnson Foundation explained, this is much easier said than done. If a panel of specialists convened to review the medical care that Dr. Umscheid and I received, would they unanimously deem it to have been too much? I doubt it.

Similarly, although endoscopy for uncomplicated gastroesophageal reflux disease is widely considered to be unnecessary, that didn't stop an experienced health services researcher from undergoing this low-value procedure after a few days of worsening heartburn. Comparing her personal experience to the (superior) decision-making processes that occur in veterinary medicine, Dr. Nancy Kressin wrote in JAMA:

Until patients are educated and emboldened to question the value of further testing, and until human health care clinicians include discussions of value with their diagnostic recommendations, it is hard to foresee how we can make similar progress in human medicine. Patients may be fearful that there is something seriously wrong that needs to be identified as soon as possible, they are often deferential to their clinicians' greater knowledge of the (potentially scary) possibilities, and some patients want to be sure that everything possible is done for them, without recognizing the potential harms of diagnostic tests themselves, the risks of overdiagnosis, or the sometimes limited value in knowing the cause of symptoms in determining the course of therapy.

Regardless of future insurance payment reforms, both doctors and patients will have key roles to play in recognizing when medical care is too much. More widespread uptake of shared decision-making, while hardly a panacea, would call attention to the importance of aligning care with patients' preferences and values and to the need for decision aids that illustrate the benefits and harms of often-overrated interventions. Changing a medical and popular culture that overvalues screening tests relative to their proven benefits may be more challenging. A survey study published in PLOS One affirmed previous findings that patients are far more enthusiastic and less skeptical about testing and screening than they are about medication, even though the harms of the former are often no less than those of the latter. I agree with the authors' conclusions:

Efforts to address overuse must involve professional medical associations, hospital systems, payers, and medical schools in modifying fee-for-service payment systems, enabling better coordination of care, and integrating lessons about overuse into training and continuing education. But the preferences of active patients nonetheless merit attention. Both the mistrust of pharmaceuticals and the enthusiasm for testing and screening reflect individuals’ efforts to take care of their health. The challenge is to engage patients in understanding the connection between over-testing and over-treatment, to see both as detrimental to their health, and to actively choose to do less.

**

This post first appeared on Common Sense Family Doctor on January 21, 2013.

Wednesday, April 22, 2015

Should women start having mammograms before age 50?

The best answer to this question, I tell both my patients and loved ones, is: it depends on you.

As the U.S. Preventive Services Task Force affirmed this week in its updated draft recommendations on breast cancer screening, "The decision to start screening mammography in women prior to age 50 years should be an individual one. Women who place a higher value on the potential benefit than the potential harms may choose to begin biennial screening between the ages of 40 and 49 years." The Task Force went on to suggest that women with first-degree relatives who had breast cancer might be more motivated to start screening in their 40s.

What this decision shouldn't depend on is being bullied by one's doctor into getting a mammogram "just to be safe." Screening mammography's benefits and harms are closely balanced, and as two of my mentors in preventive medicine observed, some women might reasonably decide to say no:

Over the years we have learned more about the limited benefits of screening mammography, and also more about the potential harms, including anxiety over false-positive results and overdiagnosis and overtreatment of disease that would not have caused health problems. More and more, the goal for breast cancer screening is not to maximize the number of women who have mammography, but to help women make informed decisions about screening, even if that means that some women decide not to be screened.

Two women at "average risk" for breast cancer might make different decisions after they turn 40, depending on how concerned they are about dying from cancer, being diagnosed with cancer, and their tolerance for harms of screening. One well-informed female science journalist might choose to start being screened. Another female reporter, equally well-informed, might choose to opt out. Neither of these decisions is right or wrong on an individual or population level, regardless of the apocalyptic protests of self-interested radiology groups.

What concerns me is how current quality measurement and pay-for-performance approaches could end up pressuring more doctors to behave like bullies and drive up health care costs. Fee-for-service Medicare already spends about $1 billion each year on mammography; across all payers, about 70% of U.S. women age 40 to 85 years are screened annually at a cost of just under $8 billion. Doctor A is not necessarily a better doctor who deserves higher pay than Doctor B because more of Doctor A's patients get mammograms. In fact, the opposite might easily be true.

A recent study estimated that patients and insurers in the U.S. spend an additional $4 billion annually on working up false-positive mammogram results or treating women with breast cancer overdiagnoses. That's an extraordinary amount to spend for no health benefit, and it could be substantially less if physicians had the time and resources to explain difficult concepts such as overdiagnosis. But that doesn't appear to be where we're headed.

Finally, the notion, now written into law in nearly half of U.S. states, that women with dense breast tissue must be notified so that they can seek supplemental testing for mammography-invisible cancers is particularly misguided. The USPSTF's review found no proof that breast ultrasound, MRI, or anything else improves screening outcomes in women with dense breasts, and a sizable percentage of women transition between breast density categories over time.

**

Portions of this post first appeared on the AFP Community Blog.

Thursday, April 16, 2015

What can Rwanda teach the U.S. about primary care?

The relative underinvestment in primary care in the U.S. has a great deal to do with the fact that we spend far more on health services than any other country in the world yet rank near the back of the pack among high-income countries on key health metrics such as life expectancy, infant mortality, and disability. Although economic inequality, lack of insurance coverage, and shrinking public health budgets are also part of the problem, I'd argue that diverting dollars from redundant multi-million dollar proton beam facilities to provide a patient-centered medical home for every American would have positive effects on population health.

Even though I feel that the U.S. has a lot to learn from other countries about building infrastructure to support high-quality primary care, it was still hard for me to get my head around the premise of a 2013 Atlantic headline: "Rwanda's Historic Health Recovery: What the U.S. Might Learn." Like most Americans who have never traveled there, I suspect, my impressions of Rwanda have been strongly influenced by popular dramatizations of the 1994 genocide such as the movie "Hotel Rwanda" and Immaculee Ilibagiza's memoir Left to Tell. I had a difficult time imagining how any semblance of a functioning health system could have emerged even two decades later, much less a system that would have something to teach the U.S. But a BMJ article by Paul Farmer and colleagues documented impressive gains in Rwandan life expectancy, led by declines in morbidity and mortality from tuberculosis, HIV, and malaria that resulted not only from investments in lifesaving drugs but also from investments in preventive and primary care. For example, 93% of Rwandan girls have received the complete HPV vaccine series to prevent cervical cancer, compared to 38% of eligible U.S. girls in 2013.

Here's the thing, though: the foot soldiers in the Rwandan primary care revolution aren't doctors. In fact, there were only 625 practicing physicians in the entire country in 2011. (According to a report published in the same year, Washington, DC alone has about 3,000.) How, then, has Rwanda been able to staff its network of community health centers and reach out to its eleven million people, many of whom are so poor that they can't afford the national health insurance premium of $2 per person? They do it primarily by relying on community health workers, trusted local residents who receive a minimum of basic medical training and are then integrated into more comprehensive primary care teams. As described further in a BMC Health Services Research article by the group Partners in Health:

Each district is served by a network of community health workers (CHWs) — three per village — offering health education, basic preventive and curative services, and family planning. CHWs are supported by local health centers, which serve approximately 20,000 people and are staffed by nurses, most of whom have a secondary school education level. Health centers provide vaccinations, reproductive and child health services, acute care, and diagnosis and treatment of HIV, tuberculosis, and malaria. District hospitals, staffed in part by 10-15 generalist physicians, provide more advanced care, including basic surgical services, such as cesarean sections.
Image courtesy of BMC Health Services Research.

The lesson to take home isn't that the U.S. can get away with training fewer primary care physicians than it already does. Indeed, Rwanda has every intention of training more doctors with assistance from other countries, including the U.S. What's important is the pyramidal structure of their health system, with primary care at the base and more specialized care at the apex. If you took the U.S. physician workforce, which consists of about 70% specialists and 30% generalists, and mapped it to a similar structure, it would look more like this:


At the top, you have the super sub-specialists, who are experts on a single narrow spectrum of diseases confined to one organ system (e.g., hepatologists). Lower down are the ordinary specialists, such as gastroenterologists, cardiologists, and pulmonologists, whose expertise is limited to a single organ system and age group (e.g., adults). Still lower are generalists whose scope of practice is limited by age group. Finally, at the bottom, are the family physicians, the only type of physician whose scope is not limited by age, gender, or organ system.

The problem with this upside-down pyramid is that it's inherently unstable. In Washington, DC, it's easier for a patient with musculoskeletal low back pain to get an appointment with a spine surgeon, or for a patient with panic attacks to see a cardiologist, than it is to find a family physician. You can get a same-day MRI for any number of problems that don't require imaging at all. Such a health system is inefficient and wasteful at best, harmful at worst, and destined to get the extremely poor results it does. To improve population health in the U.S., we need to flip the pyramid so that primary care services are the base for all other health care structures.

**

This post first appeared on Common Sense Family Doctor on December 2, 2013.

Tuesday, April 7, 2015

Palliative care makes a difference at all ages

A troubling study published earlier this year in the Annals of Internal Medicine reported that patients and their caregivers were more likely to report pain, depression, and periodic confusion during the last year of life in 2010 than in 1998. This worsening trend occurred despite increasingly frequent calls to improve end-of-life care communication and the interim publication of practice guidelines on palliative care for adults from the National Consensus Project for Quality Palliative Care and the American College of Physicians. Although the reasons for underutilization of palliative care are not entirely clear, persistent misconceptions about these services being the equivalent of "giving up" on patients or hastening their death likely play a role.

On the other end of the age spectrum, a Close-ups feature in the April 1st issue of American Family Physician highlighted the benefits of providing palliative care at the beginning of life: in this case, to a baby with trisomy 13 diagnosed by prenatal genetic testing. The devastated parents testified to the importance of their family physician providing support and guidance throughout the pregnancy and after their child's birth:

She helped us understand the decisions we had to make and helped us express our goals for the care of our unborn daughter. We wanted our daughter to have a comfortable life - for however long she lived - and a natural death. At the same time, we wanted as few medical interventions as possible to avoid unnecessarily prolonging her death or suffering.

Although palliative care has been less well studied in children than in adults, critically ill neonates and infants who received palliative care consultations spent fewer days in intensive care units, received fewer blood draws and invasive interventions, and received more referrals to chaplains and social services than comparable patients with life-limiting diagnoses. Perinatal hospice programs take perinatal palliative care one step further and provide compassionate, multidisciplinary support for parents from the time of prenatal diagnosis through the remainder of pregnancy and their child's birth and death. Clinicians who provide maternity and/or newborn care and would like to learn more about perinatal hospice can consult the website perinatalhospice.org, which lists contact information for more than 200 programs across the United States and internationally.

**

This post first appeared on the AFP Community Blog.

Thursday, April 2, 2015

The way we provide care near the end of life requires resuscitation

During one of the plenary sessions at the Lown Institute's Road to Right Care conference in March, a speaker recounted how overdiagnosis and overtreatment ruined her father's last year of life. Diagnosed with symptomatic multi-vessel coronary artery disease but otherwise in good health and independent at age 85, her father underwent successful coronary artery bypass surgery. His postoperative course was happily uneventful, except for a single stool sample that was positive for blood.

Why her father's stool was tested at all was unclear, since his blood counts were normal during his hospitalization. But his heart surgeon nonetheless strongly recommended that he see a gastroenterologist and undergo a colonoscopy. She tried to dissuade her father from doing this, since at his age even an advanced colorectal cancer would be unlikely to progress enough to cause symptoms before he died a natural death, most likely of heart disease. But her father was accustomed to following doctors' orders, so he dutifully underwent the colonoscopy, which showed a single precancerous lesion.

That should have been the end of the story, but since the gastroenterologist was unable to completely remove the lesion, he recommended consulting a surgeon to operate and take out the entire affected section of large intestine. Her father was barely four weeks out from heart surgery, and she again advised him to disregard this spectacularly ill-advised plan. But he wanted to get everything taken care of, to put this all behind him, so he consented to going under the knife again. This time, the postoperative course did not go well. He developed profuse and unremitting diarrhea, most likely from an antibiotic given prior to the surgery. His doctor pronounced the surgery a success - the cancer was cut out, after all - and expressed little interest in dealing with the diarrhea. Her father was then transferred to a nursing home, where his diarrhea continued to resist all treatment, and where he died, miserable beyond all imagining, six months later.

Conference organizer and Lown Institute senior vice president Shannon Brownlee told another sad end-of-life story about her own father in a book review in the current issue of Washington Monthly. The article's subtitle just about says it all: "How Medicare and other federal subsidies rope the elderly into painful, futile, and costly end-of-life care." Despite her father's expressed wish to never go to "the big hospital in Portland" again, he not only ended up there anyway, but underwent a totally unnecessary nuclear stress test and was hooked up to intravenous nutrition before his hospitalist could be persuaded to call in the palliative care team. Brownlee minces no words in describing the deficiencies of what passes for end-of-life care in America:

When a frail, elderly person gets sick, takes a fall, or has trouble breathing, it’s as if they have stepped onto a slippery chute leading straight into the hospital, no matter how fervently they and their families might wish to avoid invasive treatment as they age and approach death. That’s because hospital services are what our medical industrial complex has been built to offer, and delivering invasive end-of-life care is the job for which we have trained our doctors and nurses. ... What we don’t do is train clinicians to talk to patients, and what we don’t have is the community-based infrastructure for delivering “high touch” care to people where they live.


I've written before about my belief that the future of medicine is low-tech and high-touch, and I agree with Brownlee that changing Medicare regulations that reward ineffective "technology-rich, hospital-centric" interventions over house calls and social services that help elderly persons age in place is a necessary first step in resuscitating the way we provide care near the end of life. It's equally critical that we change the mindsets of physicians who see their roles as sustaining life at all costs ("doing everything," in classic medical parlance) even when they are only prolonging death. Internal medicine resident Aaron Stupple recently made a highly sensible proposal in an editorial in BMJ: pair advanced cardiovascular life support (ACLS) training with communication training about palliative care:

Coupling ACLS with communication training has several advantages. Firstly, it legitimizes the skill set as an important and valid component of today’s medical practice. ... Secondly, affixing communication training to mandatory ACLS training binds this material to an established curriculum with a good track record of reliability and measurability. ... Thirdly, all clinical disciplines receive ACLS training, so it could be used to teach a common message and an essential skill set.

Alas, Dr. Stupple's proposal makes so much sense that I fear it may be ignored. How long have we been trying to change the health care system to protect older patients from harmful interventions near the end of life? I remember reading the late surgeon Sherwin Nuland's How We Die in college and being shocked that most of us will die in hospitals, receiving "heroic" interventions that we don't want and that won't do us a bit of good anyway. That was more than twenty years ago, and very little has changed. Let's spread the word about the Right Care Movement and dedicate ourselves to making sure I won't be able to write that again twenty years from now.