Last week was a busy one: I presented at the American Academy of Family Physicians Assembly (on lung cancer) and the Family Medicine Education Consortium Northeast Region Meeting (on prevention politics); participated in a Lown Institute-sponsored meeting on improving community care for patients with chronic and complex health conditions; and attended a ceremony recognizing the accomplishments of this year's Pisacano Leadership Foundation scholars. Meanwhile, the collection of articles that I keep as fodder for this blog continued to grow. Here, in no particular order, are some reflections on recent publications that illustrate how providing too much medicine inflates costs, wastes health care resources, and ultimately harms patients.
Cardiac catheterization can be a life-saving procedure for patients in the throes of a heart attack, but it doesn't help persons without cardiac symptoms or improve outcomes in patients with stable angina. So why is this procedure performed so often on these patients? A study in JAMA Internal Medicine qualitatively analyzed discussions about the benefits and risks of coronary angiography and percutaneous coronary interventions (PCI) between cardiologists and patients with stable coronary artery disease. Cardiologists were far more likely to exaggerate the benefits of angiography and PCI than to accurately state that, compared to medical management, PCI can reduce anginal symptoms but does not lower patients' risk of death or myocardial infarction. Another study in the same journal evaluated data from a national registry of 544 hospitals that performed PCI between 2009 and 2013. One in four patients who underwent elective angiography reported no symptoms (range across hospitals, 1 to 74 percent). In addition, hospitals that did more angiograms on asymptomatic patients were also more likely to perform PCI for inappropriate reasons. See it, stent it?
One of the patient-centered medical home criteria on which my clinical practice was recently flagged as deficient is screening adults and adolescents for depression: we weren't doing enough of it. Although routine screening is recommended by the U.S. Preventive Services Task Force, others, including the Canadian Task Force, have concluded that the evidence doesn't support this position. How could screening for depression possibly be harmful? By leading to overtreatment of persons with few depressive symptoms. A study in the Journal of the American Board of Family Medicine found support for this hypothesis by analyzing data from a randomized trial of depression interventions at primary care practices in California. The authors found that administration of the 9-item Patient Health Questionnaire (PHQ-9), a commonly used screening tool, was associated with 3 times the odds of a depression diagnosis and 4 times the odds of antidepressant prescribing for persons who, based on their low PHQ-9 scores, were extremely unlikely to have major depression or benefit from medication.
Overdiagnosis and overtreatment affect patients of all ages. In a commentary published in Pediatrics, several prominent academic pediatricians highlight a variety of childhood conditions that are commonly overdiagnosed, including attention deficit-hyperactivity disorder, food allergy, hypoxemia in bronchiolitis, and obstructive sleep apnea. For example, a randomized trial published in JAMA found that infants with mild bronchiolitis whose pulse oximetry measurements were artificially inflated by 3 percentage points were substantially less likely to be admitted to the hospital, but otherwise did just as well as infants in the control group. These findings suggest that emergency medicine physicians and pediatricians rely too much on oxygen saturation (not shown to increase risk of serious outcomes at 88 percent or higher) and too little on the overall clinical picture in making admission decisions.
Looking at the end of life, a cross-sectional study in JAMA Internal Medicine found that more than half of nursing home residents with advanced dementia were receiving at least one medication with questionable (or zero) benefit, at a cost of $816 per patient every 90 days, accounting for more than one-third of all prescription costs during the period of analysis. The most common inappropriate drugs were cholinesterase inhibitors for early Alzheimer's disease and cholesterol-lowering medications. These drugs don't improve the quality of life of patients with advanced dementia, but they can make patients feel worse through bothersome or serious side effects.
Monday, October 27, 2014
Tuesday, October 21, 2014
Curbing overuse is an important part of quality improvement
In the late 1990s, the urban public hospital where I trained as a medical student had a single CT scanner. To ensure that this precious resource was put to effective use, any physician ordering a non-emergent CT scan was required to personally present the patient's case to the on-call Radiology fellow and explain how the result of the scan would potentially change management. Since my attending surgeons were usually too busy to trudge down to the Radiology suite, they deputized their residents to do so, and most of the time my residents passed this thankless task down to the students. Thus, my classmates and I learned early on the difference between appropriate and inappropriate reasons for ordering CT scans.
Today, the widespread availability of CT scanners has made this sort of explicit rationing uncommon in the U.S. In fact, a 2011 editorial published in American Family Physician reviewed the accumulating evidence that CT scans are highly overused in medical practice, which puts patients at unnecessary risk of radiation-induced cancers and detection of incidental findings that can lead to overdiagnosis and overtreatment. Identifying overuse of CT scans often isn't easy, though. And some might argue that increasing use of CT scans may have the positive effect of improving diagnosis of common symptoms, allowing physicians to institute appropriate management of serious conditions more quickly.
Family physicians Andrew Coco and David O'Gurek investigated this possibility in a research study published in the Journal of the American Board of Family Medicine. They analyzed data on common chest symptom-related emergency department visits from the National Hospital Ambulatory Medical Care Survey from 1997 to 1999 and 2005 to 2007. Unsurprisingly, the proportion of these visits in which a CT scan was performed rose from 2.1% to 11.5% during this time period. However, the proportion of visits that resulted in a clinically significant diagnosis (pulmonary embolism, acute coronary syndrome or MI, heart failure, pneumonia, pleural effusion) actually fell slightly, challenging the notion that increased CT utilization leads to improved detection and treatment of serious health conditions.
In their editorial, Drs. Diana Miglioretti and Rebecca Smith-Bindman recommended that physicians and referring clinicians take several steps to reduce harms from CT scan overuse:
1. Use CT only when it is likely to enhance patient health or change clinical care.
2. When CT is necessary, apply the ALARA (as low as reasonably achievable) principle to radiation doses.
3. Inform patients of CT risks before imaging.
4. Monitor individual exposure over time and provide the information to patients.
These general points can and should be applied to many other medical interventions, including screening tests and treatments. To paraphrase: Never do anything to a patient unless you think it may help. When an intervention is necessary, intervene as little as possible. Always inform patients of the risks of any intervention, and monitor their exposure to its harmful effects over time so that they can choose to opt out later, if desired. Choose wisely!
**
This post first appeared on Common Sense Family Doctor on January 18, 2012.
Tuesday, October 14, 2014
Known and unknown: putting Ebola in perspective
At a recent morning huddle, I noticed that the hanging file of emergency protocols at my practice nurse's station held a new folder, labeled "Ebola." That same day, a patient who had returned from West Africa was isolated at a nearby hospital for symptoms consistent with infection with the virus. I had been following news about the Ebola epidemic for months, from its re-emergence in Guinea and rapid spread to Nigeria and other parts of West Africa through the critical illness and miraculous recovery of family physician Kent Brantly. But until that day, I hadn't actually confronted the question, "As a family physician, what do I need to know about this?"
Many have pointed out that even though this is by far the largest and most lethal Ebola outbreak in history, it pales in importance next to more common and contagious viruses such as influenza or measles, or emerging infections closer to home, such as the enterovirus respiratory illness that has stricken children in 46 states. Family physician blogger Mike Sevilla expressed skepticism that patients who continue to decline influenza vaccines in droves would be willing to receive a vaccine against Ebola even if it could be produced quickly, and given our abysmal track record with pandemic flu vaccination, I tend to agree.
What terrifies health professionals and laypersons about Ebola, despite its thus-far limited impact in the United States, is that so much about it is unknown. Clinicians are prepared to tackle influenza, a known quantity from past years. We don't know what to expect from Ebola, a nebulous threat that, like bioterrorism, could cause disaster at any time. Until more is known, family physicians should remember that fever in returning international travelers is far more likely to be due to malaria (which turned out to be the diagnosis of the hospitalized patient I mentioned earlier), and they should always ask and communicate about recent travel rather than depending on an electronic medical record to do it.
**
This post first appeared on the AFP Community Blog.
Tuesday, October 7, 2014
"An honest economic forecast would have likely sunk Medicare"
"Don't let dead cats stand on your porch." This famous quotation, attributed to President Lyndon Johnson during his strenuous and ultimately successful efforts to pass the 1965 bills that established the Medicare and Medicaid programs, embodied his approach to arguably the most important U.S. health care legislation until the 2010 Affordable Care Act. According to David Blumenthal and James Morone, authors of The Heart of Power, it meant that the best strategy for passing health care (and other potentially controversial) legislation was to act quickly and move bills along in the Congressional process before political opponents or outside advocacy groups had time to organize themselves.
Remarkably, Medicare was fully implemented only 11 months after the bill's signing, overcoming obstacles such as hospital segregation in the South, resistance from physician organizations such as the American Medical Association, and the logistical issues involved in issuing insurance cards to 18 million eligible seniors. As Medicare approaches its 50th anniversary, it faces huge budgetary challenges driven by the increasing costs of health care and the demographics of the enormous "Baby Boom" generation, the first members of whom became eligible for Medicare benefits in 2011. This short video produced by the Kaiser Family Foundation summarizes changes that occurred in the program in the intervening years.
Liberal legislators saw Medicare as the first step toward enacting federally-administered universal health insurance for all Americans, while others saw it as a program, like health programs for active-duty military, veterans, and Native Americans, whose benefits were appropriately limited to specific groups and therefore must be defended against encroachment by future wide-ranging health reforms. Princeton professor Paul Starr has called this resistance to change by protected groups the "policy trap" that contributed to the defeat of the Clinton health reform proposal in 1994 and the near-defeat of the Affordable Care Act 16 years later.
Is it true that "an honest economic forecast would have very likely sunk Medicare," as the authors argued? Like every federally financed health insurance initiative to come, Medicare ended up costing substantially more than initially projected. (In fact, the reason that most provisions of the ACA, passed in 2010, didn't take effect until this year was to allow the Congressional Budget Office - which didn't exist in 1965 - to artificially score it as deficit-reducing over a 10-year time period.) Ethical or not, Lyndon Johnson's decision to "lowball" the estimated costs of Medicare was essential to getting it through Congress.
Was President Johnson - the last President to previously hold the position of Senate Majority Leader - a political anomaly, or can lessons from his deft management of the Congressional process be applied to national health care policy today? What do you think about Blumenthal and Morone's lessons for future Presidents, listed below? How well has President Obama adhered to these principles?
1. Presidents must be deeply committed to health reforms.
2. Speed is essential. Waiting makes reforms a lot harder to win.
3. Presidents should concentrate on creating political momentum.
4. Presidents must actively manage the Congressional process.
5. Know when to compromise and know when to push.
6. Pass the credit.
7. Muzzle your economists. First expansion, then cost control.
**
An earlier version of this post first appeared on The Health Policy Exchange.
Wednesday, October 1, 2014
Lockboxes, Medicare reform, and the myth of "free stuff"
Not long after the 2012 elections, I had an interesting conversation with my dad about the policy debates involving the Medicare program. Since he and my mother are the two most important Medicare beneficiaries in my life, hearing his perspective was immensely valuable. Essentially, my dad said that what upset him when politicians described Medicare was the use of the term "entitlement," which implied that people like my parents, who paid Medicare taxes for several decades, don't deserve to reap the full benefits of that investment.
I pointed out that the reason Medicare is running out of money is that the dollar value of health benefits that seniors use today far exceeds the amount they paid into the system thirty, twenty, or even ten years ago, since Medicare only began to pay for prescription drugs in 2006 and annual increases in the cost of health care have far exceeded inflation. He countered that it was totally appropriate for retirees to get back more than they put in, since all good investors expect their money to grow over time. He's right. The problem with this argument isn't his fault: the federal government doesn't put revenue from Medicare payroll taxes into the stock market, a savings account, or even the "lockbox" that Al Gore made famous during the 2000 presidential campaign. It spends those dollars, immediately, often on programs that have nothing to do with health care for seniors.
As a nation, we can and should debate the best ways to keep Medicare solvent for my generation and my children's generation. The President and Congress could, for example, turn the program into one with fixed costs but not necessarily fixed benefits. They could agree to large increases in the payroll tax that funds the program, rather than continuing the temporary payroll tax holiday put into place to cushion families from the worst of the recession. They could cut Medicare payments to doctors by 30 percent, cross their fingers, and hope that at least a few of us would continue to see Medicare patients anyway. They could do some or all of these things at the same time.
What we citizens cannot do is allow them to continue to point fingers at each other and, for purely political reasons, dodge the question of what to do. Which brings me to one of my pet peeves about health reform in general, and the Affordable Care Act in particular: the selling of reforms as good because they provide people who already have health insurance with more "free stuff." Thanks to the ACA / Congressional Democrats / President Obama, a typical political ad will say, women can now get free mammograms, Pap smears, cholesterol tests, and birth control pills! Isn't that great? This kind of ad is misleading because none of the preventive health services defined by the bill suddenly became free. In fact, some cost hundreds or even thousands of dollars. Instead, the costs of these services have simply been shifted elsewhere: into higher insurance premiums, into higher deductibles for non-preventive services, onto employers, or onto the federal government (and therefore the individual taxpayer or an international investor that holds some portion of the U.S.'s $18 trillion national debt).
The above discussion notwithstanding, Medicine and Social Justice blogger Josh Freeman made the very good point that health should generally not be considered a commodity, but a social good. I supported most provisions of the Affordable Care Act because its implementation will eventually allow millions more Americans to more reliably access health care, especially primary care, when they need it. As a family physician, I do not believe that any group of people "deserves" health care any more than others. My dad and mom deserve their health care. But so do I, so do my wife and kids, and so do you and your loved ones. And our country will never have an honest debate about health reform as a social good and a shared sacrifice if we let politicians of both parties, only concerned about the next election, portray it as a false choice between rationing and free stuff.
**
This post first appeared on Common Sense Family Doctor on November 11, 2012.