Editor's note: When we read Sharon Long and colleagues' retrospective on Massachusetts health reform at 10 years in the September issue of Health Affairs, we were reminded just what a busy year 2006 was for health policy writ large (in addition to wondering 'Has it been that long already?'). Now, a decade later, we think there's something to be gained from looking back on the impact and reach of some of the most significant policies implemented that year. With that in mind, Health Affairs Blog invited a handful of policy makers and researchers to reflect on some of these major milestones, share lessons learned, and discuss how our world has changed since then. Visit the Blog for more posts in this occasional series.
Among the significant milestones in our ongoing effort to vanquish disease caused by the human immunodeficiency virus (HIV) was the development of a blood test, in 1985, that enabled the reliable diagnosis of those who'd been infected with the virus. That's not to say that the test was universally acclaimed as a good thing. At that stage of our interaction with the HIV epidemic, the prognostic value of a positive test was uncertain, there were no effective treatments against the virus, and opportunities for discrimination against those who were infected, or perceived to be infected, abounded. In fact, in the early years following the licensure of the HIV test, some advocacy groups cautioned at-risk persons to avoid, or at least be wary of, taking the test, given its uncertainties and the potential for discrimination based on HIV antibody serostatus. Readers who didn't live through those days may find it hard to believe that there was more than one voice calling for widespread mandatory HIV testing and that even more extreme proponents dared to suggest isolation and quarantine measures for people who were found to have a positive HIV antibody test. Thankfully, mainstream public health leaders took the high road and made HIV testing services available on both a confidential and an "anonymous" basis, so that those who did not wish to provide their names could still learn whether they'd been infected.
Time passed, hard-won knowledge accrued, and legislation was enacted that provided stronger protections for those who were living with HIV disease. That's not to say that fear and discrimination were eliminated, but more powerful tools were made available to confront the irrational and harmful responses that often arose during those first, darkest years of the epidemic. The first treatment for HIV (AZT, or zidovudine) was licensed in 1987, six years after the Centers for Disease Control and Prevention's (CDC) initial case reports of AIDS (acquired immune deficiency syndrome). But it wasn't until the protease inhibitor saquinavir was licensed in late 1995 that clinicians finally had an effective combination of drugs that could durably reduce circulating levels of virus and thereby interrupt the relentless destruction of the immune system that, before then, had resulted in hundreds of thousands of infected persons progressing to AIDS. So potent was the effect of these new drug combinations that decreases in national HIV-related mortality rates were observed a scant year after their licensure.
Context Mattered
HIV testing technologies have also improved over time. Newer tests have featured ever-shortening "window periods" (i.e., the length of time between actual infection and the ability to detect viral antibodies) and samples that can be obtained by finger prick or oral fluid rather than the more intrusive venipuncture traditionally associated with medical testing. But scientific advances notwithstanding, arguably the greatest change between 1985, when HIV testing first became available, and 2006, when CDC issued new guidelines calling for routine HIV testing in health care settings, was one of context. Although in both eras a positive HIV diagnosis was cause for anxiety, confirming, as it did, the presence of a serious, lifelong illness, the context of HIV disease had markedly changed from that of a hopeless, fatal disease to that of a chronic condition that, if properly managed, could be associated with many years of productive life.
A major rationale for updating the recommendations for HIV testing in health care settings in 2006 was the accumulation of evidence that traditional testing approaches, which relied on a person's willingness to admit to "risk behaviors" and on clinicians' comfort discussing matters of sex and drug use with patients, were missing far too many individuals who were infected with HIV. At the time the guidelines were released in late September 2006, the CDC estimated that there might be as many as 312,000 persons in the U.S. infected with HIV and not yet diagnosed. Of equal concern was the fact that far too many people who were diagnosed with HIV (nearly 40 percent) developed AIDS within a year of their diagnosis, meaning that they'd been infected with the virus for several years without being aware of it, unknowingly putting their sexual partners at risk for infection and allowing the virus to inexorably erode their own immune systems.
Providing HIV testing to all adults seen in health care settings without having to elicit a history of risk behavior was, and is, an important step toward normalizing and de-stigmatizing HIV testing. But we should be clear that CDC's 2006 guidelines did not call for HIV testing without a patient's consent. Instead, they recommended that HIV testing in health care settings be conducted the same way that many other clinical laboratory tests are handled: the patient is notified, in advance, that testing will be performed and retains the right to decline it. And in 2013, when the U.S. Preventive Services Task Force (USPSTF) updated its previous HIV testing recommendations, granting an "A" rating (i.e., evidence supports a "high certainty" of substantial benefit) to the recommendation that clinicians screen all adolescents and adults aged 15 to 65 years for HIV infection, it added momentum to the push for routine HIV testing in all medical care settings.
Beyond Guidelines
A decade has passed since CDC's groundbreaking recommendations, which raises the question, "How are we doing?" According to a progress report on the National HIV/AIDS Strategy released by the White House Office of National AIDS Policy in July 2016, as of 2013 an estimated 87 percent of persons living with HIV knew their serostatus. That is movement forward, to be sure, but it still leaves a substantial minority of infected persons (13 percent) who had not been diagnosed. Also in July 2016, the Kaiser Family Foundation released data from 2013 and 2014 indicating that over half (56 percent) of a national sample of gay and bisexual men reported that their health care providers had never suggested that they be tested for HIV. And in a 2013 web-based survey of 137 primary care physicians practicing in community health centers in the Houston, Texas, area, 41 percent of respondents were unaware of CDC's updated 2006 HIV testing recommendations.
As all of us know, guidelines do not magically change practice. Instead, they bring together accumulated experience and existing evidence to make a compelling case for the "best" way to carry out a process or practice: in this case, how to implement routine HIV testing in health care settings so that persons who are infected are diagnosed in a timely manner and linked to care. Guidelines are the beginning of a journey toward optimal care for HIV disease, not the endpoint. Facilitating the uptake of guidelines into practice requires making adjustments to clinical education, educating and empowering consumers, and, sometimes, making changes to systems of health care delivery and reimbursement. When it comes to ensuring routine HIV testing in all health care settings in the United States, it's clear that we have more work to do. Perhaps Goethe's famous aphorism, adopted by the Institute of Medicine, best describes the current state of affairs: "Knowing is not enough; we must apply. Willing is not enough; we must do."
from Health Affairs Blog: http://ift.tt/2d0erUR