The Stratification Tool for Opioid Risk Mitigation, which prioritizes review of patients receiving opioids based on their risk for overdose, accident, or suicide, is one of four VA programs being rigorously evaluated with the help of VA researchers. (Photo: ©iStock/DNY59)

VA aims to improve effectiveness of programs through randomized evaluations

September 13, 2018

By Mike Richman
VA Research Communications

In 2016, a bill aimed at making the U.S. government more effective and efficient became law. The Evidence-Based Policymaking Commission Act was enacted to improve the ability of researchers, evaluators, and statisticians to use information collected by the government to make important policy decisions.

The legislation directed the Office of Management and Budget (OMB) to ensure that government agencies become more rigorous in how they evaluate existing programs. In the case of VA, OMB is providing more than $10 million annually to create, in part, a funding mechanism for randomized program evaluations (RPEs) of the care Veterans receive and the efficiency with which it's delivered. The RPEs would be carried out in much the same way a randomized clinical trial is used to determine which medical treatments work best.

VA has allocated some of those dollars to support its first RPEs: the agency's Health Services Research & Development service (HSR&D) is funding randomized evaluations of four VA programs. VA's Quality Enhancement Research Initiative (QUERI), part of HSR&D, funds the Partnered Evidence-based Policy Resource Center (PEPReC), the resource center that is consulting on the design of the evaluations. PEPReC is devoted to methods that facilitate evidence-based policy development and practice change across VA.

A VA-funded paper whose lead author was Dr. Austin Frakt, a health economist at the VA Boston Healthcare System, reported key lessons learned in the design of the four randomized program evaluations. He and his team hope that conveying the challenges they faced will help other VA researchers who may want to conduct similar work.

"We were engaged in a systematic effort to design and initiate four large randomized program evaluations across VA."

PEPReC investigators conducted the evaluations, with help from researchers elsewhere in VA. The RPEs cover a home- and community-based care program for Veterans at risk of nursing home placement; an opioid risk mitigation program; an initiative for suicide risk assessment and prevention; and a teledermatology access program.

In developing the RPEs, Frakt and his team tried to overcome a host of issues that often surface in designing randomized controlled trials.

They explained that the successful development of randomized evaluations requires more than attention to traditional design issues, such as ensuring the availability of data. Success also calls for stable financing, a commitment to adhering to a randomized roll-out, a degree of acceptance from staff, and the ability to manage situations when officials show opposition to how the evaluation is being carried out.

'It's challenging even to do one'

Some of the findings in the paper were not at all surprising. Case in point: The researchers noted that successful program implementation and rigorous evaluation require resources, specialized expertise, and careful planning. If the health care system's ability to innovate and grow is to be sustained, organizations will need dedicated programs to prioritize resources and continuously adapt evaluation designs, the researchers indicated.

"We were engaged in a systematic effort to design and initiate four large randomized program evaluations across VA," says Frakt, who is the director of PEPReC and an associate professor at Boston University. "It's a challenging undertaking even to do one. The research community in general doesn't have an opportunity to do these kinds of studies very often. Randomization is hard. We thought that conveying some of the challenges we encountered and some of what we overcame would be useful for the research community."

He adds: "We don't want people to have to reinvent the wheel. If we can suggest ways they can avoid some problems and delays and therefore make their program come off more efficiently, that would be a good thing."

These four VA programs were selected for randomized evaluation. Three of the four evaluations were designed with a randomized arm and a control arm:

  • Veteran-Directed Home and Community Based Services (VD-HCBS), a program that aims to reduce nursing home admissions for Veterans with functional and cognitive limitations (see sidebar).
  • The Stratification Tool for Opioid Risk Mitigation (STORM), which prioritizes review of patients receiving opioids based on their risk for overdose, accident, or suicide.
  • Recovery Engagement and Coordination for Health Veterans Enhanced Treatment (REACH VET), which uses predictive modeling and medical record data to identify Veterans at highest risk for suicide. (A simplified sketch of this kind of risk ranking appears after this list.)
  • VA Telederm, which features two mobile apps that are supposed to improve access for VA patients to dermatology care, particularly Veterans in rural areas. (This program is on hold due to VA's transition to a new electronic medical records system. VA is adopting the health records system used by the Department of Defense, to allow for seamless care between the agencies.)
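Both STORM and REACH VET rest on the same underlying idea: train a statistical model on medical record data, score each patient, and work through the caseload from highest predicted risk down. The Python sketch below illustrates that idea in miniature. It is only a hedged illustration: the synthetic features, the logistic model, and every variable name are hypothetical stand-ins, not the actual VA models, which draw on far richer clinical data.

    # A hedged sketch of predictive risk ranking. The data, features, and
    # model below are hypothetical; the real STORM and REACH VET models are
    # built from far richer VA medical-record data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Synthetic training data: one row per patient, one column per
    # record-derived feature (e.g., prior diagnoses, medication history).
    X_train = rng.normal(size=(1000, 5))
    true_coefs = np.array([0.8, 0.5, 0.0, -0.3, 0.2])   # invented effect sizes
    y_train = rng.binomial(1, 1 / (1 + np.exp(-X_train @ true_coefs)))

    model = LogisticRegression().fit(X_train, y_train)

    # Score the current caseload and surface the highest-risk patients first.
    X_patients = rng.normal(size=(200, 5))
    risk = model.predict_proba(X_patients)[:, 1]
    review_order = np.argsort(risk)[::-1]   # highest predicted risk first
    print("First 10 patients to review:", review_order[:10].tolist())

The point is the ranking step: rather than sampling charts at random, clinical reviewers start at the top of the list, where the model predicts the greatest risk.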

In the paper, Frakt and his team listed examples of lessons learned in the design stage of the randomized evaluations, including these two:

  • Regarding insufficient buy-in at all levels, the researchers note that few programs are implemented solely by organizational leaders. Therefore, even with strong support from the top, a program can fail to be implemented as designed without buy-in and cooperation from subordinates. "Unfortunately, it is not easy to tell in advance when boots-on-the-ground staff are not invested in a randomized design," they write. "But this can be overcome." In addition, "Many programmatic decisions are made at the local level—not the national level," one of Frakt's research collaborators said in the paper. "As a result, building a network of engaged local leaders is a critical step to ensuring the development and initiation of the program and the measurement of the implementation, as well."
  • The roll-out of the program can get ahead of the randomized design, especially in a staggered roll-out. Some sites may want to begin before their assigned start date, threatening the validity of the study's findings. Communication is key to preventing such problems, according to a researcher cited in Frakt's paper. "Though accommodating a [randomized controlled trial or randomized program evaluation] takes time, it can ultimately help defend a program that works or support withdrawing resources from one that does not," the researchers write. "Communicating that should include advocacy up the chain in slowing roll-out to accommodate rigorous evaluation."

VA a model for doing randomized program evaluations

QUERI Director Dr. Amy Kilbourne says the information produced by the evaluations is key to determining if a program works and how to get the most out of it.

"It's a very efficient way of doing pragmatic research and to get a question answered in the time frame required by many of our operations leaders, who work at a very fast pace," Kilbourne says. "The randomized program evaluation mechanism also solves another problem: Things sometimes get rolled out quickly without planning or adequate attention to the details of how you design a program to not only make sure it's implementable and can be used in routine practice, or that it's a policy that can be applied in VA practices around the country, but also that the program can be sustained. We always come up with these ideas for programs or policies. They sound good. But we never know if they actually work. The only way to really determine that is to randomize them."

VA is one of the first health care systems, public or private, to systematically adopt a funding mechanism for RPEs in response to the evidence-based policymaking agenda that OMB has been promoting, Kilbourne notes.

"It's been a government-wide priority through OMB for at least 20 years," she says. "And VA has been the only health care system and probably one of the only government agencies with a national mechanism to enable researchers to do randomized program evaluations to ultimately help VA improve its functioning and organization."

She says the "closest cousin" to VA in this regard is the NIH Health Care Systems Research Collaboratory. In part, the endeavor supports the design and rapid execution of pragmatic clinical trial demonstration projects that address questions of major public health importance and engage health care systems in research partnerships. These projects help to establish best practices and provide proof of concept for innovative designs in clinical research.

"But NIH doesn't have its own embedded health care system like VA," Kilbourne says. "So we're pretty much the model of the health care system by doing these randomized program evaluations. That's why it's so crucial to have embedded internal-funding programs like HSR&D and QUERI to do scientifically supported and scientifically reviewed randomized program evaluations that address top priorities in the VA system."

VA to continue soliciting program evaluation ideas

The randomized evaluations featured in Frakt's study were chosen by HSR&D and QUERI leadership, after HSR&D put out a national call for topics. VA national program offices submitted the names of more than 35 programs, four of which were selected based on rigorous criteria that QUERI developed with PEPReC. The criteria included whether a randomized program evaluation would be feasible, given the nature of the program; whether the program was a high-priority VA topic; and whether the program officials provided a clear statement of how they'd apply the results.

"Would they tweak the program, would they design a better one, would they implement it nationally, things like that," Kilbourne says.

Frakt's paper has limitations. Most notably, the investigators could assess, and try to overcome, only the challenges that arose in the design phase of the four randomized evaluations. "It's possible that other challenges will arise in subsequent phases of implementation and evaluation that we cannot anticipate at this stage," they write.

According to Kilbourne, VA program officials are starting to implement the findings from the randomized program evaluations for the STORM, REACH VET, and VD-HCBS programs. She says it's too early to cite any of the results from the implementation stage, noting that a few other randomized evaluations of VA programs are in the process of starting.

"Now that VA program offices are aware of this opportunity," she says, "we opened the door for any VA investigator to apply if he or she has an operational partner, a program office lead that's interested in doing a randomized evaluation. We just finished our last cycle of reviewing them. We're going to continue soliciting for innovative randomized program evaluation ideas through this funding mechanism."

The Veteran Directed Home and Community Based Services program, recently evaluated with the help of VA investigators, provides Veterans with opportunities to self-direct their support system so they can continue to live at home. (Photo for illustrative purposes only: ©iStock/FredFroese)

A decade ago, the U.S. Department of Health and Human Services, through its agency the Administration for Community Living, began a partnership with VA to serve Veterans of all ages who may be at risk of nursing home placement.

VA and the Administration for Community Living (ACL) launched the Veteran Directed Home and Community Based Services (VD-HCBS) program, which provides Veterans with opportunities to self-direct their support system so they can continue to live at home.

The VD-HCBS program is one of the four VA initiatives highlighted in a VA-funded paper that reported lessons learned in the design of four randomized program evaluations. In the case of VD-HCBS, the researchers determined that every VA medical center that didn't offer the program as of March 2017 would receive it as part of the evaluation. That means 77 VA sites would be included in the evaluation.

To obtain VD-HCBS and control data from every site, while still allowing each site to receive the program, the investigators decided to randomize when sites would begin referring patients to the program, rather than assign sites to provide VD-HCBS or not. They designed a plan in which 14 sites would be randomized to begin referring patients to VD-HCBS every 6 to 10 months. The rest would continue offering their standard method of care to Veterans who may need referral to a nursing home.

Staggering the start date is a way of preventing bias, says Dr. Melissa Garrido, the associate director of VA's Partnered Evidence-based Policy Resource Center, which designed and initiated the evaluation of the VD-HCBS program. She explains that staggering, as a form of randomization, helps investigators ensure that any effects they observe are due to VD-HCBS, rather than to unrelated circumstances occurring at the same time the program is implemented. Those circumstances could involve a policy change, for example, or another program for Veterans who may need nursing homes.
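For readers who want the mechanics, here is a minimal sketch of how such a staggered roll-out might be randomized. The site names, wave size, and schedule are hypothetical stand-ins patterned on the design described above; as the researchers note below, the actual VD-HCBS procedure could randomize only the sites that were ready in each wave.

    # A hedged sketch of a staggered roll-out assignment. The 77 site names,
    # the wave size of 14, and the schedule are stand-ins patterned on the
    # design described in the article, not PEPReC's actual procedure.
    import random

    random.seed(42)

    sites = [f"site_{i:02d}" for i in range(77)]   # hypothetical VA sites
    random.shuffle(sites)                          # random roll-out order

    WAVE_SIZE = 14
    waves = [sites[i:i + WAVE_SIZE] for i in range(0, len(sites), WAVE_SIZE)]

    # Each wave begins 6 to 10 months after the previous one. Until its wave
    # starts, a site continues standard care and contributes comparison data;
    # afterward, it refers patients to VD-HCBS and contributes program data.
    for n, wave in enumerate(waves, start=1):
        print(f"Wave {n}: {len(wave)} sites begin referring patients")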

The VA-funded paper cited challenges that the investigators faced in designing the four randomized program evaluations. Regarding VD-HCBS, they could not determine a roll-out schedule for each of the 77 sites up front. Randomization could occur only for sites that had buy-in from on-site officials and funding to hire a local coordinator to oversee the program, and that had gone through a VHA-ACL interagency preparation process lasting several months. In other words, sites would be randomized in the order in which they were ready to provide the program.

"The initiation and completion of this process were outside our control," the researchers write. "For these reasons, randomization was to occur every 6-10 months and limited to sites that were ready for the intervention. With a small number of sites that were available for randomization in each wave, we were concerned that simple randomization (by a coin flip, for instance) may lead to imbalance across study arms."

'Rigorous evidence' needed to determine effectiveness of program

The VD-HCBS program calls, in part, for giving Veterans a monthly budget to pay for the services or support they need to stay in their home. That may involve hiring a caregiver or home health aide, or installing grab bars or making other modifications to the home to improve accessibility and help avoid injury. Standard care could mean referral to a medical foster home, where a caretaker oversees the Veteran's daily needs.

Based on preliminary evidence, the VD-HCBS program should improve Veteran and caregiver satisfaction and reduce nursing home admissions, according to Garrido. She says the evidence is from VA sites that have implemented the program, not including those in the group of 77.

"However, we need rigorous evidence of whether VD-HCBS improves care for Veterans who are at risk of nursing home placement," says Garrido, who is also a health services researcher at VA Boston. "We're trying to obtain evidence to see whether the program reduces nursing home admissions that are unwanted, whether it reduces emergency department visits or inpatient care, and whether it increases caregiver and Veteran satisfaction. We have preliminary evidence suggesting that this program will do these kinds of things, but this is the first time we've tried to study it in such a rigorous, staggered, randomized manner."

--- Mike Richman






