Reexamined Clinical Trials Can Point to New Conclusions

TUESDAY, Sept. 9, 2014 (HealthDay News) — Clinical trials are an invaluable tool in modern medicine, enabling scientists to gauge the safety and effectiveness of new drugs and medical devices.

But there’s a flaw in the current system, a new study argues. Researchers conducting a clinical trial usually keep the raw data to themselves, preventing independent outsiders from double-checking their findings.

As many as one-third of randomized clinical trials could be reanalyzed in ways that change conclusions about how new or existing drugs should be used to treat patients. So say Stanford University researchers in a Sept. 9 report in the Journal of the American Medical Association.

Interpretation of clinical trial data can have serious implications in the hospital, clinic or pharmacy, said study senior author Dr. John Ioannidis, director of the Stanford Prevention Research Center.

“Randomized trials are at the hub of making decisions about whether these are drugs or devices we want to use on real patients in real life,” Ioannidis said.

Sharing of raw clinical trial data is so rare that his team could find only 37 published reanalyses out of thousands of papers representing more than three decades of research.

When they reviewed the results of those reanalyses, the researchers found 35 percent of the follow-up reports came to different conclusions from those of the original trial.

In all, 13 of the 37 reanalyses reached different conclusions: nine found that more patients could be treated, one found that fewer patients should be treated, and three concluded that different criteria should be used to determine which patients should be treated.

For example, a clinical trial evaluating the treatment of enlarged, bleeding veins in the esophagus concluded that injecting the veins with chemicals to induce blood clots could save lives, even though the therapy didn’t prevent future bleeding.

A second look using a different statistical model of risk came to the opposite determination, finding the treatment prevented future bleeding but didn’t reduce deaths. The new conclusion suggested the treatment would be best for patients more likely to resume bleeding, rather than those at highest risk of death from the condition.

Most of the time, the follow-up researchers came to different conclusions in their reanalyses because they used different statistical methods or applied new medical knowledge to reinterpret the old clinical trial data, Ioannidis said.

One study cited in the report came to a different conclusion regarding the recommended use of an anemia medication, because the follow-up researchers applied updated therapeutic standards to the data.

“There are new insights and different opinions if you have a different group interpret the data, and that represents the degree to which people can analyze data differently,” said Dr. Eric Peterson, a cardiologist at Duke University Medical Center, who wrote an editorial accompanying the Stanford study.

In rarer cases, the original research team made a mistake. “There were clear errors in the original analysis, and correcting these errors could change the results and alter the conclusions,” Ioannidis said.

Unconscious bias also can affect the way clinical trial data is analyzed and reanalyzed, said George Prendergast, CEO of the Lankenau Institute for Medical Research in Wynnewood, Pa., and editor-in-chief of Cancer Research.

“These are not biases that result from fraud. These are biases that result from being human,” Prendergast said, noting scientists often make choices based on their prior education or experience. “The brain is a pattern-generating tool, and it tends to want to generate patterns that come from previous experience.”

Up to now, raw clinical trial data has been closely held by the original researchers, usually because the trial has been funded by companies that have poured tens of millions of dollars into the effort, Prendergast said.

However, there appears to be an industry trend toward making the raw data available to outside investigators, he added.

“There’s a realization that if criticisms are handled earlier, in the long run it saves time, energy, conflict and cost,” Prendergast said.

While it may be difficult to release the raw data from already completed clinical trials, Ioannidis said data from all future trials should be made public.

“If data are not to be made available, researchers should have to make an argument, and it has to be a very convincing argument,” he said. “And I cannot think of any strong arguments.”

More information

For more on clinical trials, visit the U.S. National Institutes of Health.