On April 22, researchers at Northwell Health, a major hospital system in the New York area, reported a stunning 88% death rate among Covid-19 patients on ventilators. The number was published in the prestigious Journal of the American Medical Association and made headlines across the country.

Two days later, the journal issued a clarification amid a flurry of harsh criticism from scientists on Twitter. The abstract replaced the frightening 88% figure with a far more reassuring metric: Only 24.5% of coronavirus patients on ventilators at Northwell Health had died so far, the new version said. Yet none of the underlying mortality data in the study had changed.

What happened? And which number is right? As it turns out, both are. The higher percentage is based on a tiny number of patients who either died or recovered within days. The lower percentage adjusts the death figures to include everyone on ventilators who was still alive and battling the virus at the time the study ended. Many of them may not survive, but some may.
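The arithmetic behind the two figures comes down to the denominator. A minimal sketch, using illustrative counts chosen only to reproduce the article's 88% and 24.5% figures (not quoted from the study itself), shows how the same deaths can yield both rates:

```python
# Illustrative counts only -- not the study's reported data.
died = 282            # ventilated patients who had died by the study's end
discharged = 38       # ventilated patients discharged alive
still_on_vent = 831   # ventilated patients still hospitalized, outcome unknown

# "Completed outcomes only": denominator excludes everyone still being treated.
rate_completed = died / (died + discharged)

# "All ventilated patients": denominator includes those still battling the virus.
rate_all = died / (died + discharged + still_on_vent)

print(f"Among completed outcomes: {rate_completed:.1%}")   # ~88%
print(f"Among all ventilated patients: {rate_all:.1%}")    # ~24.5%
```

Neither calculation is wrong; they simply answer different questions about the same patients.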

“The numbers in the article were accurate,” says Karina Davidson, senior vice president of research at Northwell Health, referring to the original mortality calculations. “There was so much misinterpretation” of the 88% mortality rate that Northwell opted for clarification.

The Northwell experience demonstrates how early research released in the middle of a health crisis can result in more confusion than clarification. In the push for medical answers, even top researchers and journals are having trouble getting the balance right. Their usually cautious and codified world of scientific research has suddenly morphed into a dizzying race to the finish line in the age of Covid-19. Medical studies are being pumped out faster than ever before. And public officials are struggling to interpret this gusher of data to make quick decisions that could affect the health and lives of billions of people around the world.

Drug Study
The Northwell Health study is one of the latest in a dramatic streak of confusing reports. Just days after JAMA clarified Northwell’s ventilator death rate, drug maker Gilead Sciences Inc. issued a press release stoking excitement that it was nearing a breakthrough with remdesivir, an experimental drug seen as holding great promise as an early Covid-19 treatment. Hours after the company announced preliminary test results, Anthony Fauci, the National Institute of Allergy and Infectious Diseases director, touted a U.S. government study at a White House event, saying the drug had met its targets.

Fauci’s comments buoyed a stock-market rally propelled earlier by Gilead’s press release. But there was just one problem: On the very same day, The Lancet, a prestigious journal, published results from a small Chinese study that showed very different and less-promising results.

“It’s not just a single trial that’s going to hold the truth here,” said John Norrie, a professor of medical statistics and trial methodology at the University of Edinburgh. “The bottom line is, you can’t rely on press releases,” he said, since they often contain only a fraction of the information found in a full study.

Signs of the medical-data frenzy are everywhere. In recent weeks, numerous studies screening for antibodies — a sign of past illness — found widely varying rates of infection in the population. The studies have been criticized by other scientists for their methodologies, including placing too much confidence in tests that may produce false positives.
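The false-positive concern is easy to see with a back-of-the-envelope calculation. The sketch below uses purely illustrative test characteristics, not figures from any specific antibody study, to show how an imperfect test can more than double the apparent infection rate when true prevalence is low:

```python
# Illustrative numbers only -- not from any specific antibody study.
def apparent_positive_rate(true_prevalence: float,
                           sensitivity: float,
                           specificity: float) -> float:
    """Fraction of people who would test positive, given test characteristics."""
    true_positives = true_prevalence * sensitivity
    false_positives = (1 - true_prevalence) * (1 - specificity)
    return true_positives + false_positives

# If only 1% of the population was actually infected and specificity is 98.5%,
# false positives alone add ~1.5 percentage points to the measured rate.
print(apparent_positive_rate(true_prevalence=0.01,
                             sensitivity=0.90,
                             specificity=0.985))  # ~0.024, i.e. 2.4% apparent
```

In other words, a seemingly small error rate in the test can dominate the result when the thing being measured is rare.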

Stanford University researchers, who conducted one of the studies, were also questioned by doctors and scientists for relying on Facebook to recruit volunteers, which may have skewed the findings by attracting people who had previously had symptoms. On April 30, the authors posted an updated paper that laid out the study’s limitations more clearly, included more data on control samples, and presented a single, more conservative estimate of infection prevalence rather than a range.
