virology blog

About viruses and viral disease

An open letter to Psychological Medicine about “recovery” and the PACE trial

13 March 2017 by Vincent Racaniello

Sir Robin Murray and Dr. Kenneth Kendler
Psychological Medicine
Cambridge University Press
University Printing House
Shaftesbury Road
Cambridge CB2 8BS
UK

Dear Sir Robin Murray and Dr. Kendler:

In 2013, Psychological Medicine published an article called “Recovery from chronic fatigue syndrome after treatments given in the PACE trial.”[1] In the paper, White et al. reported that graded exercise therapy (GET) and cognitive behavioural therapy (CBT) each led to recovery in 22% of patients, compared with only 7% in a comparison group. The two treatments, they concluded, offered patients “the best chance of recovery.”

PACE was the largest clinical trial ever conducted for chronic fatigue syndrome (also known as myalgic encephalomyelitis, or ME/CFS), with the first results published in The Lancet in 2011.[2] It was an open-label study with subjective primary outcomes, a design that requires strict vigilance to prevent the possibility of bias. Yet PACE suffered from major flaws that have raised serious concerns about the validity, reliability and integrity of the findings.[3] Despite these flaws, White et al.’s claims of recovery in Psychological Medicine have greatly impacted treatment, research, and public attitudes towards ME/CFS.

According to the protocol for the PACE trial, participants needed to meet specific benchmarks on four different measures in order to be defined as having achieved “recovery.”[4] But in Psychological Medicine, White et al. significantly relaxed each of the four required outcomes, making “recovery” far easier to achieve. No PACE oversight committees appear to have approved the redefinition of recovery; at least, no such approvals were mentioned. White et al. did not publish the results they would have obtained under the original protocol, nor did they include sensitivity analyses, the standard statistical method for assessing the impact of such changes.

Patients, advocates and some scientists quickly pointed out these and other problems. In October 2015, Virology Blog published an investigation of PACE, by David Tuller of the University of California, Berkeley, that confirmed the trial’s methodological lapses.[5] Since then, more than 12,000 patients and supporters have signed a petition calling for Psychological Medicine to retract the questionable recovery claims. Yet the journal has taken no steps to address the issues.

Last summer, Queen Mary University of London released anonymized PACE trial data under a tribunal order arising from a patient’s freedom-of-information request. In December, an independent research group used that newly released data to calculate the recovery results per the original methodology outlined in the protocol.[6] This reanalysis documented what was already clear: that the claims of recovery could not be taken at face value.

In the reanalysis, which appeared in the journal Fatigue: Biomedicine, Health & Behavior, Wilshire et al. reported that the PACE protocol’s definition of “recovery” yielded recovery rates of 7% or less for all arms of the trial. Moreover, in contrast to the findings reported in Psychological Medicine, the PACE interventions offered no statistically significant benefits. In conclusion, noted Wilshire et al., “the claim that patients can recover as a result of CBT and GET is not justified by the data, and is highly misleading to clinicians and patients considering these treatments.”

In short, the PACE trial had null results for recovery, according to the protocol definition selected by the authors themselves. Besides the inflated recovery results reported in Psychological Medicine, the study suffered from a host of other problems, including the following:

*In a paradox, the revised recovery thresholds for physical function and fatigue (two of the four recovery measures) were so lax that patients could deteriorate during the trial and yet be counted as “recovered” on these outcomes. In fact, 13% of participants met one or both of these recovery thresholds at baseline. White et al. did not disclose these salient facts in Psychological Medicine. We know of no other studies in the clinical trial literature in which recovery thresholds for an indicator actually represented worse health status than the entry thresholds for serious disability on the same indicator.

*During the trial, the authors published a newsletter for participants that included glowing testimonials from earlier participants about their positive outcomes in the trial.[7] An article in the same newsletter reported that a national clinical guidelines committee had already recommended CBT and GET as effective; the newsletter article did not mention adaptive pacing therapy, an intervention developed specifically for the PACE trial. The participant testimonials and the newsletter article could have biased the responses of an unknown number of the two hundred or more people still undergoing assessments—about a third of the total sample.

*The PACE protocol included a promise that the investigators would inform prospective participants of “any possible conflicts of interest.” Key PACE investigators have had longstanding relationships with major insurance companies, advising them on how to handle disability claims related to ME/CFS. However, the trial’s consent forms did not mention these self-evident conflicts of interest. It is irrelevant that insurance companies were not directly involved in the trial and insufficient that the investigators disclosed these links in their published research. Given this serious omission, the consent obtained from the 641 trial participants is of questionable legitimacy.

Such flaws are unacceptable in published research; they cannot be defended or explained away. The PACE investigators have nonetheless repeatedly tried to address these concerns. Yet their efforts to date, in journal correspondence, news articles, blog posts, and most recently in their response to Wilshire et al. in Fatigue,[8] have been incomplete and unconvincing.

The PACE trial compounded these errors by using a case definition for the illness that required only one symptom: six months of disabling, unexplained fatigue. A 2015 report from the U.S. National Institutes of Health recommended abandoning this single-symptom approach for identifying patients.[9] The NIH report concluded that this broad case definition generated heterogeneous samples of people with a variety of fatiguing illnesses, and that using it to study ME/CFS could “impair progress and cause harm.”

PACE included sub-group analyses of two alternate and more specific case definitions, but these case definitions were modified in ways that could have impacted the results. Moreover, an unknown number of prospective participants might have met these alternate criteria but been excluded from the study by the initial screening.

To protect patients from ineffective and possibly harmful treatments, White et al.’s recovery claims cannot stand in the literature. Therefore, we are asking Psychological Medicine to retract the paper immediately. Patients and clinicians deserve and expect accurate and unbiased information on which to base their treatment decisions. We urge you to take action without further delay.

Sincerely,

Dharam V. Ablashi, DVM, MS, Dip Bact
Scientific Director
HHV-6 Foundation
Former Senior Investigator
National Cancer Institute
National Institutes of Health
Bethesda, Maryland, USA

James N. Baraniuk, MD
Professor, Department of Medicine
Georgetown University
Washington, D.C., USA

Lisa F. Barcellos, MPH, PhD
Professor of Epidemiology
School of Public Health
California Institute for Quantitative Biosciences
University of California, Berkeley
Berkeley, California, USA

Lucinda Bateman, MD
Medical Director
Bateman Horne Center
Salt Lake City, Utah, USA

Alison C. Bested, MD, FRCPC
Clinical Associate Professor
Faculty of Medicine
University of British Columbia
Vancouver, British Columbia, Canada

Molly Brown, PhD
Assistant Professor
Department of Psychology
DePaul University
Chicago, Illinois, USA

John Chia, MD
Clinician and Researcher
EVMED Research
Lomita, California, USA

Todd E. Davenport, PT, DPT, MPH, OCS
Associate Professor
Department of Physical Therapy
University of the Pacific
Stockton, California, USA

Ronald W. Davis, PhD
Professor of Biochemistry and Genetics
Stanford University
Stanford, California, USA

Simon Duffy, PhD, FRSA
Director
Centre for Welfare Reform
Sheffield, UK

Jonathan C.W. Edwards, MD
Emeritus Professor of Medicine
University College London
London, UK

Derek Enlander, MD
New York, New York, USA

Meredyth Evans, PhD
Clinical Psychologist and Researcher
Chicago, Illinois, USA

Kenneth J. Friedman, PhD
Associate Professor of Physiology and Pharmacology (retired)
New Jersey Medical School
University of Medicine and Dentistry of New Jersey
Newark, New Jersey, USA

Robert F. Garry, PhD
Professor of Microbiology and Immunology
Tulane University School of Medicine
New Orleans, Louisiana, USA

Keith Geraghty, PhD
Honorary Research Fellow
Division of Population Health, Health Services Research & Primary Care
School of Health Sciences
University of Manchester
Manchester, UK

Ian Gibson, PhD
Former Member of Parliament for Norwich North
Former Dean, School of Biological Sciences
University of East Anglia
Honorary Senior Lecturer and Associate Tutor
Norwich Medical School
University of East Anglia
Norwich, UK

Rebecca Goldin, PhD
Professor of Mathematics
George Mason University
Fairfax, Virginia, USA

Ellen Goudsmit, PhD, FBPsS
Health Psychologist (retired)
Former Visiting Research Fellow
University of East London
London, UK

Maureen Hanson, PhD
Liberty Hyde Bailey Professor
Department of Molecular Biology and Genetics
Cornell University
Ithaca, New York, USA

Malcolm Hooper, PhD
Emeritus Professor of Medicinal Chemistry
University of Sunderland
Sunderland, UK

Leonard A. Jason, PhD
Professor of Psychology
DePaul University
Chicago, Illinois, USA

Michael W. Kahn, MD
Assistant Professor of Psychiatry
Harvard Medical School
Boston, Massachusetts, USA

Jon D. Kaiser, MD
Clinical Faculty
Department of Medicine
University of California, San Francisco
San Francisco, California, USA

David L. Kaufman, MD
Medical Director
Open Medicine Institute
Mountain View, California, USA

Betsy Keller, PhD
Department of Exercise and Sports Sciences
Ithaca College
Ithaca, New York, USA

Nancy Klimas, MD
Director, Institute for Neuro-Immune Medicine
Nova Southeastern University
Director, Miami VA Medical Center GWI and CFS/ME Program
Miami, Florida, USA

Andreas M. Kogelnik, MD, PhD
Director and Chief Executive Officer
Open Medicine Institute
Mountain View, California, USA

Eliana M. Lacerda, MD, MSc, PhD
Clinical Assistant Professor
Disability & Eye Health Group/Clinical Research Department
Faculty of Infectious and Tropical Diseases
London School of Hygiene & Tropical Medicine
London, UK

Charles W. Lapp, MD
Medical Director
Hunter-Hopkins Center
Charlotte, North Carolina, USA
Assistant Consulting Professor
Department of Community and Family Medicine
Duke University School of Medicine
Durham, North Carolina, USA

Bruce Levin, PhD
Professor of Biostatistics
Columbia University
New York, New York, USA

Alan R. Light, PhD
Professor of Anesthesiology
Professor of Neurobiology and Anatomy
University of Utah
Salt Lake City, Utah, USA

Vincent C. Lombardi, PhD
Director of Research
Nevada Center for Biomedical Research
Reno, Nevada, USA

Alex Lubet, PhD
Professor of Music
Head, Interdisciplinary Graduate Group in Disability Studies
Affiliate Faculty, Center for Bioethics
Affiliate Faculty, Center for Cognitive Sciences
University of Minnesota
Minneapolis, Minnesota, USA

Steven Lubet
Williams Memorial Professor of Law
Northwestern University Pritzker School of Law
Chicago, Illinois, USA

Sonya Marshall-Gradisnik, PhD
Professor of Immunology
Co-Director, National Centre for Neuroimmunology and Emerging Diseases
Griffith University
Queensland, Australia

Patrick E. McKnight, PhD
Professor of Psychology
George Mason University
Fairfax, Virginia, USA

Jose G. Montoya, MD, FACP, FIDSA
Professor of Medicine
Division of Infectious Diseases and Geographic Medicine
Stanford University School of Medicine
Stanford, California, USA

Zaher Nahle, PhD, MPA
Vice President for Research and Scientific Programs
Solve ME/CFS Initiative
Los Angeles, California, USA

Henrik Nielsen, MD
Specialist in Internal Medicine and Rheumatology
Copenhagen, Denmark

James M. Oleske, MD, MPH
François-Xavier Bagnoud Professor of Pediatrics
Senator of RBHS Research Centers, Bureaus, and Institutes
Director, Division of Pediatrics Allergy, Immunology & Infectious Diseases
Department of Pediatrics
Rutgers New Jersey Medical School
Newark, New Jersey, USA

Elisa Oltra, PhD
Professor of Molecular and Cellular Biology
Catholic University of Valencia School of Medicine
Valencia, Spain

Richard Podell, MD, MPH
Clinical Professor
Department of Family Medicine
Rutgers Robert Wood Johnson Medical School
New Brunswick, New Jersey, USA

Nicole Porter, PhD
Psychologist in Private Practice
Rolling Ground, Wisconsin, USA

Vincent R. Racaniello, PhD
Professor of Microbiology and Immunology
Columbia University
New York, New York, USA

Arthur L. Reingold, MD
Professor of Epidemiology
University of California, Berkeley
Berkeley, California, USA

Anders Rosén, MD
Professor of Inflammation and Tumor Biology
Department of Clinical and Experimental Medicine
Division of Cell Biology
Linköping University
Linköping, Sweden

Peter C. Rowe, MD
Professor of Pediatrics
Johns Hopkins University School of Medicine
Baltimore, Maryland, USA

William Satariano, PhD
Professor of Epidemiology and Community Health
University of California, Berkeley
Berkeley, California, USA

Ola Didrik Saugstad, MD, PhD, FRCPE
Professor of Pediatrics
University of Oslo
Director and Department Head
Department of Pediatric Research
University of Oslo and Oslo University Hospital
Oslo, Norway

Charles Shepherd, MB, BS
Honorary Medical Adviser to the ME Association
Buckingham, UK

Christopher R. Snell, PhD
Scientific Director
Workwell Foundation
Ripon, California, USA

Donald R. Staines, MBBS, MPH, FAFPHM, FAFOEM
Clinical Professor
Menzies Health Institute Queensland
Co-Director, National Centre for Neuroimmunology and Emerging Diseases
Griffith University
Queensland, Australia

Philip B. Stark, PhD
Professor of Statistics
University of California, Berkeley
Berkeley, California, USA

Eleanor Stein, MD, FRCP(C)
Psychiatrist in Private Practice
Assistant Clinical Professor
University of Calgary
Calgary, Alberta, Canada

Staci Stevens, MA
Founder, Exercise Physiologist
Workwell Foundation
Ripon, California, USA

Julian Stewart, MD, PhD
Professor of Pediatrics, Physiology and Medicine
Associate Chairman for Patient Oriented Research
Director, Center for Hypotension
New York Medical College
Hawthorne, New York, USA

Leonie Sugarman, PhD
Emeritus Associate Professor of Applied Psychology
University of Cumbria
Carlisle, UK

John Swartzberg, MD
Clinical Professor Emeritus
School of Public Health
University of California, Berkeley
Berkeley, California, USA

Ronald G. Tompkins, MD, ScD
Summer M Redstone Professor of Surgery
Harvard Medical School
Boston, Massachusetts, USA

David Tuller, DrPH
Lecturer in Public Health and Journalism
University of California, Berkeley
Berkeley, California, USA

Rosemary A. Underhill, MB, BS, MRCOG, FRCSE
Physician and Independent Researcher
Palm Coast, Florida, USA

Rosamund Vallings, MNZM, MB, BS
General Practitioner
Auckland, New Zealand

Michael VanElzakker, PhD
Research Fellow, Psychiatric Neuroscience Division
Harvard Medical School & Massachusetts General Hospital
Instructor, Tufts University Psychology
Boston, Massachusetts, USA

Mark VanNess, PhD
Professor of Health, Exercise & Sports Sciences
University of the Pacific
Stockton, California, USA
Workwell Foundation
Ripon, California, USA

Mark Vink, MD
Family Physician
Soerabaja Research Center
Amsterdam, Netherlands

Frans Visser, MD
Cardiologist
Stichting Cardiozorg
Hoofddorp, Netherlands

Tony Ward, MA (Hons), PhD, DipClinPsyc
Registered Clinical Psychologist
Professor of Clinical Psychology
School of Psychology
Victoria University of Wellington
Wellington, New Zealand
Adjunct Professor, School of Psychology
University of Birmingham
Birmingham, UK
Adjunct Professor, School of Psychology
University of Kent
Canterbury, UK

William Weir, FRCP
Infectious Disease Consultant
London, UK

John Whiting, MD
Specialist Physician
Private Practice
Brisbane, Australia

Carolyn Wilshire, PhD
Senior Lecturer
School of Psychology
Victoria University of Wellington
Wellington, New Zealand

Michael Zeineh, MD, PhD
Assistant Professor
Department of Radiology
Stanford University
Stanford, California, USA

Marcie Zinn, PhD
Research Consultant in Experimental Electrical Neuroimaging and Statistics
Center for Community Research
DePaul University
Chicago, Illinois, USA
Executive Director
Society for Neuroscience and Psychology in the Performing Arts
Dublin, California, USA

Mark Zinn, MM
Research Consultant in Experimental Electrophysiology
Center for Community Research
DePaul University
Chicago, Illinois, USA

 

ME/CFS Patient Organizations

25% ME Group
UK

Emerge Australia
Australia

European ME Alliance:

Belgium ME/CFS Association
Belgium

ME Foreningen
Denmark

Suomen CFS-Yhdistys
Finland

Fatigatio e.V.
Germany

Het Alternatief
Netherlands

Icelandic ME Association
Iceland

Irish ME Trust
Ireland

Associazione Malati di CFS
Italy

Norges ME-forening
Norway

Liga SFC
Spain

Riksföreningen för ME-patienter
Sweden

Verein ME/CFS Schweiz
Switzerland

Invest in ME Research
UK

Hope 4 ME & Fibro Northern Ireland
UK

Irish ME/CFS Association
Ireland

Massachusetts CFIDS/ME & FM Association
USA

ME Association
UK

ME/cvs Vereniging
Netherlands

National ME/FM Action Network
Canada

New Jersey ME/CFS Association
USA

Pandora Org
USA

Phoenix Rising
International membership representing many countries

Solve ME/CFS Initiative
USA

Tymes Trust (The Young ME Sufferers Trust)
UK

Wisconsin ME and CFS Association
USA

[1] White PD, Goldsmith K, Johnson AL, et al. 2013. Recovery from chronic fatigue syndrome after treatments given in the PACE trial. Psychological Medicine 43(10): 2227-2235.

[2] White PD, Goldsmith KA, Johnson AL, et al. 2011. Comparison of adaptive pacing therapy, cognitive behaviour therapy, graded exercise therapy, and specialist medical care for chronic fatigue syndrome (PACE): a randomised trial. The Lancet 377: 823–836.

[3] Racaniello V. 2016. An open letter to The Lancet, again. Virology Blog, 10 Feb. Available at: https://www.virology.ws/2016/02/10/open-letter-lancet-again/ (accessed on 2/24/17).

[4] White PD, Sharpe MC, Chalder T, et al. 2007. Protocol for the PACE trial: a randomised controlled trial of adaptive pacing, cognitive behaviour therapy, and graded exercise, as supplements to standardised specialist medical care versus standardised specialist medical care alone for patients with the chronic fatigue syndrome/myalgic encephalomyelitis or encephalopathy. BMC Neurology 7: 6.

[5] Tuller D. 2015. Trial by error: the troubling case of the PACE chronic fatigue syndrome trial. Virology Blog, 21-23 Oct. Available at: https://www.virology.ws/2015/10/21/trial-by-error-i/ (accessed on 2/24/17).

[6] Wilshire C, Kindlon T, Matthees A, McGrath S. 2016. Can patients with chronic fatigue syndrome really recover after graded exercise or cognitive behavioural therapy? A critical commentary and preliminary re-analysis of the PACE trial. Fatigue: Biomedicine, Health & Behavior; published online 14 Dec. Available at: http://www.tandfonline.com/doi/full/10.1080/21641846.2017.1259724 (accessed on 2/24/17).

[7] PACE Participants Newsletter. December 2008. Issue 3. Available at: http://www.wolfson.qmul.ac.uk/images/pdfs/participantsnewsletter3.pdf (accessed on 2/24/17).

[8] Sharpe M, Chalder T, Johnson AL, et al. 2017. Do more people recover from chronic fatigue syndrome with cognitive behaviour therapy or graded exercise therapy than with other treatments? Fatigue: Biomedicine, Health & Behavior; published online 15 Feb. Available at: http://www.tandfonline.com/doi/full/10.1080/21641846.2017.1288629 (accessed on 2/24/17).

[9] Green CR, Cowan P, Elk R. 2015. National Institutes of Health Pathways to Prevention Workshop: Advancing the research on myalgic encephalomyelitis/chronic fatigue syndrome. Annals of Internal Medicine 162: 860-865.


Trial By Error, Continued: A Follow-Up Post on FITNET-NHS

28 November 2016 by Vincent Racaniello

By David Tuller, DrPH

David Tuller is academic coordinator of the concurrent masters degree program in public health and journalism at the University of California, Berkeley.

Last week’s post on FITNET-NHS and Esther Crawley stirred up a lot of interest. I guess people get upset when researchers cite shoddy “evidence” from poorly designed trials to justify foisting psychological treatments on kids with a physiological disease. I wanted to post some additional bits and pieces related to the issue.

*****

I sent Dr. Crawley a link to last week’s post, offering her an opportunity to send her response to Dr. Racaniello for posting on Virology Blog, along with my response to her response. So far, Dr. Racaniello and I haven’t heard back—I doubt we will. Maybe she feels more comfortable misrepresenting facts in trial protocols and radio interviews than in addressing the legitimate concerns raised by patients and confronting the methodological flaws in her research. I hope Dr. Crawley knows she will always have a place on Virology Blog to present her perspective, should she choose to exercise that option. (Esther, are you reading this?)

*****

From reading the research of the CBT/GET/PACE crowd, I get the impression they are all in the habit of peer-reviewing and supporting each other’s work. I make that assumption because it is hard to imagine that independent scientists not affiliated with this group would overlook all the obvious problems that mar their studies—like outcome measures that represent worse health than entry criteria, as in the PACE trial itself. So it’s not surprising to learn that one of the three principal PACE investigators, psychiatrist Michael Sharpe, was on the committee that reviewed—and approved—Dr. Crawley’s one-million-pound FITNET-NHS study.

FITNET-NHS is being funded by the U.K.’s National Institute for Health Research. I have no idea what role, if any, Dr. Sharpe played in pushing through Dr. Crawley’s grant, but it likely didn’t hurt that the FITNET-NHS protocol cited PACE favorably while failing to point out that it has been rejected as fatally flawed by dozens of distinguished scientists and clinicians. Of course, the protocol also failed to point out that the reanalyses of the trial data have shown that the findings published by the PACE authors were much better than the results using the methods they promised in their protocol. (More on the reanalyses below.) And as I noted in my previous post, the FITNET-NHS protocol also misstated the NICE guidelines for chronic fatigue syndrome, making post-exertional malaise an optional symptom rather than a required component—thus conflating chronic fatigue and chronic fatigue syndrome, just as the PACE authors did by using the overly broad Oxford criteria.

The FITNET-NHS proposal also didn’t note some similarities between PACE and the Dutch FITNET trial on which it is based. Like the PACE team, the Dutch investigators relied on a post-hoc definition of “recovery.” The thresholds the FITNET investigators selected after they saw the results were quite lax, which certainly made it easier to find that participants had attained “recovery.” And as in the PACE trial, the Dutch participants in the comparison group ended up in the same place as the intervention group at long-term follow-up. Just as the CBT and GET in PACE offered no extended advantages, the same was true of the online CBT provided in FITNET.

And again like the PACE authors, the FITNET investigators downplayed these null findings in their follow-up paper. In a clinical trial, the primary results are supposed to be comparisons between the groups. Yet in the follow-up PACE and FITNET articles, both teams highlighted the “within-group” comparisons. That is, they treated the fact that there were no long-term differences between the groups as an afterthought and boasted instead that the intervention groups sustained the progress they initially made. That might be an interesting sub-finding, but to present “within-group” results as a clinical trial’s main outcome is highly disingenuous.
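A toy numerical sketch can make the within-group versus between-group distinction concrete. All numbers below are invented purely for illustration; they are not FITNET or PACE data:

```python
# Hypothetical follow-up scores on a 0-100 scale (higher = better),
# for a made-up trial whose participants all started near 40.
# Both groups improve a lot from baseline, so each "within-group"
# comparison looks impressive in isolation.
baseline = 40.0
intervention_followup = [62, 58, 65, 60, 61, 59]  # invented values
comparison_followup = [60, 59, 63, 61, 58, 62]    # invented values

def mean(xs):
    return sum(xs) / len(xs)

within_intervention = mean(intervention_followup) - baseline
within_comparison = mean(comparison_followup) - baseline
# The trial-relevant contrast is between the two arms at follow-up:
between_groups = mean(intervention_followup) - mean(comparison_followup)

print(f"within-group gain (intervention): {within_intervention:+.1f}")
print(f"within-group gain (comparison):   {within_comparison:+.1f}")
print(f"between-group difference:         {between_groups:+.1f}")
```

Both arms gain roughly 20 points from baseline, yet the between-arm difference is a fraction of a point. Leading with the within-group gains while burying the near-zero between-group contrast is exactly the rhetorical move described above.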

*****

As part of her media blitz for the FITNET-NHS launch, Dr. Crawley was interviewed on a BBC radio program by a colleague, Dr. Phil Hammond. In this interview, she made some statements that demonstrate one of two things: Either she doesn’t know what she’s talking about and her misrepresentations are genuine mistakes, or she’s lying. So either she’s incompetent, or she lacks integrity. Not a great choice.

Let’s parse what she said about the fact that, at long-term follow-up, there were no apparent differences between the intervention and the comparison groups in the Dutch FITNET study. Here’s her comment:

“Oh, people have really made a mistake on this,” said Dr. Crawley. “So, in the FITNET Trial, they were offered FITNET or usual care for six months, and then if they didn’t make a recovery in the usual care, they were offered FITNET again, and they were then followed up at 2 to 3 years, so of course what happened is that a lot of the children who were in the original control arm, then got FITNET as well, so it’s not surprising that at 2 or 3 years, the results were similar.”

This is simply not an accurate description. As Dr. Crawley must know, some of the Dutch FITNET participants in the “usual care” comparison group went on to receive FITNET, and others didn’t. Both sets of usual care participants—not just those who received FITNET—caught up to the original FITNET group. For Dr. Crawley to suggest that the reason the others caught up was that they received FITNET is, perhaps, an unfortunate mistake. Or else it’s a deliberate untruth.

*****

Another example from the BBC radio interview: Dr. Crawley’s inaccurate description of the two reanalyses of the raw trial data from the PACE study. Here’s what she said:

“First of all they did a reanalysis of recovery based on what the authors originally said they were going to do, and that reanalysis done by the authors is entirely consistent with their original results. [Actually, Dr. Crawley is mistaken here; the PACE authors did a reanalysis of “improvement,” not of “recovery”]…Then the people that did the reanalysis did it again, using a different definition of recovery, that was much much harder to reach–and the trial just wasn’t big enough to show a difference, and they didn’t show a difference. [Here, Dr. Crawley is talking about the reanalysis done by patients and academic statisticians.] Now, you know, you can pick and choose how you redefine recovery, and that’s all very important research, but the message from the PACE Trial is not contested; the message is, if you want to get better, you’re much more likely to get better if you get specialist treatment.”

This statement is at serious odds with the facts. Let’s recap: In reporting their findings in The Lancet in 2011, the PACE authors presented “improvement” results for the two primary outcomes of fatigue and physical function. They reported that about 60 percent of participants in the CBT and GET arms reached the selected thresholds for “improvement” on both measures. In a 2013 paper in the journal Psychological Medicine, they presented “recovery” results based on a composite “recovery” definition that included the two primary outcomes and two additional measures. In this paper, they reported “recovery” rates for the favored intervention groups of 22 percent.

Using the raw trial data that the court ordered them to release earlier this year, the PACE authors themselves reanalyzed the Lancet improvement findings, based on their own initial, more stringent definition of “improvement” in the protocol. In this analysis, the authors reported that only about 20 percent “improved” on both measures, using the methods for assessing “improvement” outlined in the protocol. In other words, only a third as many “improved,” according to the authors’ own original definition, compared to the 60 percent they reported in The Lancet. Moreover, in the reanalysis, ten percent “improved” in the comparison group, meaning that CBT and GET led to “improvements” in only one in ten participants—a pretty sad result for a five-million-pound trial.

However, because these meager findings were statistically significant, the PACE authors and their followers have, amazingly, trumpeted them as supporting their initial claims. In reality, the new “improvement” findings demonstrate that any “benefits” offered by CBT and GET are marginal. It is preposterous and insulting to proclaim, as the PACE authors and Dr. Crawley have, that this represents confirmation of the results reported in The Lancet. Dr. Crawley’s statement that “the message from the PACE trial is not contested” is of course nonsense. The PACE “message” has been exposed as bullshit—and everyone knows it.
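To see how “statistically significant but marginal” can both be true, here is a rough back-of-envelope check. The arm size of 160 is an approximation (PACE randomised 641 participants across four arms), and the 20%/10% figures are the rounded protocol-based improvement rates discussed above:

```python
import math

# Approximate two-proportion z-test for the protocol-based
# "improvement" rates: ~20% with CBT/GET vs ~10% in the comparison
# arm. Arm sizes of 160 are an approximation, not the exact counts.
n1, n2 = 160, 160
x1 = round(0.20 * n1)  # "improvers" under CBT/GET (protocol definition)
x2 = round(0.10 * n2)  # "improvers" in the comparison arm

p1, p2 = x1 / n1, x2 / n2
pooled = (x1 + x2) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value

print(f"z = {z:.2f}, two-sided p = {p_value:.3f}")
```

Under these assumptions z comes out around 2.5, clearing the conventional p < 0.05 bar, yet the absolute difference is just ten percentage points: roughly one additional “improver” for every ten patients treated.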

The PACE authors did not present their own reanalysis of the “recovery” findings—probably because those turned out to be null, as was shown in a reanalysis of that data by patients and academic statisticians, published on Virology Blog. That reanalysis found single-digit “recovery” rates for all the study arms, and no statistically significant differences between the groups. Dr. Crawley declared in the radio interview that this reanalysis used “a different definition of recovery, that was much much harder to reach.” And she acknowledged that the reanalysis “didn’t show a difference”—but she blamed this on the fact that the PACE trial wasn’t big enough, even though it was the largest trial ever of treatments for ME/CFS.

This reasoning is specious. Dr. Crawley is ignoring the central point: The “recovery” reanalysis was based on the authors’ own protocol definition of “recovery,” not some arbitrarily harsh criteria created by outside agitators opposed to the trial. The PACE authors themselves had an obligation to provide the findings they promised in their protocol; after all, that’s the basis on which they received funding and ethical permission to proceed with the trial.

It is certainly understandable why they, and Dr. Crawley, prefer the manipulated and false “recovery” data published in Psychological Medicine. But deciding post-hoc to use weaker outcome measures and then refusing to provide your original results is not science. That’s data manipulation. And if this outcome-switching is done with the intent to hide poor results in favor of better ones, it is considered scientific misconduct.

*****

I also want to say a few words about the leaflet promoting FITNET-NHS. The leaflet states that most patients “recover” with “specialist treatment” and less than ten percent “recover” from standard care. Then it announces that this “specialist treatment” is available through the trial—implicitly promising that most of those who get the therapy will be cured.

This is problematic for a host of reasons. As I pointed out in my previous post, any claims that the Dutch FITNET trial, the basis for Dr. Crawley’s study, led to “recovery” must be presented with great caution and caveats. Instead, the leaflet presents such “recovery” as an uncontested fact. Also, the whole point of clinical trials is to find out if treatments work—in this case, whether the online CBT approach is effective, as well as cost-effective. But the leaflet is essentially announcing the result–“recovery”—before the trial even starts. If Dr. Crawley is so sure that this treatment is effective in leading to “recovery,” why is she doing the trial in the first place? And if she’s not sure what the results will be, why is she promising “recovery”?

Finally, as has been pointed out many times, the PACE investigators, Dr. Crawley and their Dutch colleagues all appear to believe that they can claim “recovery” based solely on subjective measures. Certainly any definition of “recovery” should require that participants can perform physically at their pre-sickness level. However, the Dutch researchers refused to release the one set of data (how much participants moved, as assessed by ankle monitors called actometers) that could have shown whether the kids in FITNET had “recovered” on an objective measure of physical performance. The refusal to publish this data is telling, and leaves room for only one interpretation: The Dutch data showed that participants did no better than before the trial, or perhaps even worse, on this measure of physical movement.

This FITNET-NHS leaflet should be withdrawn because of its deceptive approach to promoting the chances of “recovery” in Dr. Crawley’s study. I hope the advertising regulators in the U.K. take a look at this leaflet and assess whether it accurately represents the facts.

*****

As long as we’re talking about the Dutch members of the CBT/GET ideological movement, let’s also look briefly at another piece of flawed research from that group. Like the PACE authors and Dr. Crawley, these investigators have found ways to mix up those with chronic fatigue and those with chronic fatigue syndrome. A case in point is a 2001 study that has been cited in systematic reviews as evidence for the effectiveness of CBT in this patient population. (Dr. Bleijenberg, a co-investigator on the FITNET-NHS trial, was also a co-author of this study.)

In this 2001 study, published in The Lancet (of course!), the Dutch researchers described their case definition for identifying participants like this: “Patients were eligible for the study if they met the US Centers for Disease Control and Prevention criteria for CFS, with the exception of the criterion requiring four of eight additional symptoms to be present.”

This statement is incoherent. (Why do I need to keep using words like “incoherent” and “preposterous” when describing this body of research?) The CDC definition has two main components: 1) six months of unexplained fatigue, and 2) four of eight other symptoms. If you abandon the second component, you can no longer refer to this as meeting the CDC definition. All you’re left with is the requirement that participants have suffered from six months of fatigue.

And that, of course, is the case definition known as the Oxford criteria, developed by PACE investigator Michael Sharpe in the 1990s. And as last year’s seminal report from the U.S. National Institutes of Health suggested, this case definition is so broad that it scoops up many people with fatiguing illnesses who do not have the disease known as ME/CFS. According to the NIH report, the Oxford criteria can “impair progress and cause harm,” and should therefore be “retired” from use. The reason is that any results could not accurately be extrapolated to people with ME/CFS specifically. This is especially so for treatments, such as CBT and GET, that are likely to be effective for many people suffering from other fatiguing illnesses.

In short, to cite any findings from such studies as evidence for treatments for ME/CFS is unscientific and completely unjustified. The 2001 Dutch study might be an excellent look at the use of CBT for chronic fatigue*. But like FITNET-NHS, it is not a legitimate study of people with chronic fatigue syndrome, and the Dutch Health Council should acknowledge this fact in its current deliberations about the illness.

*In the original phrasing, I referred to the intervention mistakenly as ‘online CBT.’

Filed Under: Commentary, Information Tagged With: chronic fatigue syndrome, clinical trial, cognitive behavioral therapy, FITNET-NHS, graded exercise therapy, mecfs, PACE

Trial By Error, Continued: The Real Data

22 September 2016 by Vincent Racaniello

by David Tuller, DrPH

David Tuller is academic coordinator of the concurrent masters degree program in public health and journalism at the University of California, Berkeley.

‘The PACE trial is a fraud.’ Ever since Virology Blog posted my 14,000-word investigation of the PACE trial last October, I’ve wanted to write that sentence. (I should point out that Dr. Racaniello has already called the PACE trial a “sham,” and I’ve already referred to it as “doggie-poo.” I’m not sure that “fraud” is any worse. Whatever word you use, the trial stinks.)

Let me be clear: I don’t mean “fraud” in the legal sense (I’m not a lawyer) but in the sense that it’s a deceptive and morally bankrupt piece of research. The investigators made dramatic changes from the methodology they outlined in their protocol, which allowed them to report purported “results” that were much, much better than those they would have been able to claim under their originally planned methods. Then they reported only the better-looking “results,” with no sensitivity analyses to assess the impact of the changes—the standard statistical approach in such circumstances.

This is simply not allowed in science. It means the reported benefits for cognitive behavior therapy and graded exercise therapy were largely illusory, an artifact of the huge shifts in outcome assessments the authors introduced mid-trial. (That’s putting aside all the other flaws, like juicing up responses with a mid-trial newsletter promoting the interventions under investigation, failing to obtain legitimate informed consent from the participants, etc.)

That PACE suffered from serious methodological deficiencies should have been obvious to anyone who read the studies. That includes the reviewers for The Lancet, which published the PACE results for “improvement” in 2011 after what editor Richard Horton has called “endless rounds of peer-review,” and the journal Psychological Medicine, which published results for “recovery” in 2013. Certainly the deficiencies should have been obvious to anyone who read the trenchant letters and commentaries that patients routinely published in response to the egregious errors committed by the PACE team. Even so, the entire U.K. medical, academic and public health establishments refused to acknowledge what was right before their eyes, finding it easier instead to brand patients as unstable, anti-science, and possibly dangerous.

Thanks to the efforts of the incredible Alem Matthees, a patient in Perth, Australia, the U.K.’s First-tier Tribunal last month ordered the liberation of the PACE trial data he’d requested under a freedom-of-information request. (The brief he wrote for the April hearing, outlining the case against PACE in great detail, was a masterpiece.) Instead of appealing, Queen Mary University of London, the home institution of lead PACE investigator Peter White, made the right decision. On Friday, September 9, the university announced its intention to comply with the tribunal ruling, and sent the data file to Mr. Matthees. The university has a short window of time before it has to release the data publicly.

I’m guessing that QMUL forced the PACE team’s hand by refusing to allow an appeal of the tribunal decision. I doubt that Dr. White and his colleagues would ever have given up their data willingly, especially now that I’ve seen the actual results. Perhaps administrators had finally tired of the PACE shenanigans, recognized that the study was not worth defending, and understood that continuing to fight would further harm QMUL’s reputation. It must be clear to the university now that its own reputational interests diverge sharply from those of Dr. White and the PACE team. I predict that the split will become more apparent as the trial’s reputation and credibility crumble; I don’t expect QMUL spokespeople to be out there vigorously defending the unacceptable conduct of the PACE investigators.

Last weekend, several smart, savvy patients helped Mr. Matthees analyze the newly available data, in collaboration with two well-known academic statisticians, Bruce Levin from Columbia and Philip Stark from Berkeley. Yesterday, Virology Blog published the group’s findings of the single-digit, non-statistically significant “recovery” rates the trial would have been able to report had the investigators adhered to the methods they outlined in the protocol. That’s a remarkable drop from the original Psychological Medicine paper, which claimed that 22 percent of those in the favored intervention groups achieved “recovery,” compared to seven percent for the non-therapy group.

Now it’s clear: The PACE authors themselves are the anti-science faction. They tortured their data and ended up producing sexier results. Then they claimed they couldn’t share their data because of alleged worries about patient confidentiality and sociopathic anti-PACE vigilantes. The court dismissed these arguments as baseless, in scathing terms. (It should be noted that their ethical concerns for patients did not extend to complying with a critical promise they made in their protocol—to tell prospective participants about “any possible conflicts of interest” in obtaining informed consent. Given this omission, they have no legitimate informed consent for any of their 641 participants and therefore should not be allowed to publish any of their data at all.)

The day before QMUL released the imprisoned data to Mr. Matthees, the PACE authors themselves posted a pre-emptive re-analysis of results for the two primary outcomes of physical function and fatigue, according to the protocol methods. In the Lancet paper, they had revised and weakened their own definition of what constituted “improvement.” With this revised definition, they could report in The Lancet that approximately 60 percent in the cognitive behavior and graded exercise therapy arms “improved” to a clinically significant degree on both fatigue and physical function.

The re-analysis the PACE authors posted last week sought to put the best possible face on the very poor data they were required to release. Yet patients examining the new numbers quickly noted that, under the more stringent definition of “improvement” outlined in the protocol, only about 20 percent in the two groups could be called “overall improvers.” Solely by introducing a more relaxed definition of “improvement,” the PACE team—enabled by The Lancet’s negligence and an apparently inadequate “endless” review process—was able to triple the trial’s reported success rate.
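The arithmetic behind that kind of inflation is worth making concrete. The sketch below uses entirely synthetic scores (not the PACE dataset; the distribution and cutoffs are invented for illustration) to show how relaxing a threshold on the very same data multiplies the proportion counted as “improved”:

```python
# Toy illustration: how relaxing an outcome threshold inflates a "success" rate.
# All numbers here are synthetic and hypothetical; this is not the PACE data.
import random

random.seed(42)

# Hypothetical post-treatment physical-function scores on a 0-100 scale.
scores = [min(100, max(0, random.gauss(60, 15))) for _ in range(300)]

STRICT_CUTOFF = 72   # the kind of threshold a protocol might pre-specify
RELAXED_CUTOFF = 56  # a weaker threshold adopted after the trial began

strict_rate = sum(s >= STRICT_CUTOFF for s in scores) / len(scores)
relaxed_rate = sum(s >= RELAXED_CUTOFF for s in scores) / len(scores)

# Same participants, same scores: only the definition of success changed,
# yet the "improvement" rate comes out several times higher.
print(f"strict definition:  {strict_rate:.0%}")
print(f"relaxed definition: {relaxed_rate:.0%}")
```

Nothing about the participants changes between those two printed rates; only the definition of success does. That is exactly why a sensitivity analysis comparing the original and revised definitions is the standard expectation when outcome measures are changed mid-trial.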

So now it’s time to ask what happens to the papers already published. The editors have made their feelings clear. I have written multiple e-mails to Lancet editor Richard Horton since I first contacted him about my PACE investigation, almost a year before it ran. He never responded until September 9, the day QMUL liberated the PACE data. Given that the PACE authors’ own analysis showed that the new data showed significantly less impressive results than those published in The Lancet, I sent Dr. Horton a short e-mail asking when we could expect some sort of addendum or correction to the 2011 paper. He responded curtly: “Mr. Tuller–We have no such plans.”

The editors of Psychological Medicine are Kenneth Kendler of Virginia Commonwealth University and Robin Murray of King’s College London. After I wrote to the journal last December, pointing out the problems, I received the following from Dr. Murray, whose home base is KCL’s Department of Psychosis Studies: “Obviously the best way of addressing the truth or otherwise of the findings is to attempt to replicate them. I would therefore like to encourage you to initiate an attempted replication of the study. This would be the best way for you to contribute to the debate…Should you do this, then Psychological Medicine will be most interested in the findings either positive or negative.”

This was not an appropriate response. I told Dr. Murray it was “disgraceful,” given that the paper was so obviously flawed. This week, I wrote again to Dr. Murray and Dr. Kendler, asking if they now planned to deal with the paper’s problems, given the re-analysis by Matthees et al. In response, Dr. Murray suggested that I submit a re-analysis, based on the released data, and Psychological Medicine would be happy to consider it. “We would, of course, send it out to referees for scientific scrutiny in the same manner as we did for the original paper,” he wrote.

I explained that it was his and the journal’s responsibility to address the problems, whether or not anyone submitted a re-analysis. I also noted that I could not improve on the Matthees re-analysis, which completely rebutted the results reported in Psychological Medicine’s paper. I urged Dr. Murray to contact either Dr. Racaniello or Mr. Matthees to discuss republishing it, if he truly wished to contribute to the debate. Finally, I noted that the peer-reviewers for the original paper had okayed a study in which participants could be disabled and recovered simultaneously, so I wasn’t sure if the journal’s assessment process could be trusted.

(By the way, King’s College London, where Dr. Murray is based, is also the home institution of PACE investigator Trudie Chalder as well as Simon Wessely, a close colleague of the PACE authors and president of the Royal College of Psychiatrists*. That could explain Dr. Murray’s inability or reluctance to acknowledge that the “recovery” paper his journal peer-reviewed and published is meaningless.)

Earlier today, the PACE authors posted a blog on The BMJ site, their latest effort to salvage their damaged reputations. They make no mention of their massive research errors and focus only on their supposed fears that releasing even anonymous data will frighten away future research participants. They have provided no evidence to back up this unfounded claim, and the tribunal flatly rejected it. They also state that only researchers who present “pre-specified” analysis plans should be able to obtain trial data. This is laughable, since Dr. White and his colleagues abandoned their own pre-specified analyses in favor of analyses they decided they preferred much later on, long after the trial started.

They have continued to refer to their reported analyses, deceptively, as “pre-specified,” even though these methods were revised mid-trial. The following point has been stated many times before, but bears repeating: In an open label trial like PACE, researchers are likely to know very well what the outcome trends are before they review any actual data. So the PACE team’s claim that the changes they made were “pre-specified” because they were made before reviewing outcome data is specious. I have tried to ask them about this issue multiple times, and have never received an answer.

Dr. White, his colleagues, and their defenders don’t yet seem to grasp that the intellectual construct they invented and came to believe in—the PACE paradigm or the PACE enterprise or the PACE cult, take your pick—is in a state of collapse. They are used to saying whatever they want about patients—Internet Abuse! Knife-wielding! Death threats!!—and having it be believed. In responding to legitimate concerns and questions, they have covered up their abuse of the scientific process by providing non-answers, evasions and misrepresentations—the academic publishing equivalent of “the dog ate my homework.” Amazingly, journal editors, health officials, reporters and others have accepted these non-responsive responses as reasonable and sufficient. I do not.

Now their work is finally being scrutinized the way it should have been by peer reviewers before this damaging research was ever published in the first place. The fallout is not going to be pretty. If nothing else, they have provided a great gift to academia with their $8 million** disaster—for years to come, graduate students in the U.S., the U.K. and elsewhere will be dissecting PACE as a classic case study of bad research and mass delusion.

*Correction: The original version of the post mistakenly called the organization the Royal Society of Psychiatrists.

**Correction: The original version of the post stated that PACE cost $8 million. In fact, PACE cost five million pounds, so the cost in dollars depends on the exchange rate used. The $8 million figure is based on the exchange rate from last October, when Virology Blog published my PACE investigation. But the pound has fallen since the Brexit vote in June, so the cost in dollars at the current exchange rate is closer to $6.4 million.
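The currency arithmetic behind that correction is straightforward. The sketch below uses approximate exchange rates (assumed values for illustration, not official figures) to show how a £5 million cost lands near $8 million at the October 2015 rate and near $6.4 million after the pound’s post-Brexit fall:

```python
# Rough currency conversion for the trial-cost figures discussed above.
# Exchange rates are approximate assumptions for illustration only.
COST_GBP = 5_000_000

RATE_OCT_2015 = 1.53     # approx. USD per GBP, October 2015
RATE_POST_BREXIT = 1.28  # approx. USD per GBP after the June 2016 vote

usd_then = COST_GBP * RATE_OCT_2015     # about $7.65 million, i.e. "roughly $8 million"
usd_now = COST_GBP * RATE_POST_BREXIT   # about $6.4 million

print(f"At the October 2015 rate: ${usd_then / 1e6:.1f} million")
print(f"At the post-Brexit rate:  ${usd_now / 1e6:.1f} million")
```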

Filed Under: Commentary, Information Tagged With: chronic fatigue syndrome, clinical trial, Freedom of Information, GET, mecfs, myalgic encephalomyelitis, PACE

Zika Virus in the USA

8 August 2016 by Vincent Racaniello

On this episode of Virus Watch we cover three Zika virus stories: the first human trial of a Zika virus vaccine, the first local transmission of infection in the United States, and whether the virus is a threat to participants in the 2016 Summer Olympic and Paralympic Games.

Filed Under: Virus Watch Tagged With: 2016 Olympics, clinical trial, dna vaccine, Miami, microcephaly, phase I trial, Rio de Janeiro, viral, virology, virus, viruses, zika virus

TWiV 397: Trial by error

10 July 2016 by Vincent Racaniello

Journalism professor David Tuller returns to TWiV for a discussion of the PACE trial for ME/CFS: the many flaws in the trial, why its conclusions are useless, and why the data must be released and re-examined.

You can find TWiV #397 at microbe.tv/twiv, or listen below.

[powerpress url=”http://traffic.libsyn.com/twiv/TWiV397.mp3″]

Click arrow to play
Download TWiV 397 (67 MB .mp3, 93 min)
Subscribe (free): iTunes, RSS, email, Google Play Music

Become a patron of TWiV!

Filed Under: This Week in Virology Tagged With: adaptive pacing therapy, CFS, chronic fatigue syndrome, clinical trial, cognitive behavior therapy, graded exercise therapy, mecfs, myalgic encephalomyelitis, PACE trial

A request for data from the PACE trial

17 December 2015 by Vincent Racaniello

Mr. Paul Smallcombe
Records & Information Compliance Manager
Queen Mary University of London
Mile End Road
London E1 4NS

Dear Mr Smallcombe:

The PACE study of treatments for ME/CFS has been the source of much controversy since the first results were published in The Lancet in 2011. (Full title: “Comparison of adaptive pacing therapy, cognitive behaviour therapy, graded exercise therapy, and specialist medical care for chronic fatigue syndrome: a randomised trial.”) Patients have repeatedly raised objections to the study’s methodology and results.

Recently, journalist and public health expert David Tuller documented that the trial suffered from many serious flaws that raise concerns about the validity and accuracy of the reported results. We cited some of these flaws in an open letter to The Lancet that urged the journal to conduct a fully independent review of the trial. (Dr. Tuller did not sign the open letter, but he is joining us in requesting the trial data.)

These flaws include, but are not limited to: major mid-trial changes in the primary outcomes that were not accompanied by the necessary sensitivity analyses; thresholds for “recovery” on the primary outcomes that indicated worse health than the study’s own entry criteria; publication of positive testimonials about trial outcomes and promotion of the therapies being investigated in a newsletter for participants; rejection of the study’s objective outcomes as irrelevant after they failed to support the claims of recovery; and the failure to inform participants about investigators’ significant conflicts of interest, and in particular financial ties to the insurance industry, contrary to the trial protocol’s promise to adhere to the Declaration of Helsinki, which mandates such disclosures.

Although the open letter was sent to The Lancet in mid-November, editor Richard Horton has not yet responded to our request for an independent review. We are therefore requesting that Queen Mary University of London provide some of the raw trial data, fully anonymized, under the provisions of the U.K.’s Freedom of Information law.

In particular, we would like the raw data for all four arms of the trial for the following measures: the two primary outcomes of physical function and fatigue (both bimodal and Likert-style scoring), and the multiple criteria for “recovery” as defined in the protocol published in 2007 in BMC Neurology, not as defined in the 2013 paper published in Psychological Medicine. The anonymized, individual-level data for “recovery” should be linked across the four criteria so it is possible to determine how many people achieved “recovery” according to the protocol definition.

We are aware that previous requests for PACE-related data have been rejected as “vexatious.” This includes a recent request from psychologist James Coyne, a well-regarded researcher, for data related to a subsequent study about economic aspects of the illness published in PLoS One—a decision that represents a violation of the PLoS policies on data-sharing.

Our request clearly serves the public interest, given the methodological issues outlined above, and we do not believe any exemptions apply. We can assure Queen Mary University of London that the request is not “vexatious,” as defined in the Freedom of Information law, nor is it meant to harass. Our motive is easy to explain: We are extremely concerned that the PACE studies have made claims of success and “recovery” that appear to go beyond the evidence produced in the trial. We are seeking the trial data based solely on our desire to get at the truth of the matter.

We appreciate your prompt attention to this request.

Sincerely,

Ronald W. Davis, PhD
Professor of Biochemistry and Genetics
Stanford University

Bruce Levin, PhD
Professor of Biostatistics
Columbia University

Vincent R. Racaniello, PhD
Professor of Microbiology and Immunology
Columbia University

David Tuller, DrPH
Lecturer in Public Health and Journalism
University of California, Berkeley

Filed Under: Information Tagged With: chronic fatigue syndrome, clinical trial, Declaration of Helsinki, Freedom of Information, mecfs, myalgic encephalomyelitis, PACE, Queen Mary University of London, Richard Horton, The Lancet, trial data request, UK

