Major medical journals don't follow their own rules for reporting results from clinical trials

A study finds papers describing the results of clinical trials often fail to properly report outcomes.

By Jocelyn Kaiser, Feb. 15, 2019, 4:45 PM
It's a well-known problem with clinical trials: Researchers start out saying they will look for a particular outcome—heart attacks, for example—but then report something else when they publish their results. That practice can make a drug or treatment look safer or more effective than it actually is. Now, a systematic effort to find out whether major journals are complying with their own pledge to ensure that outcomes are reported correctly has found many are falling down on the job—and both journals and authors are full of excuses.

When journals and researchers were asked to correct studies, the responses "were fascinating, and alarming. Editors and researchers routinely misunderstand what correct trial reporting looks like," says project leader Ben Goldacre, an author and physician at the University of Oxford in the United Kingdom and a proponent of transparency in drug research.

Starting 4 years ago, his team's Centre for Evidence-Based Medicine Outcome Monitoring Project (COMPare) examined all trials published over 6 weeks in five journals: Annals of Internal Medicine, The BMJ, JAMA, The Lancet, and The New England Journal of Medicine (NEJM). The study topics ranged from the health effects of drinking alcohol for diabetics to a comparison of two kidney cancer drugs. All five journals have endorsed the long-established Consolidated Standards of Reporting Trials (CONSORT) guidelines. One CONSORT rule is that authors should describe the outcomes they plan to study before a trial starts and stick to that list when they publish the trial. But only nine of 67 trials published in the five journals reported outcomes correctly, the COMPare team reported on 14 February in the journal Trials. One-fourth didn't correctly report the primary outcome they set out to measure, and 45% didn't properly report all secondary outcomes; others added new outcomes. (This varied by journal: Only 44% of trials in Annals correctly reported the primary outcome, compared with 96% of NEJM trials.)

When the COMPare team wrote the journals about the problematic papers, only 23 of the 58 letters were published. Annals and The BMJ published all of them, The Lancet accepted 80%, and NEJM and JAMA rejected them all. NEJM explained that its editors and peer reviewers decide which outcomes will be reported. Although some of the CONSORT rules are "useful," they wrote, authors aren't required to comply. Other editors didn't seem to understand that trial researchers can switch outcomes if they disclose the change. JAMA and NEJM said they didn't always have space to publish all outcomes.

When trial authors responded to the letters that did make it into print, their comments were full of "inaccurate or problematic statements and misunderstandings," the COMPare team found in a companion paper in Trials. Like editors, many authors misunderstood the CONSORT rules, as well as the role of public registries for sharing a trial's plan. Some attacked the COMPare project as "outside the research community." Others brushed off the criticisms, grumbling about how difficult their work was. Still others denied that they had left out any outcomes, the authors state.

The COMPare team writes that it hopes journals will be inspired to better enforce CONSORT and revisit their standards for publishing letters.
"We hope that editors will respond positively, constructively, and thoughtfully to our findings," Goldacre says.
