Background: results of systematic reviews (SRs) are often presented at conferences. Examinations of conference abstracts of primary studies have repeatedly shown that information presented in abstracts is often poorly reported and/or unreliable. These issues have been examined far less for abstracts describing SRs.
Objectives: we conducted an investigation to:
1) quantify the discordance between results presented in SR abstracts at five recent World Congresses on Pain (WCP; a premier pain-related research and clinical conference) and their corresponding full-length publications; and
2) determine the reporting quality of those abstracts.
Methods: we screened abstracts of the five most recent WCPs held biennially from 2008 to 2016 to identify abstracts describing SRs. Two investigators independently searched for corresponding full publications using PubMed and Google Scholar in April 2018. We extracted data about study aims, main outcome, methods, and conclusions from abstracts and compared them with their corresponding publications. We evaluated the reporting quality of abstracts against the individual items of the PRISMA for Abstracts (PRISMA-A) reporting guideline.
Results: we included 143 conference abstracts describing SRs. Of those, 90 abstracts (63%) were published in full in a peer-reviewed journal by April 2018, with a median time of five months (interquartile range (IQR) −0.25 to 14 months) from conference presentation to full publication. Among the 90 abstract-publication pairs, we found some form of discordance in 38 pairs (42%). Qualitative discordance (i.e. different directions of treatment effect in the abstract and the publication) was found in 15 pairs (17%). Reporting quality in abstracts was suboptimal; the median item-specific adherence across all PRISMA-A items for all conference abstracts was 33% (IQR 29% to 42%), with the lowest adherence for items pertaining to registration and funding (1% and 8%, respectively) and the highest adherence for the item pertaining to interpretation of the review’s results (96%). In abstracts of full-length publications, the item-specific adherence across all PRISMA-A items was very similar to that of the conference abstracts (median = 33%; IQR 25% to 42%).
Conclusions: conference abstracts describing SRs in the field of pain are often not reliable sources of information, and their reporting quality is suboptimal. Almost 40% of SR abstracts had not been published in full in a journal up to 10 years after conference presentation. Interventions for improving adherence to SR abstract reporting guidelines are warranted. When there is discordance between data in the abstract and the publication, journals and SR authors should also be transparent about the reasons for the discordance (e.g. data in the abstract were preliminary).
Patient or healthcare consumer involvement: we did not involve patients or healthcare consumers in the conduct of this study, but many SRs are developed with their active involvement.