Adventures in screening systematic reviews

I’m screening full-text systematic reviews for an umbrella review and I have seen some atrocious examples. If you follow me on xTwitter, you will have come across some of my rants (I also say hooray when authors get things right). In this post, I want to go through some of the common errors I’ve come across, to help others who are on the review-writing train.

  1. Searching in just one database
  2. Listing databases used
  3. The search strategy
  4. PRISMA
  5. Not consulting a librarian

Searching in just one database. A few reviews stated that only one database was searched. One had the audacity to state that no duplicates were found (of course there weren’t; with a single database there is nothing to deduplicate against). If only one database is used, the result is a literature review, NOT a systematic review.

Listing databases used. A common issue is listing platforms and publishers as databases. Another is not specifying which database was searched when a collection of databases was used. A review I looked at today listed Cochrane and Elsevier among its databases: Cochrane is an international organisation and Elsevier is the world’s largest STEM publisher. If you searched the Cochrane Library, specify whether it was the Cochrane Database of Systematic Reviews (CDSR) or CENTRAL. It is also important to state which platform was used to search, because databases are available on a variety of platforms and naming the platform improves reproducibility; for example, “Medline (Ovid)” or “CENTRAL (Cochrane Library, Wiley)”. A few reviews have listed Web of Science (WoS) as a database. WoS is a platform that hosts a number of databases and, depending on the subscription, can include Medline. One review stated they searched PubMed, Medline and WoS (which, depending on their institution’s subscription, probably meant Medline yet again). They searched Medline three times!

The search strategy. PubMed alone contains over 36 million citations. When the citations from all your database searching don’t even reach 500, something is seriously wrong with your search strategy. Other issues include very badly crafted search strategies (and I have seen some gawd-awful ones, let me tell you). When someone is doing a critical appraisal of a systematic review, one of the first questions is about the search strategy: is it robust enough to make it worthwhile continuing the appraisal? Many reviews have fallen at this first hurdle. Another strange issue is the number of people running the search strategies. One review stated that all four authors ran the search strategy in all databases. Why? They would all get the same results (hopefully)!! Only one person needs to run the searches and download all the citations.
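
An aside on duplicates: searching several databases will always pull in overlapping records, which is exactly why deduplication before screening is expected. Reference managers such as EndNote or Zotero will find duplicates for you, but purely as an illustration of the idea, here is a minimal Python sketch. It assumes a hypothetical citations.csv export with doi and title columns; the file name and columns are my own example, not taken from any of the reviews discussed.

```python
import csv


def normalise(value):
    """Lower-case and strip whitespace so trivially different records match."""
    return (value or "").strip().lower()


def deduplicate(rows):
    """Keep the first record seen for each DOI (or, failing that, each title)."""
    seen = set()
    unique = []
    for row in rows:
        key = normalise(row.get("doi")) or normalise(row.get("title"))
        if key and key in seen:
            continue  # duplicate pulled in by another database
        if key:
            seen.add(key)
        unique.append(row)
    return unique


if __name__ == "__main__":
    # citations.csv: one row per exported citation, with doi and title columns
    with open("citations.csv", newline="", encoding="utf-8") as f:
        records = list(csv.DictReader(f))
    unique = deduplicate(records)
    print(f"{len(records)} records exported, {len(unique)} after deduplication")
```

The point is simply that one person can export everything once and resolve the overlap in a single pass; four authors rerunning identical searches adds nothing.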

A mantra to keep in mind: the search strategy is the foundation of the systematic review. A bad foundation undermines the review.

PRISMA. PRISMA is a reporting guideline, as its full title makes clear: Preferred Reporting Items for Systematic Reviews and Meta-Analyses. It is NOT a handbook or a guide for writing/conducting the review. When referring to PRISMA, please state that the review was reported using/according to PRISMA, not conducted according to it.

Not consulting a librarian. All of these problems could have been avoided if the authors had included a librarian on the team. If your institution or organisation has librarians on staff, please consult them! Their involvement can range from reviewing your search strategy and giving advice from the sidelines right up to full authorship, including screening and commenting on drafts before journal submission. Do you want your review to pass the first test in critical appraisal? Get a librarian on board!!
