Category Archives: Systematic Reviews

Adventures in screening systematic reviews

I’m screening full-text systematic reviews for an umbrella review and I have seen some atrocious examples. If you follow me on X/Twitter, you will have come across some of my rants (I also say hooray when authors get things right). In this post, I want to review some of the common errors I’ve come across, to help others who are on the review-writing train.

  1. Searching in just one database
  2. Listing databases used
  3. The search strategy
  4. PRISMA
  5. Not consulting a librarian

Searching in just one database. A few reviews stated that only one database was searched. One had the audacity to state that no duplicates were found. If only one database is used, this makes it a literature review, NOT a systematic review.

Listing databases used. A common issue is listing platforms and publishers as databases; another is not specifying which database was searched when a collection of databases was used. A review I looked at today listed Cochrane and Elsevier in its list of databases. Cochrane is an international organisation and Elsevier is the world’s largest STEM publisher. When you want to indicate that you searched databases in the Cochrane Library, specify whether it was the Cochrane Database of Systematic Reviews (CDSR) or CENTRAL. It is also important to state which platform was used to search: databases are available on a variety of different platforms, and knowing which platform was used increases reproducibility. A few others have listed Web of Science (WoS) as a database. WoS is a platform that contains many databases, including Medline. One review stated they searched PubMed, Medline and WoS (depending on their institution’s subscription, they probably searched Medline via WoS). They searched Medline three times!
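For illustration, a database list that names both database and platform might read like this (the databases and platform pairings are common real-world combinations, but the list itself is an invented example, not taken from any particular review):

Medline ALL (Ovid)
Embase (Ovid)
Cochrane Central Register of Controlled Trials (CENTRAL; Cochrane Library, Wiley)
CINAHL (EBSCOhost)

Each line tells the reader exactly which database was searched and via which platform, which is what reproducibility requires.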

The search strategy. Medline alone contains over 36 million citations. When the citations from all of your database searching don’t even reach 500, something is seriously wrong with your search strategy. Other issues include very badly crafted search strategies (and I have seen some gawd-awful ones, let me tell you). When someone is doing a critical appraisal of a systematic review, one of the first questions is about the search strategy: is it robust enough to make it worthwhile continuing the appraisal? Many reviews have fallen at this first hurdle. Another strange issue is the number of people running the search strategies. One review stated that all four authors ran the search strategy in all databases. Why? They would all get the same results (hopefully)!! Only one person needs to run the searches and download all the citations.
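To illustrate the shape of a reasonable strategy (the topic and line numbers here are invented for the example), a Medline (Ovid) search pairs subject headings with free-text synonyms for each concept, then combines the concepts:

1 exp Hypertension/
2 (hypertens* or "high blood pressure").ti,ab.
3 1 or 2
4 exp Exercise/
5 (exercis* or "physical activity").ti,ab.
6 4 or 5
7 3 and 6

A strategy built from one or two free-text words with no subject headings is often the culprit behind implausibly small result sets.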

A mantra to keep in mind: the search strategy is the foundation of the systematic review. A bad foundation undermines the review.

PRISMA. PRISMA is a reporting guideline, as per its full title: Preferred Reporting Items for Systematic Reviews and Meta-Analyses. It is NOT a handbook or guideline for writing/conducting the review. When referring to PRISMA, please state that the review was reported using/according to PRISMA.

Not consulting a librarian. All of these problems could have been avoided if the authors included a librarian on the team. If your institution or organisation has librarians on staff, please consult them! They can be on the sidelines reviewing your strategy and providing advice, right up to full authorship with screening and commenting on draft versions before journal submission. Do you want your review to pass the first test in critical appraisal? Get a librarian on board!!

Writing a systematic review? Don’t use PubMed

The new iteration of PubMed makes it inadvisable for building searches to inform systematic reviews. Why is this? The new version uses machine learning algorithms working behind the scenes, invisible to the searcher. That means transparency and reproducibility are no longer possible. Transparency and reproducibility are of key importance in scientific reporting and experiments. Without them present in the search strategy, a systematic review falls at the first hurdle when being critically appraised.

PRISMA-S was launched recently, outlining all the reporting requirements for literature searching in systematic reviews. Item 8 is: “Include the search strategies for each database and information source, copied and pasted exactly as run.” Note “exactly as run”. This is not possible in PubMed. Medline on the Ovid platform (or via EBSCO or another aggregator) is preferred.

Searching Medline via a database aggregator platform has been the preferred practice for building and running search strategies for systematic reviews for over two decades now, mostly because of the ability to use proximity operators. Proximity operators are not available in PubMed and there are no plans to introduce them.
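As an example of why proximity operators matter (the topic is invented), the Ovid adjN operator finds terms within N words of each other, in either order:

(patient* adj3 satisfact*).ti,ab.

This retrieves “patient satisfaction” as well as “satisfaction of the patients” – phrasings that a simple phrase search would miss, and that a plain AND would drown in irrelevant results.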

So, can you use PubMed at all? You can use it to search PMC articles, which can be useful for surgical (and other) questions. Care still needs to be taken though, and make sure you capture your search before logging off – PubMed no longer stores search histories.

Database filters for Human studies

It is common practice now to use the NOT (animals/ not humans/) search string to exclude animal studies in Medline. But in Embase, such a neat filter doesn’t work. Humans used not to be included under the Animal heading in Embase (why? Humans are animals – this is one of my bugbears). Now they are, which makes more sense. However, limiting to human (or human plus animal) studies still isn’t straightforward.
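For reference, the Medline (Ovid) version looks roughly like this (exact formulations vary between filter authors; line 1 stands in for your own subject search):

1 your subject search bottom line
2 1 not (exp animals/ not humans/)

The Embase equivalent needs considerably more lines, as below.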

1 your subject search bottom line
2 exp animal/ or exp invertebrate/ or animal experiment/ or animal model/ or animal tissue/ or animal cell/ or nonhuman/
3 exp human/ or human cell/
4 2 and 3
5 2 not 4
6 1 not 5

exp human/ has normal human/ as a narrower term, and the synonyms suggest that this means a healthy person without disease. nonhuman/ is a curious term – the scope note is: “Used for all items on non-human organisms (animals, bacteria, viruses, plants etc.) or on tissue, cells or cell components from such organisms”. Why there is a need to group all of these together under nonhuman/, when you could search for virus or bacteria or frog* (an example that comes to mind for no reason I can think of), beats me. Also curious is that exp animal/ doesn’t include invertebrates.

Jacqueline Limpens provided another alternative to searching for human studies in Embase, which she posted on the expertsearching e-list:

1 your subject search bottom line
2 (exp animal/ or animal.hw. or nonhuman/) not (exp human/ or human cell/ or (human or humans).ti.)
3 1 not 2

She wrote that for a particular search she was doing, animal.hw. (heading word) also found animal embryo (which gave noise, especially in this IVF and embryo-transfer-media topic). human or humans in the title field had to be added to avoid losing this relevant paper:
Improved pregnancy rate in human in vitro fertilization with the use of a medium based on the composition of human tubal fluid.
Quinn P., Kerin J.F., Warnes G.M.
Fertility and Sterility. 44 (4) (pp 493-498), 1985. Date of Publication: 1985.
AN: 1986023234

It isn’t indexed with human/, but it is with pregnancy/, and looking at the scope note for pregnancy/, it implies human pregnancy BUT it could include any animal pregnancy – confusing! She also mentioned that you should consider other headings that could be indicative of non-human disease, such as exp experimental neoplasm/ and xenografts/.

With Embase, it seems wise to scan or search the results before you add any human-studies filter, to see whether there are relevant titles that would be excluded with it applied (as with the pregnancy article above).