

An Introduction to Evidence Synthesis Searching

The search for sources of evidence is a fundamental component of all the diverse types and methodologies of systematic and scoping reviews (Munn et al. 2018). Searches must be as comprehensive as possible while recognising that there are limits to this goal. Searches must also be explicit, transparent and reproducible. Readers and users of systematic and scoping reviews must be able to readily identify the steps taken in the search, including which databases, versions of databases, platforms and resources were searched and when, and which keywords, controlled vocabulary, field tags and limits were used. If search filters (sometimes called search blocks, hedges or strings) are used, these must be cited. Moreover, while not required, it is a courtesy to indicate who conducted the search (Ross-White 2021). This information is presented in the manuscript, generally as an appendix.

Searches must also seek to avoid information bias, which is ‘any systematic difference from the truth that arises in the collection, recall, recording and handling of information in a study, including how missing data is dealt with’ (Bankhead, Spencer & Nunan 2019). Bias is often introduced into searches unintentionally, so particular steps must be taken to avoid it. For example, because positive studies are more likely to be published (publication bias), unpublished trials must be sought out (Anderson & Jayaratne 2015; Lin & Chu 2017; Murad et al. 2018; Sutton et al. 2000). This may be done by including unpublished and in-process literature, such as conference abstracts, theses, government policies, letters or other grey literature; by searching trial registers for ‘missing’ trials and for studies yet to be completed (Page, Higgins & Sterne 2023); or by communicating directly with researchers in the field of study about their in-process work. It may also be necessary to contact authors for unpublished data related to their published or unpublished works. Restricting a search to English-language studies introduces language bias (Brassey, Spencer & Heneghan 2017); as an international organisation, we encourage including literature in all languages. This is good practice for all systematic and scoping reviews, as we seek to minimise bias in the publication record and encourage good scholarship internationally. Language bias is not the only bias that affects equity, diversity and inclusion: care should be taken in considering what and where to search and which research to include. Reviewers must also be mindful that the search terms themselves do not inadvertently introduce bias; this can be mitigated by searching for both negative and positive concepts.

It is difficult to prescribe a particular search methodology, as review questions and resources vary considerably. For example, a question relating to the behavioural aspects of illness would require a search of the PsycINFO database, whereas questions that do not relate to behaviour or psychology might not. While PubMed provides free access to MEDLINE, searching MEDLINE on the Ovid platform provides greater control over the search process through more sophisticated proximity searching (e.g. ADJn). Institutions also differ in their subscriptions to databases that are similar in content, so access to resources may determine which databases are searched. JBI has endorsed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) standards for reporting protocols and reviews, including searches (Moher et al. 2009; Page et al. 2021). Familiarising yourself at the outset of a review with the PRISMA, PRISMA-S, PRISMA-ScR and PRISMA-P requirements, as well as any other relevant PRISMA extensions, ensures that the necessary documentation requirements will be met.
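As an illustration of the proximity searching mentioned above (the terms here are hypothetical, chosen only to show the syntax), an Ovid-style proximity search might look like:

```
(patient* adj3 education).ti,ab.
```

In Ovid syntax, `adj3` retrieves the two terms within three words of each other in either order in the title or abstract, so this line would capture phrasings such as ‘patient education’ and ‘education of patients’ with a single statement.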

The JBI 3-Step Search Process

What differentiates JBI reviews in the search component is our explicit 3-step search process.

The first step is an initial exploratory search (usually of the MEDLINE and CINAHL databases) to find ‘seed references’, that is, records of studies that meet the inclusion criteria for the review question. Google, Google Scholar, library discovery tools (PRIMO, OneSearch, SumSearch, etc.) and generative AI tools (e.g. ChatGPT) can be useful for finding these seed references; however, any sources suggested by a generative AI tool must be verified, as they may be hallucinations. Following these initial exploratory searches, the keywords in the titles and abstracts of the seed references, along with their indexed terms (also referred to as MeSH terms, thesaurus terms or controlled vocabulary), are harvested. The review team should also be consulted for additional keywords and indexed terms.

The second step is to develop a comprehensive search strategy, using these harvested keywords and indexed terms, for the primary database chosen for the protocol. Developing a comprehensive search is an iterative process that involves testing combinations of keywords and indexed terms and checking that the seed references are captured by the search strategy. Once the search strategy is finalised in the primary database, it is translated for the other databases and grey literature sources. Some databases, such as Embase, CINAHL and PsycINFO, include some grey literature.
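To make the structure of such a strategy concrete, a line-numbered search in Ovid MEDLINE might be sketched as follows (the topic and every term are hypothetical, chosen only to illustrate how indexed terms and keywords are combined; they are not drawn from any particular review):

```
1  exp Hypertension/
2  (hypertens* or ((high or elevated) adj2 "blood pressure")).ti,ab.
3  1 or 2
4  exp Exercise/
5  (exercis* or "physical activity").ti,ab.
6  4 or 5
7  3 and 6
```

Each concept is built as a block in which exploded subject headings (lines 1 and 4) are OR'd with title/abstract keywords (lines 2 and 5), and the concept blocks are then combined with AND (line 7). When the strategy is translated to another database, the subject headings, field tags and proximity operators are replaced with that database's equivalents.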

The third step is supplementary searching: searching the grey literature, citation searching and handsearching. The TARCiS Statement (Terminology, Application and Reporting of Citation Searching) provides guidance on when and how to conduct citation searching as a supplementary search technique in evidence synthesis and, importantly, how to report it (Hirt et al. 2024). For complex review questions, forward and backward citation searching is considered mandatory.

Authors:

The Joanna Briggs Information Science Methodology Group

Amanda Ross-White, MLIS, AHIP, editor

Michelle Lieggi, MLS, AHIP

Fabiana Gulin Longhi Palacio

Terena Solomons, BA, Grad Dip Lib Sc, ALIA CP (Health), FHEA

Michelle Swab, MA, MLIS

Melissa Rothfus, PhD, MLIS

Juliana Takahashi

Daniela Cardoso, PhD, RN
