Enhancing retrieval of best evidence for health care from bibliographic databases: calibration of the hand search of the literature

N L Wilczynski et al. Stud Health Technol Inform. 2001;84(Pt 1):390-3.

PMID: 11604770

Abstract

Background: Medical practitioners have unmet information needs. Health care research dissemination suffers from both "supply" and "demand" problems. One possible solution is to develop methodologic search filters ("hedges") to improve the retrieval of clinically relevant and scientifically sound study reports from bibliographic databases. To develop and test such filters, a hand search of the literature was required to determine directly which articles should and should not be retrieved.

Objective: To determine the extent to which 6 research associates can agree on the classification of articles according to explicit research criteria when hand searching the literature.

Design: Blinded, inter-rater reliability study.

Setting: Health Information Research Unit, McMaster University, Hamilton, Ontario, Canada.

Participants: 6 research associates with extensive training and experience in health care research methods.

Main outcome measure: Inter-rater reliability measured using the kappa statistic for multiple raters.

Results: After one year of intensive calibration exercises, research staff were able to attain a level of agreement at least 80% greater than that expected by chance (kappa statistic) for all classes of articles.

Conclusion: With extensive training, multiple raters are able to attain a high level of agreement when classifying articles in a hand search of the literature.
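
The kappa statistic cited under Main outcome measure and Results expresses how far observed agreement exceeds the agreement expected by chance alone. The abstract does not state which multi-rater variant was used; the sketch below illustrates one common choice, Fleiss' kappa, using made-up category counts that are not the study's data.

# Minimal sketch (illustrative, not from the paper): Fleiss' kappa for
# multiple raters. `ratings` is an N x k matrix where ratings[i][j] is the
# number of raters who placed article i in category j; every row must sum
# to the same number of raters.

def fleiss_kappa(ratings):
    N = len(ratings)               # number of articles rated
    n = sum(ratings[0])            # raters per article (constant across rows)
    k = len(ratings[0])            # number of categories

    # Per-article agreement: proportion of rater pairs that agree.
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings]
    P_bar = sum(P_i) / N           # mean observed agreement

    # Chance agreement from the overall category proportions.
    p_j = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    P_e = sum(p * p for p in p_j)

    return (P_bar - P_e) / (1 - P_e)   # kappa: agreement beyond chance

# Hypothetical example: 3 articles, 6 raters, 2 categories
# (e.g. "meets criteria" / "does not meet criteria").
print(round(fleiss_kappa([[6, 0], [5, 1], [0, 6]]), 2))  # -> 0.77

In this formulation, a kappa of 0 means agreement no better than chance and 1 means perfect agreement, which is the sense in which the Results describe agreement "at least 80% greater than that expected by chance."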
