Document Type

Article

Version

Author's Final Manuscript

Publication Title

Children and Youth Services Review

Volume

30

Publication Date

1-1-2008

Abstract

Objective

To assess the methods used to identify, analyze, and synthesize results of empirical research on intervention effects, and to determine whether published reviews are vulnerable to various sources and types of bias.

Methods

Study 1 examined the methods, sources, and conclusions of 37 published reviews of research on the effects of a model program. Study 2 compared the findings of one published trial with the summaries of that trial's results that appeared in published reviews.

Results

Study 1: Published reviews varied in the transparency of their inclusion criteria, their strategies for locating relevant published and unpublished data, the standards they used to evaluate evidence, and the methods they used to synthesize results across studies. Most reviews relied solely on narrative analysis of a convenience sample of published studies. None of the reviews used systematic methods to identify, analyze, and synthesize results.

Study 2: When the results of a single study were traced from the original report to summaries in published reviews, three patterns emerged: a complex set of results was simplified, non-significant results were ignored, and positive results were over-emphasized. Most reviews used a single positive statement to characterize the results of a study that were decidedly mixed. This suggests that the reviews were influenced by confirmation bias, the tendency to emphasize evidence that supports a hypothesis and to ignore evidence to the contrary.

Conclusions

Published reviews may be vulnerable to biases that scientific methods of research synthesis were designed to address. This raises important questions about the validity of traditional sources of knowledge about “what works,” and suggests the need for a renewed commitment to using scientific methods to produce valid evidence for practice.

DOI

10.1016/j.childyouth.2008.04.001
