-
Title
-
Researching Readiness for Implementation of Evidence-Based Practice: A Comprehensive Review of the Evidence-Based Practice Attitude Scale (EBPAS)
-
Abstract/Description
-
This chapter considers the role of experiments in evaluation and action research in determining whether change in educational contexts can be attributed to the introduction of an intervention approach or programme. It explains the three classes of experimental design: 'true' experimental designs, such as randomised controlled trials (RCTs); 'quasi-experimental' designs; and 'small-n' experimental designs. 'True' and 'quasi-' experiments are studies of deliberate intervention. In a 'true' experimental design such as an RCT, allocation to groups must be random. 'Small-n' experimental designs involve the manipulation of an independent (treatment) variable across a pre-intervention baseline phase, an intervention phase, and commonly a post-intervention phase. Most published studies of the effectiveness of school-based interventions use quasi-experimental designs. It is also important for educational researchers to be aware of the beliefs and values of participants and stakeholders, and of the underlying policy contexts and political realities.
-
Date
-
2012
-
In publication
-
Handbook of Implementation Science for Psychology in Education
-
Editor
-
Kelly, Barbara
-
Perkins, Daniel F.
-
Pages
-
150-164
-
Publisher
-
Cambridge University Press
-
Open access/full-text available
-
No
-
Peer reviewed
-
Yes
-
ISBN
-
978-0-521-19725-0
-
Citation
-
Aarons, G. A., Green, A. E., & Miller, E. (2012). Researching Readiness for Implementation of Evidence-Based Practice: A Comprehensive Review of the Evidence-Based Practice Attitude Scale (EBPAS). In B. Kelly & D. F. Perkins (Eds.), Handbook of Implementation Science for Psychology in Education (pp. 150–164). Cambridge University Press. https://doi.org/10.1017/CBO9781139013949.013
-
Place
-
Cambridge