Journal of Research & Opinion (peer-reviewed open access journal)

Effect Sizes for Single Case Research Design Studies: Why Use them and are they Helpful?

John W. Maag
University of Nebraska-Lincoln

How to Cite

1.
Effect Sizes for Single Case Research Design Studies: Why Use them and are they Helpful?. Journal of Research and Opinion [Internet]. 2022 Feb. 15 [cited 2024 Sep. 6];9(2):3093-100. Available from: https://researchopinion.in/index.php/jro/article/view/147
  • Articles
  • Submitted: February 15, 2022
  • Published: February 15, 2022

Abstract

The debate about using effect size calculations for single case research design (SCRD) studies has been ongoing since they were first developed approximately 35 years ago. Methods for calculating them have advanced since the percentage of non-overlapping data (PND) was first published. Further, most journals that publish SCRD studies require any systematic review to include effect sizes, thereby turning it into a meta-analysis. Although meta-analyses using effect size calculations are common in systematic reviews of group design studies, the question remains, for individual SCRD studies and systematic reviews alike, whether effect sizes are useful and, if so, what information they add to the analysis of a study or group of studies that visual inspection does not already provide. The purpose of this article is to present background on the development and evolution of effect size calculations for SCRD studies, consider how they relate to an intervention being deemed evidence-based and to the quality of studies, and argue that they do not add any robust analysis to individual studies or to systematic reviews of this type of research.
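For readers unfamiliar with the metric the abstract names, PND is conventionally defined as the percentage of intervention-phase data points that do not overlap the most extreme baseline point in the desired direction. The sketch below illustrates that conventional calculation; the function name and sample data are illustrative only and do not come from this article.

```python
def pnd(baseline, intervention, increase_desired=True):
    """Percentage of Non-overlapping Data (PND), as conventionally defined.

    Counts the share of intervention-phase points that fall beyond the
    most extreme baseline point in the desired direction of change.
    """
    if increase_desired:
        threshold = max(baseline)
        non_overlapping = sum(1 for x in intervention if x > threshold)
    else:
        threshold = min(baseline)
        non_overlapping = sum(1 for x in intervention if x < threshold)
    return 100.0 * non_overlapping / len(intervention)


# Hypothetical data: percentage of intervals on-task across sessions
baseline = [20, 25, 22, 30]
intervention = [35, 40, 28, 45, 50]

print(pnd(baseline, intervention))  # 80.0 (4 of 5 points exceed the baseline max of 30)
```

Because PND depends entirely on a single extreme baseline point, one outlier can dominate the result, which is one reason later nonoverlap indices were proposed and why the usefulness of such metrics remains debated.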
