Assessment & Research

Improving Psychological Science through Transparency and Openness: An Overview

Hales et al. (2019) · Perspectives on Behavior Science
★ The Verdict

Use free open-science tools to make your behavior-analytic studies transparent, replicable, and more trusted.

✓ Read this if you're a BCBA who publishes or supervises student theses.
✗ Skip if you're a practitioner who only reads journals and never produces data.

01 Research in Context

01

What this study did

Hales et al. (2019) wrote a plain-language tour of open-science tools. They show how preregistration, open data, and transparent reporting cost almost nothing yet make behavior-analytic work easier to trust and repeat.

The paper is a narrative review, not an experiment. It gathers checklists, free websites, and journal policies you can start using today.

02

What they found

The authors found that most quality problems in psychology trace back to hidden flexibility: undisclosed decisions about what to measure, when to stop collecting data, and which results to report. Preregistering your single-subject design closes that loophole without extra grant money.

They also found that sharing data and code lets other BCBAs rerun your graphs in minutes. This speeds up clinical decisions and builds a shared evidence base.
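Here is a minimal base-R sketch of what that rerun looks like, assuming a colleague shared a session-by-session CSV (the file name and column names below are hypothetical):

# Minimal sketch: redraw a shared reversal-design graph.
# File name and columns ("session", "responses", "phase") are hypothetical.
d <- read.csv("smith_2019_reversal.csv")

plot(d$session, d$responses, type = "b", pch = 19,
     xlab = "Session", ylab = "Responses per minute")

# Dashed vertical lines at each phase change (e.g., A1 -> B1 -> A2 -> B2).
idx <- which(d$phase[-1] != d$phase[-nrow(d)])
abline(v = (d$session[idx] + d$session[idx + 1]) / 2, lty = 2)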

03

How this fits with other research

Sham et al. (2014) illustrate the need Hales describes. Their audit of pivotal response treatment (PRT) studies shows that published papers report percentage of non-overlapping data (PND) scores 22 points higher than unpublished dissertations. Preregistration would have exposed that gap before the journals printed it.
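PND itself is simple enough to verify once raw data are posted. An illustrative R sketch with made-up numbers, assuming the intervention is expected to increase responding:

# PND: percentage of treatment-phase points that exceed the highest
# baseline point (when an increase in responding is expected).
pnd <- function(baseline, treatment) {
  100 * mean(treatment > max(baseline))
}

pnd(baseline = c(2, 3, 1, 2), treatment = c(4, 5, 3, 6, 5))
# 80: four of the five treatment points beat the best baseline point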

St. Peter et al. (2023) extend the same call for honesty to procedural fidelity. Their focus groups reveal that veteran researchers often skip fidelity reports; Hales points to the free tools that fix it.

Anonymous (2023) shows what happens when guidelines lack the rigor Hales promotes. Their review of 70 rare-disorder guidelines finds most miss basic transparency steps. Using Hales’ checklist would raise those standards overnight.

04

Why it matters

You can boost your next study's credibility in one lunch break. Pick your dependent variable, post your analysis plan to the Open Science Framework (OSF), and let the platform date-stamp the file. Share your raw data and R code when you submit. Reviewers and RBTs will thank you, and your clinical recommendations will carry more weight with funders and families.

→ Action — try this Monday

Create a free OSF account and preregister your next single-subject design before the first session.
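If you would rather script that step, the osfr R package can create the project and upload your plan as a lightweight, time-stamped alternative to OSF's formal registration forms. A minimal sketch; the token, project title, and file name are placeholders:

# Minimal sketch using the osfr package (install.packages("osfr")).
# The token, project title, and file name below are placeholders.
library(osfr)

osf_auth(token = "YOUR_OSF_TOKEN")   # personal access token from osf.io
project <- osf_create_project(title = "Reversal design: preregistered plan")
osf_upload(project, path = "prereg_plan.pdf")   # OSF records the upload time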

02 At a glance

Intervention: not applicable
Design: narrative review
Finding: not reported

03 Original abstract

The ability to independently verify and replicate observations made by other researchers is a hallmark of science. In this article, we provide an overview of recent discussions concerning replicability and best practices in mainstream psychology with an emphasis on the practical benefits to both researchers and the field as a whole. We first review challenges individual researchers face in producing research that is both publishable and reliable. We then suggest methods for producing more accurate research claims, such as transparently disclosing how results were obtained and analyzed, preregistering analysis plans, and publicly posting original data and materials. We also discuss ongoing changes at the institutional level to incentivize stronger research. These include officially recognizing open science practices at the journal level, disconnecting the publication decision from the results of a study, training students to conduct replications, and publishing replications. We conclude that these open science practices afford exciting low-cost opportunities to improve the quality of psychological science.

Perspectives on Behavior Science, 2019 · doi:10.1007/s40614-018-00186-8