Validity and Reliability of GraphClick and DataThief III for Data Extraction.
GraphClick and DataThief III pull numbers from single-subject graphs as accurately as costly software; pick either and move on.
01 Research in Context
What this study did
McDuffie et al. (2016) tested two low-cost computer programs.
GraphClick and DataThief III pull numbers from published line graphs.
The team compared the extracted numbers to the true values.
They wanted to know if the tools are accurate enough for meta-analysis.
What they found
Both programs hit the mark.
Extracted points matched the original data.
You can trust either tool for single-subject graphs.
How this fits with other research
Manolov et al. (2017) and Ruiz et al. (2025) built the next step.
After you pull the numbers, their free R code graphs and analyzes them.
The 2016 paper shows the data are clean; the newer papers show what to do with them.
Mitteer et al. (2018) close the loop.
They taught RBTs to make publication-ready graphs in Prism.
Good graphs in, good data out: McDuffie's finding still holds.
Why it matters
If you run a meta-analysis or train staff, you now have a full pipeline.
Extract points with GraphClick or DataThief III, analyze with Manolov's free R tools, and graph with Ruiz's package.
No expensive software needed.
Start using the free chain this week to speed up reviews or class projects.
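As a concrete illustration of the analysis step, here is a minimal sketch, in Python rather than the R tools the papers actually provide, of computing Percentage of Nonoverlapping Data (PND), a common single-subject effect size, from extracted phase data. The sample values are hypothetical.

```python
def pnd(baseline, intervention, increase_expected=True):
    """Percentage of Nonoverlapping Data (PND): the share of
    intervention points that exceed (or fall below) every
    baseline point, depending on the expected direction."""
    if increase_expected:
        threshold = max(baseline)
        nonoverlap = sum(1 for x in intervention if x > threshold)
    else:
        threshold = min(baseline)
        nonoverlap = sum(1 for x in intervention if x < threshold)
    return 100.0 * nonoverlap / len(intervention)

# Hypothetical points extracted from an A-B graph with GraphClick
baseline = [12, 15, 10, 14]      # phase A (disruptions per session)
intervention = [8, 6, 11, 5, 4]  # phase B

print(pnd(baseline, intervention, increase_expected=False))  # 80.0
```

Once the points are in a spreadsheet or CSV, any effect-size metric is a few lines away; the published R packages wrap many such metrics with graphs included.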
To practice, download GraphClick, open a PDF of an old A-B-A-B graph, extract the data, and paste it into Excel.
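To sanity-check your own extraction the way the study did, compare the points you clicked against values reported in the paper's text or tables. A minimal sketch, with made-up numbers standing in for real extracted data:

```python
def extraction_error(true_vals, extracted):
    """Mean absolute error and largest single deviation between
    published values and points extracted from the graph."""
    diffs = [abs(t - e) for t, e in zip(true_vals, extracted)]
    return sum(diffs) / len(diffs), max(diffs)

true_vals = [5.0, 7.0, 4.0, 9.0]  # values reported in the paper
extracted = [5.1, 6.9, 4.0, 9.2]  # values clicked in GraphClick

mae, worst = extraction_error(true_vals, extracted)
print(round(mae, 3), round(worst, 3))  # 0.1 0.2
```

Small, consistent errors like these are what the 2016 validity findings lead you to expect; a large deviation usually means a misplaced axis calibration point.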
02 At a glance
03 Original abstract
Researchers frequently rely on meta-analyses of prior research studies to efficiently evaluate a broad spectrum of results on a particular topic. In the realm of single-subject experimental designs (SSEDs), meta-analyses have a particular cachet: retaining the rigor of single-subject designs with the added robustness of replication to more fully determine the strength of a given approach or intervention. Until recently, researchers wishing to undertake meta-analytic research themselves have had limited options for synthesizing the intervention effects of a collection of studies. Researchers consistently use two software programs, DataThief III and GraphClick, to conduct meta-analytic work using SSEDs. The purpose of this study was to evaluate and compare the validity and reliability of the results yielded by each of these programs when evaluating the results of multiple research studies on the Good Behavior Game, a classroom-based intervention that has been in practice since 1969. Study findings suggest that both GraphClick and DataThief III provide valid methods of data extraction. In addition, both programs allow for reliable extraction of data between raters and between software programs. Limitations and directions for future research are explored.
Behavior Modification, 2016 · doi:10.1177/0145445515616105