Assessment & Research

Percentage agreement and phi: A conversion table.

Goldman et al. (1979) · Journal of Applied Behavior Analysis
★ The Verdict

Add the phi coefficient from the table to every percent-agreement score so your reliability data can travel across studies.

✓ Read this if you are a BCBA who publishes single-case data or sits on thesis committees.
✗ Skip if you are a practitioner who only shares data inside one team and does not publish.

01 Research in Context

01

What this study did

Goldman et al. (1979) built a look-up table. You enter any percentage-agreement score. The table gives you the matching phi coefficient.

Phi is a chance-corrected number, like kappa. It lets readers compare reliability across studies even when base rates differ.
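Phi can also be computed directly from raw interval data, which makes the chance correction concrete. Below is a minimal Python sketch of both statistics; the function names and the 2×2 cell labels are mine, and this is the standard phi formula, not a reproduction of the paper's conversion table.

```python
import math

def percent_agreement(obs1, obs2):
    """Overall interval-by-interval percent agreement between two observers'
    0/1 occurrence records."""
    matches = sum(1 for x, y in zip(obs1, obs2) if x == y)
    return 100.0 * matches / len(obs1)

def phi_from_records(obs1, obs2):
    """Phi coefficient from the 2x2 table of joint observer scores."""
    a = sum(1 for x, y in zip(obs1, obs2) if x == 1 and y == 1)  # both: occurrence
    b = sum(1 for x, y in zip(obs1, obs2) if x == 1 and y == 0)  # obs1 only
    c = sum(1 for x, y in zip(obs1, obs2) if x == 0 and y == 1)  # obs2 only
    d = sum(1 for x, y in zip(obs1, obs2) if x == 0 and y == 0)  # both: nonoccurrence
    denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom if denom else 0.0
```

Reporting both numbers side by side, as the paper recommends, is then a two-line addition to an IOA script.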

02

What they found

The table works for any agreement value from 0 to 100%. No math needed. One glance turns percent agreement into a statistic that accounts for chance.

03

How this fits with other research

Jones et al. (1977) came first. They gave a formula that also removes chance agreements. Goldman et al. (1979) make the idea usable by hiding the algebra in a table.

Friedling et al. (1979) disagree. They say the table is overkill. Their 10% disagreement rule is faster: if disagreements are 10% or less, chance correction is unnecessary.

The clash is only skin-deep. Friedling et al. (1979) tested common, balanced data sets. When base rates get extreme, the phi table still saves you from inflated agreement.

04

Why it matters

Next time you write 95% agreement, add the phi from the table. Reviewers see you controlled for chance, and meta-analysts can fairly compare your numbers with other labs. It takes ten seconds and ends arguments about inflated reliability.

→ Action — try this Monday

Open the article, copy the table, tape it to your clipboard, and write both percent agreement and phi on your next IOA sheet.

02 At a glance

Intervention
not applicable
Design
methodology paper
Finding
not reported

03 Original abstract

Studies in applied behavior analysis have used two expressions of reliability for human observations: percentage agreement (including percentage occurrence and percentage nonoccurrence agreement) and correlational techniques (including the phi coefficient). The formal relationship between these two expressions is demonstrated, and a table for converting percentage agreement to phi, or vice-versa, is presented. It is suggested that both expressions be reported in order to communicate reliability unambiguously and to facilitate comparison of the reliabilities from different studies.

Journal of Applied Behavior Analysis, 1979 · doi:10.1901/jaba.1979.12-299