Assessment & Research

The impact of vision in spatial coding.

Papadopoulos et al. (2011) · Research in Developmental Disabilities
★ The Verdict

Train systematic hand scanning—blind learners who use it code space as well as blindfolded sighted peers.

✓ Read this if: you're a BCBA teaching spatial concepts to blind or low-vision clients
✗ Skip if: your caseload includes no clients with vision impairments

01 Research in Context

01

What this study did

Papadopoulos et al. (2011) asked how people map space without sight. They tested three groups of children and teenagers: blind participants, sighted participants wearing blindfolds, and sighted participants with full vision.

Each participant explored eight object patterns arranged on a board and had to code and represent where each object sat. For the blind and blindfolded groups, the task measured spatial coding through touch alone.

02

What they found

Vision helped. Sighted people with eyes open placed shapes most accurately. Yet blind participants who used a careful hand-sweep pattern matched the blindfolded sighted group.

Good haptic strategy closed the gap. Systematic scanning let blind learners code space as well as blindfolded sighted peers.

03

How this fits with other research

The same lab ran a sister study the same year. Papadopoulos et al. (2011) gave blind, blindfolded, and fully sighted groups a water-level task. Again, blind participants scored better than blindfolded sighted participants. Together the papers show that when vision is blocked, people who are blind can equal or beat sighted peers on spatial tasks.

Cimolin et al. (2011) seems to disagree. They found that patients with Prader-Willi syndrome balanced equally well with eyes open or closed, while typical controls wobbled more when vision was removed. The difference is the population: Prader-Willi syndrome changes how the brain weighs visual input. In typically sighted people, vision clearly boosts spatial performance; in Prader-Willi syndrome, it does not.

Babai et al. (2020) extends the story. They showed that blind adults still fall for area interference when judging perimeter by touch. Good haptic scanning helps spatial coding, but it does not erase all visual biases.

04

Why it matters

You can teach touch-based scanning the same way you teach visual discrimination. Model a left-to-right, top-to-bottom hand sweep. Give learners time to explore materials with their hands before asking them to place or draw items. For clients with no vision, this simple routine can lift spatial accuracy to the level of blindfolded sighted peers.

→ Action — try this Monday

Demonstrate a slow, orderly hand sweep across tactile maps or raised worksheets and prompt learners to repeat the pattern before answering location questions.

02 At a glance

Intervention: not applicable
Design: quasi-experimental
Sample size: 48
Finding: positive

03 Original abstract

The aim of this study is to examine the performance in coding and representing of near-space in relation to vision status (blindness vs. normal vision) and sensory modality (touch vs. vision). Forty-eight children and teenagers participated. Sixteen of the participants were totally blind or had only light perception, 16 were blindfolded sighted individuals, and 16 were non-blindfolded sighted individuals. Participants were given eight different object patterns in different arrays and were asked to code and represent each of them. The results suggest that vision influences performance in spatial coding and spatial representation of near space. However, there was no statistically significant difference between participants with blindness who used the most effective haptic strategy and blindfolded sighted participants. Thus, the significance of haptic strategies is highlighted.

Research in Developmental Disabilities, 2011 · doi:10.1016/j.ridd.2011.07.041