Questions and answers to psychological assessment schedules: hidden troubles in 'quality of life' interviews.
Your QoL interview may be twisting answers before you write them down—listen to the tape to catch the drift.
Research in Context
What this study did
The authors listened to taped quality-of-life interviews with adults who have intellectual disabilities.
They used conversation analysis to see how the interviewer and the adult changed each other’s words.
The goal was to check if the final written answers still matched what the person really meant.
What they found
Interviewers often re-phrased questions without noticing.
Adults then gave answers that fit the new wording, not the original question.
These small shifts made the final scores unreliable.
How this fits with other research
Schmidt et al. (2010) later quantified the same problem: proxy respondents usually rate QoL lower than the adults themselves do, just as Duker et al. (1996) predicted would happen when the talk is tilted.
Symons et al. (2005) found no link between objective living conditions and interview QoL scores, backing the idea that the talk, not the life, drives the numbers.
Cançado et al. (2011) reviewed 16 family QoL tools and echoed the warning: most scales skip the rigor needed to catch these distortions.
Together the papers say, “Don’t trust one source—check how the question was asked and who answered.”
Why it matters
If you use QoL data to justify funding or placement, slow down. Play the tape or watch yourself. Note every time you re-word a question. Try asking the same item twice in different words and see if the answer changes. Small fixes like this keep your data honest and your plans truly client-centered.
Record your next QoL interview, then replay five minutes and count how many times you re-phrase a question; revise your script to stay verbatim.
Original abstract
Quality of life (QOL) has become a topic of much debate in the learning difficulties literature. Increasing use is made of questionnaire-driven interview schedules in an effort to find out what clients believe in their own words. However, in this paper, the authors argue that the use made of such questionnaires may actually distort interviewees' 'own words' by severely underestimating the degree to which the questions and answers are changed by the subtle dynamics of the interview. In the first ever close examination of what actually happens in a QOL assessment interview, the qualitative insights of conversation analysis are used to show that the typical administration of a well-known instrument will involve: (1) distortions of the questions brought about by the need to paraphrase complex items, and the inevitable use of pre-questions and response listing; and perhaps more disturbingly, (2) distortions of answers brought about by interviewers' pursuit of legitimate answers and non-take-up of interviewees' matters. The authors believe that these difficulties make it hard for researchers to draw conclusions from simple aggregation of recorded responses to this questionnaire, and, perhaps, to any questionnaire using a fixed-response schedule. On the other hand, the kind of close evidence used here may allow inferences to be drawn about clients' feelings of well being, but even so, these will need to be cast in terms which acknowledge the interactive and constructive nature of feeling-avowals.
Journal of Intellectual Disability Research (JIDR), 1996 · doi: n/a