Frazier, Thomas W., Whitehouse, Andrew J. O., Leekam, Susan R. (ORCID: https://orcid.org/0000-0002-1122-0135), Carrington, Sarah J., Alvares, Gail A., Evans, David W., Hardan, Antonio Y. and Uljarević, Mirko 2024. Reliability of the commonly used and newly-developed autism measures. Journal of Autism and Developmental Disorders 54, pp. 2158-2169. doi:10.1007/s10803-023-05967-y
Abstract
Purpose: The aim of the present study was to compare scale and conditional reliability derived from item response theory analyses among the most commonly used, as well as several newly developed, observation, interview, and parent-report autism instruments.

Methods: When available, data sets were combined to facilitate large-sample evaluation. Scale reliability (internal consistency, average corrected item-total correlations, and model reliability) and conditional reliability estimates were computed for total scores and for measure subscales.

Results: Generally good to excellent scale reliability was observed for total scores for all measures; scale reliability was weaker for RRB subscales of the ADOS and ADI-R, reflecting the relatively small number of items in these subscales. For diagnostic measures, conditional reliability tended to be very good (> 0.80) in the regions of the latent trait where ASD and non-ASD developmental disability cases would be differentiated. For parent-report scales, conditional reliability of total scores tended to be excellent (> 0.90) across very wide ranges of autism symptom levels, with a few notable exceptions.

Conclusions: These findings support the use of all of the clinical observation, interview, and parent-report autism symptom measures examined, but also suggest specific limitations that warrant consideration when choosing measures for specific clinical or research applications.
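As an illustration of two of the scale reliability indices named in the abstract, the sketch below computes Cronbach's alpha (internal consistency) and the average corrected item-total correlation from a respondents-by-items score matrix. This is not the authors' analysis code, and the simulated 0-3 item ratings are purely hypothetical; the IRT-based model and conditional reliability estimates reported in the paper would additionally require a fitted item response model and are not shown here.

```python
# Illustrative sketch (not the authors' analysis code): internal consistency
# (Cronbach's alpha) and average corrected item-total correlation for a
# hypothetical (n_respondents, n_items) score matrix.
import numpy as np


def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)


def mean_corrected_item_total(scores: np.ndarray) -> float:
    """Average correlation of each item with the total of the *other* items."""
    k = scores.shape[1]
    rs = []
    for j in range(k):
        rest_total = scores.sum(axis=1) - scores[:, j]  # total excluding item j
        rs.append(np.corrcoef(scores[:, j], rest_total)[0, 1])
    return float(np.mean(rs))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated ordinal item responses (0-3 ratings), for illustration only.
    latent = rng.normal(size=(500, 1))
    raw = latent + rng.normal(scale=1.0, size=(500, 12))
    scores = np.clip(np.round(raw + 1.5), 0, 3)
    print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
    print(f"Mean corrected item-total r: {mean_corrected_item_total(scores):.2f}")
```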
| Field | Value |
|---|---|
| Item Type | Article |
| Date Type | Publication |
| Status | Published |
| Schools | Psychology |
| Publisher | Springer |
| ISSN | 1573-3432 |
| Date of First Compliant Deposit | 15 May 2023 |
| Date of Acceptance | 11 March 2023 |
| Last Modified | 09 Nov 2024 16:00 |
| URI | https://orca.cardiff.ac.uk/id/eprint/159485 |