A Duke University study, published in February in the Journal of Pediatrics, shows that a digital version of a common autism screening tool is more effective than the traditional paper-based assessment at prompting pediatricians to refer children for further evaluation.
“It shows you the power of technology,” says lead researcher Geraldine Dawson, director of the Duke Center for Autism and Brain Development at Duke University in Durham, North Carolina. “When you have a very busy pediatrician, if you can do anything to make [screening] more efficient, that’s going to improve the quality of care.”
The American Academy of Pediatrics recommends routinely screening toddlers for autism at 18 and 24 months of age. Pediatricians typically use the Modified Checklist for Autism in Toddlers (M-CHAT), a questionnaire that asks parents about eye contact and other behaviors in their child. If a child shows autism traits on the measure, clinicians are supposed to ask parents a set of follow-up questions and recommend next steps based on the responses.
This follow-up can add up to 30 minutes to a routine visit, Dawson says. As a result, many clinicians skip it, potentially missing some children with autism or mistakenly flagging children without the condition.
Dawson and her team created an app that delivers the M-CHAT to parents on a tablet. The app scores the questionnaire automatically and administers follow-up questions to parents when necessary. It then generates a report with the child’s score and suggests recommendations for the clinician, such as referring the child to a developmental specialist.
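The scoring-and-triage step the app automates can be sketched in a few lines. The thresholds below follow the published M-CHAT-R/F scoring guidelines (0 to 2 at-risk items means low risk; 3 to 7 means administer the follow-up questions; 8 or more means refer directly); the Duke app's internal logic is not described in the article, so this is illustrative only, and the `triage` function name is hypothetical.

```python
def triage(at_risk_items: int) -> str:
    """Map an M-CHAT-R at-risk item count to a recommended next step.

    Cutoffs are the published M-CHAT-R/F conventions, not the Duke
    app's actual (unpublished) logic.
    """
    if at_risk_items <= 2:
        return "low risk: no action; rescreen at the next well visit"
    if at_risk_items <= 7:
        return "medium risk: administer follow-up questions"
    return "high risk: refer for evaluation and early intervention"

# A paper form leaves this arithmetic and lookup to a busy clinician;
# automating it is what removes the 30-minute bottleneck.
print(triage(1))
print(triage(5))
print(triage(10))
```

The point of the sketch is how little decision logic is involved: the app's gain comes not from sophistication but from guaranteeing that the follow-up step is never skipped.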
The researchers put the app to use in a pediatric clinic at Duke University for six months. Parents of 529 children aged 16 to 30 months completed the M-CHAT during this trial period. The researchers compared these results with those from 649 parents who completed the paper-based version of the same screen.
Both platforms flagged similar proportions of children as showing signs of autism. But the proportion of flagged children whose parents answered follow-up questions and were referred for further evaluation increased from 25 percent to 85 percent after the researchers introduced the app.
The findings hint at an easy way to make autism screening more efficient and effective, says Wendy Stone, professor of psychology at the University of Washington in Seattle, who was not involved in the study. “It’s very good preliminary evidence that this is a feasible practice for offices to adopt,” she says.
Dawson and her team plan to test the app in more clinics. They hope ultimately to pair the app with a tablet-based video program that can record and analyze autism behaviors in children during clinic visits.