Incorporating a Robot into an Autism Therapy Team

By Michael A. Goodrich, Mark Colton, Bonnie Brinton, Martin Fujiki, J. Alan Atherton, and Lee Robinson, Brigham Young University, Daniel Ricks, US Air Force Flight Test Center, Margaret Hansen Maxfield, Intermountain Health Care, Aersta Acerson, Alpine School District

NOTE: This is an overview of the entire article, which appeared in the March/April 2012 issue of the IEEE Intelligent Systems Magazine.

A study conducted at Brigham Young University (BYU) incorporated a simple humanoid robot into an autism therapy team. This article reports on the motivation for including a robot in the therapy, the protocol for robot interaction, and the findings from the study. In addition, a sidebar summarizes other recent work involving the interaction of children with autism spectrum disorder (ASD) with robots in therapy settings.

Autism spectrum disorder refers to a group of pervasive developmental disorders that share common deficits in social interaction and communication. Such deficits may be manifested as the inability to use non-verbal behaviors (such as eye contact and facial expressions) and to regulate social interactions. Furthermore, about 50 percent of children identified with ASD present with insufficient language for effective communication. Their spoken language might be characterized by repetitive or idiosyncratic speech, and the affective component may be limited.

Social interaction doesn’t motivate or engage children with ASD the same way it does their typical peers; in fact, they might show more preference for objects over people than do their peers with other developmental disabilities. For this reason, speech and language intervention designed for children with other developmental delays might not suit the needs of this population. Children with ASD might require more extraordinary effort to elicit social interaction.

Consequently, therapies designed to assist children with ASD necessarily involve a team of people. At the BYU Comprehensive Clinic, this team includes a primary therapist, a secondary therapist, a therapy supervisor, and the child’s caregivers. Even with this extraordinary effort, interventions that reliably yield improvements in social interaction and communication are still needed, especially for very low-functioning children.

Evidence is growing that robots are engaging to many children across the autism spectrum. However, the desired end-point of therapy is to increase interaction with other humans, not with robots.

Fortunately, there is mounting evidence that robots can help trigger social interactions between a child and another person. Such evidence has yet to suggest improved interactions that endure or that generalize outside of a lab or clinic, however.

This article describes the teaming environment used to provide therapies to children with ASD, identifies the role a robot can perform on this team, and describes the robot design and user interface technologies that let the robot perform this role within the broader context of the team’s shared intentions. A case study provides compelling preliminary evidence that this team-based, robot-assisted approach merits careful, ongoing study.

The article begins with an exposition of the team-based approach used at the BYU Comprehensive Clinic, where the study was conducted. Children with ASD come to the clinic once or twice each week for 50-minute sessions tailored to help them in a set of specific areas. During therapy, the children interact with the team of four or more clinicians and often a parent, with five to 10 potential activities per session.

Interactions with the robot account for only 10 minutes of the 50-minute session. The authors refer to this limit as a “low-dose” role, targeting two specific functions: engaging the child and catalyzing social interactions. Their results suggest that such a low-dose robotics approach can not only simplify a robot’s design and behaviors but also produce generalizable child behaviors. By designing the robot to be part of the therapy team, the authors can simplify the robot design problem and make the therapies more effective.

Researchers and clinicians at the BYU Comprehensive Clinic identified two clear criteria that the robot must satisfy to fulfill a role in the ASD therapy team. First, it must have a form factor, appearance, and mobility that is likely both to engage a child with ASD and to trigger dyadic and triadic social interactions. Second, it must be capable of performing several prosocial clinical activities such as taking turns, imitating movement, and performing songs with actions.

With respect to the first criterion, the robot (called Troy) was designed to be the same size as an average 4-year-old child. Troy is 25 inches tall with two arms 12 inches in length (see Figure 1a). Each of Troy’s arms has 4 degrees of freedom: raising and lowering, adduction and abduction, medial rotation of the forearm, and extension and flexion of the elbows. The robot uses its arms for simple interaction activities, such as pushing a toy car, pushing buttons, pointing, waving hello, and recreating the hand actions associated with children’s songs (see Figure 1b). The robot doesn’t need to move around a room, so Troy is a stationary upper-body robot that can sit on the ground or a table while therapists control its interactions.

Figure 1. The robot Troy. (a) The robot had a seven-inch computer screen for a face and movable arms; (b) a clinician used a Wii controller to direct Troy to move, change its display, and speak or sing; and (c) Troy's face could express a range of basic emotions.

The article describes the manner in which the robot interacts with the children and how the robot is controlled by the therapists. The authors developed a prototype drag-and-drop visual programming user interface that lets a therapist arrange existing robot behaviors in sequences and map those sequences to an input device. The interaction sequences were programmed by therapists before each session, with contingency plans in place. During the sessions, therapists triggered the robot’s behavior using a Nintendo Wii remote (chosen for its ease of use and inconspicuous appearance).
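The core idea, as described above, is that therapists pre-arrange sequences of primitive robot behaviors and bind each sequence to a controller button, with a contingency fallback for anything unplanned. A minimal sketch of that mapping might look like the following; the class and behavior names here are illustrative assumptions, not details from the article or its actual software.

```python
class SessionPlan:
    """Maps controller buttons to pre-arranged sequences of robot behaviors.

    This is a hypothetical sketch of the drag-and-drop sequencing concept:
    behaviors are pre-programmed before the session, then triggered live
    by button presses on a handheld controller.
    """

    def __init__(self):
        # button label -> ordered list of behavior names
        self._mappings = {}

    def assign(self, button, behaviors):
        """Bind a sequence of behaviors to a controller button."""
        self._mappings[button] = list(behaviors)

    def trigger(self, button):
        """Return the sequence for a pressed button.

        Unmapped buttons fall back to a harmless contingency behavior,
        so a stray press never produces an unplanned robot action.
        """
        return self._mappings.get(button, ["idle"])


# Example: a therapist programs two interaction sequences before a session.
plan = SessionPlan()
plan.assign("A", ["wave_hello", "say_greeting"])
plan.assign("B", ["raise_arms", "sing_song_actions"])

print(plan.trigger("A"))  # ['wave_hello', 'say_greeting']
print(plan.trigger("Z"))  # unmapped button -> ['idle']
```

The fallback-to-idle design choice mirrors the article’s emphasis on contingency plans: the robot should do something safe and predictable when the session deviates from the script.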

Two children with ASD participated in the study. Both boys demonstrated moderate to severe levels of impairment in social communication, as well as restricted interests and repetitive behaviors (such as spinning a puzzle piece or fixating on the same toy). Neither boy had made marked progress on intervention goals in the previous six months. Each participant came for 16 treatment sessions over a three-month period.

One of the boys exhibited dramatic increases in socially engaged behaviors from pretreatment to posttreatment. The other boy’s gains were more modest. Clinical observations indicated that both children were highly motivated to interact with the robot, and both were more interactive with clinicians without the robot following treatment. The clinicians observed several significant behaviors (in the absence of the robot) after treatment that had not been observed before, including greeting clinicians by waving (and in one case, saying their names), symbolic pretend play with toys, sharing toys, and decreased restricted interests and repetitive behaviors.

The authors conclude with the following assessment:
“Our case study and technology development have uncovered many important challenges and research opportunities. One of the most critical is to identify clinical approaches that allow us to reach statistically sound conclusions for a problem with as many confounding factors as autism therapy. Such factors include frequently sharp differences in behaviors among children with autism; the complexity of coordinating therapy across a large team of therapists, caregivers, and educators; and the vast range of potential robot and user interface technologies.”
Information about each of the authors and contact information can be found in the article.