FACE autism research lab launched

For the purposes of a photo demonstration, a 12-year-old girl (who is not a patient) is outfitted with reflective markers that help autism researchers in Emerson's new FACE Lab examine facial expressions. (Photo by Kelsey Davis '14)

Emerson’s Communication Sciences and Disorders Department (CSD) has established a new laboratory specifically for autism research with the help of a $1.5 million grant from the National Institutes of Health (NIH) awarded to Assistant Professor Ruth Grossman.

The four-year grant has provided for state-of-the-art equipment that tracks eye gaze and facial movements with astounding precision. It also supports a postdoctoral fellow, Darren Hedley, and a research assistant in Emerson’s new Facial Affective and Communicative Expressions (FACE) Lab, which Grossman oversees on the second floor of the State Transportation Building.

This federal funding for the autism-focused work of Grossman and other researchers in CSD comes as a new study from the U.S. Centers for Disease Control and Prevention shows that one in 68 children has autism.

Grossman’s research will focus on interpretation and production of nonverbal cues, including facial expressions, by adolescents with high-functioning autism.

“The reason I am interested in studying this particular population,” Grossman said, “is that they have all the cognitive and verbal skills to be successful members of society. But they fall short of achieving their potential because they are often perceived as awkward and therefore have difficulties with their peers at school, or with getting and keeping a job. If we can find a way to help them over this hurdle of social awkwardness, it will be a tremendous benefit for this growing population.”

Grossman hopes to make progress in this area with the help of the FACE Lab’s new motion-capture equipment that can measure how different parts of the face move and interact with each other by tracking the motion of 32 reflective markers applied to the faces of participants.  

In addition to wearing reflective markers on their faces, research subjects will respond to stimuli on a video screen using video game controllers in the FACE Lab. (Photo by Kelsey Davis '14)

“I want to use this technique to try to quantify ‘awkward,’” Grossman said. “If we can understand how the faces of kids with autism move differently from those of their typically developing peers and how that causes this perceived awkwardness, we will be one step closer to designing treatment approaches aimed at improving the social integration of these kids.” 

The Vicon motion-capture cameras in the FACE Lab are decorated with knit caps, children's drawings, and other playful items to make the technology seem less daunting to children enrolled in the studies. (Photo by Kelsey Davis '14)

The six infrared Vicon motion-capture cameras in the lab are decorated with knit caps, children’s drawings, and other playful items to make the technology seem less daunting to children enrolled in the studies. The equipment is capable of recording three-dimensional coordinates of each marker at up to 515 frames per second. 

The young participants may be asked to use a video game controller to indicate whether images on the screen make them feel happy or sad, or to repeat a sentence or expression they see on the screen while the Vicon cameras record their facial expressions. A microphone also captures their verbal responses.

In this demonstration, a girl responds to stimuli on a video screen in the FACE Lab at Emerson. (Photo by Kelsey Davis '14)

“We will also analyze their prosody, or tone of voice, as well as their vocal quality,” Grossman said. “We’ll record their voices to determine whether they are portraying the correct emotion, sound natural, and what types of pauses and rhythm they use when speaking.” 

Much of the analysis of the motion-capture and voice data will be done in collaboration with researchers at other institutions, including the University of Southern California and the University of Aarhus in Denmark.

In addition to the motion-capture projects, Grossman will use an infrared remote eye tracker made by SensoMotoric Instruments that can sample a person’s eye movements up to 500 times per second.

“Typically developing individuals tend to focus on the same areas on faces at the same time, but children with autism spectrum disorders often show significantly different gaze patterns,” Grossman said. “We want to investigate those patterns, so we can better understand what type of communicative and emotional information these kids are potentially missing by looking at the wrong places at the wrong time.”

In this photo demonstration, Darren Hedley, a postdoctoral fellow who recently began working in the FACE Lab, uses a picture book with a 6-year-old girl (who is not a patient) as part of a standardized test to examine language and cognitive abilities. (Photo by Kelsey Davis '14)

The FACE Lab will also use some lower-tech methods with research participants, including standardized tests to characterize the language and cognitive abilities of those with and without autism.

The NIH grant is one of two that faculty members in the Department of Communication Sciences and Disorders received this year to aid in autism research. Rhiannon Luyster, together with three CSD colleagues, received a $41,000 grant from the National Science Foundation that aided in the purchase of a second SensoMotoric Instruments infrared eye tracker.

The FACE Lab is holding an open house for the Emerson community on Monday, April 28, from 4:00 to 6:30 pm, at its location in the State Transportation Building, second floor, Room 225. 
