Half a century ago, one of the hottest questions in science was whether humans could teach animals to talk. Scientists experimented with using sign language to communicate with apes and trained parrots to use ever-growing English vocabularies.
The work quickly attracted media attention and controversy. Critics charged that the research lacked rigor and that apparent feats of animal communication might be mere wishful thinking, with researchers unconsciously cuing their animals to respond in certain ways.
In the late 1970s and early 1980s, this research fell out of favor. “The whole field has completely collapsed,” said Irene Pepperberg, a comparative cognition researcher at Boston University who is best known for her research on an African gray parrot named Alex.
Today, advances in technology and a growing awareness of the complexity of animal minds are fueling renewed interest in finding ways to bridge the species divide. Pet owners are teaching their dogs to press “talking buttons,” and zoos are training their orangutans to use touch screens.
In a cautious new paper, a group of scientists outlines a framework for assessing whether these tools could provide animals with new ways to express themselves. Jennifer Cunha, a visiting researcher at Indiana University, said the study aimed to “move beyond some of the controversial issues of the past.”
The paper, which will be presented at a scientific conference on Tuesday, focuses on Ms. Cunha’s parrot, an 11-year-old Goffin’s cockatoo named Ellie. Since 2019, Ms. Cunha has been teaching Ellie to use an interactive “speech board,” a tablet-based app with more than 200 illustrated icons corresponding to words and phrases such as “sunflower seeds,” “happy” and “I feel hot.” When Ellie presses an icon with her tongue, a computer voice speaks the word or phrase aloud.
In the new study, Ms. Cunha and her colleagues did not set out to determine whether Ellie’s use of the speech board constituted communication. Instead, they used quantitative computational methods to analyze Ellie’s icon presses and learn more about whether the speech board had what they called “expressive and enriching potential.”
“How do we analyze this expression to see if there is intention or room for communication?” Ms. Cunha said. “Then, the second question is, do her choices tell us anything about her values and what she finds meaningful?”
The scientists analyzed nearly 40 hours of video, collected over seven months, of Ellie using the speech board. They then compared her icon presses to several simulations of a hypothetical speech board user who selected icons at random.
“They ended up being significantly different from the real data at multiple points,” said Nikhil Singh, an MIT doctoral student who created the models. “Our virtual users couldn’t quite capture what the real Ellie was doing with the tablet.”
In other words, whatever Ellie is doing, she does not appear to be simply mashing icons at random. The researchers found that the speech board’s design, including icon brightness and placement, also didn’t fully explain Ellie’s choices.
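As a rough illustration of this kind of null-model comparison (a minimal sketch, not the authors’ actual code; the icon names, summary statistics and toy press log below are invented), one can simulate users who pick icons uniformly at random and check how extreme the real press sequence looks against that baseline:

```python
import math
import random
from collections import Counter


def entropy(presses):
    """Shannon entropy (in bits) of an icon-press sequence."""
    counts = Counter(presses)
    total = len(presses)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())


def repeat_rate(presses):
    """Fraction of presses that repeat the immediately preceding icon."""
    repeats = sum(1 for a, b in zip(presses, presses[1:]) if a == b)
    return repeats / max(len(presses) - 1, 1)


def compare_to_random(observed, icons, statistic, n_sim=10_000, seed=0):
    """Empirical p-value of statistic(observed) against simulated users
    who press icons uniformly at random (two-sided, relative to the
    mean of the simulated values)."""
    rng = random.Random(seed)
    obs = statistic(observed)
    sims = [statistic([rng.choice(icons) for _ in observed]) for _ in range(n_sim)]
    mean = sum(sims) / n_sim
    extreme = sum(1 for s in sims if abs(s - mean) >= abs(obs - mean))
    return obs, (extreme + 1) / (n_sim + 1)


if __name__ == "__main__":
    # A board of roughly 200 icons, and a toy observed press log.
    icons = ["apple", "play", "visit", "book"] + [f"icon_{i}" for i in range(196)]
    observed = ["apple", "play", "apple", "visit", "play", "book"] * 20
    for name, stat in (("entropy", entropy), ("repeat rate", repeat_rate)):
        value, p = compare_to_random(observed, icons, stat)
        print(f"{name}: observed={value:.3f}, empirical p={p:.4f}")
```

A real analysis would use the study’s own footage and likely richer statistics, but the logic is the same: compare the observed behavior to what a purely random user would produce.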
Determining whether Ellie’s choices were random “is a really good place to start,” said Federico Rossano, a comparative cognition researcher at the University of California, San Diego, who was not involved in the study. “The problem is that ruling out randomness is a very low bar.”
Just because Ellie isn’t randomly clicking icons, Dr. Rossano said, doesn’t mean she is actively and intentionally trying to express her true needs or feelings. She may simply be repeating sequences she learned during training. “It’s like a vending machine,” he said. “You can learn to push a series of numbers and get some type of reward. It doesn’t mean you’re thinking about what you’re doing.”
To explore the possibilities further, the team then looked for so-called corroborating signs. If Ellie selected the apple icon, did she eat the apple she was given? If she selected a reading-related icon, did she engage with the book for at least a minute?
“You can hand something to a bird and they will drop it or touch it,” Ms. Cunha said. “But the question for us is, was she involved?”
Not all of Ellie’s choices could be evaluated this way; for example, it would be impossible for researchers to determine whether she actually felt happy or hot at any given moment. But of the nearly 500 icon presses that could be assessed, 92 percent were confirmed by Ellie’s subsequent actions.
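The corroboration check itself comes down to simple bookkeeping. The sketch below (a hypothetical reconstruction, not the study’s code or data; the icon names and counts are made up) computes the share of assessable presses that were backed up by Ellie’s behavior:

```python
from dataclasses import dataclass


@dataclass
class Press:
    icon: str
    assessable: bool    # could a follow-up behavior be checked at all?
    corroborated: bool  # did Ellie then engage with the item or activity?


def corroboration_rate(presses):
    """Return (rate, n) over the presses whose outcome could be assessed."""
    assessable = [p for p in presses if p.assessable]
    if not assessable:
        return None, 0
    confirmed = sum(p.corroborated for p in assessable)
    return confirmed / len(assessable), len(assessable)


if __name__ == "__main__":
    # Toy log: presses like "happy" are marked non-assessable, as in the study.
    log = (
        [Press("apple", True, True)] * 46
        + [Press("book", True, True)] * 46
        + [Press("apple", True, False)] * 8
        + [Press("happy", False, False)] * 10
    )
    rate, n = corroboration_rate(log)
    print(f"{rate:.0%} of {n} assessable presses were corroborated")
```

The code only makes explicit the arithmetic behind a figure like the reported 92 percent; the judgments about whether Ellie actually engaged came from the video itself.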
“Clearly there is a very good correlation,” said Dr. Pepperberg, who was not involved in the study.
But she said additional testing would be needed to prove that Ellie truly understands what the icons mean, and she suggested that the researchers try deliberately giving Ellie the wrong objects to see how she responds. “This is just another control to make sure the animals actually understand what the label represents,” Dr. Pepperberg said.
Finally, the researchers sought to evaluate whether the speech board was a form of enrichment for Ellie by analyzing the types of icons she selected most frequently.
“If it’s a means to an end, then what is the end?” said Rébecca Kleinberger, an author of the paper and a researcher at Northeastern University who studies how animals interact with technology. “There does appear to be a bias toward social activities, or activities that require interaction with a caregiver.”
The researchers found that Ellie chose an icon representing a food, drink or snack about 14 percent of the time. By contrast, roughly 73 percent of her choices corresponded to activities that provided social or cognitive enrichment, such as playing a game, visiting another bird or simply communicating with Ms. Cunha. Ellie also actively used the speech board 85 percent of the time.
Amalia Bastos, a comparative cognition researcher at Johns Hopkins University who was not an author of the paper, said the cockatoo’s consistent interactions with her device “showed that the device continued to be engaging and reinforcing for her over several months.”
The study has its limitations. Outside experts said there was only so much scientists could conclude from a single animal, and it was difficult to rule out the possibility that Ms. Cunha might have been unconsciously cuing Ellie to respond in certain ways. But they also praised the researchers’ systematic approach and modest claims.
“They didn’t ask, ‘Can parrots talk?’” Dr. Rossano said. “They asked, ‘Can this be used for enrichment?’”
Dr. Bastos agreed. “This work is a critical first step,” she said. It’s also an example of how the field has changed since the 1970s, for the better.
“Researchers currently working in the field are not proposing the same hypothesis,” Dr. Bastos said. “We don’t expect animals to understand or use language like humans do.” Instead, she added, scientists are interested in using communication tools to “improve the welfare of captive animals and their relationships with their keepers.”