Our brains “read” the expressions of illusory faces in objects much like real faces




This wall in an old town in Tuscany, Italy, illustrates the phenomenon of facial pareidolia, or seeing faces in things.

Humans are champions at spotting patterns, especially faces, in inanimate objects. Think of the famous “face on Mars” in images taken by the Viking 1 orbiter in 1976, which is essentially a play of light and shadow. People still spot what they believe to be the face of Jesus in burnt toast and many other (so many) ordinary foods. There was even a now-defunct Twitter account devoted to collecting images of “faces in things.”

The phenomenon even has a fancy name: facial pareidolia. Scientists at the University of Sydney have found that not only do we see faces in everyday objects, but our brains also process those objects for emotional expression much as they do real faces, rather than discarding them as “false” detections. This shared mechanism may have evolved because of the need to quickly judge whether a person is a friend or an enemy. The Sydney team described their work in a recent paper published in the journal Proceedings of the Royal Society B.

Senior author David Alais, of the University of Sydney, told The Guardian:

We are such a sophisticated social species and facial recognition is very important … You have to recognize who it is, is it family, is it a friend or an enemy, what are their intentions and their emotions? Faces are detected incredibly quickly. The brain seems to do this using some sort of pattern-matching procedure. So if it sees an object that appears to have two eyes above a nose above a mouth, it says, “Oh, I see a face.” It’s a bit quick and loose, and sometimes it makes mistakes, so something that merely looks like a face will often trigger this pattern match.

Alais has been interested in this and related topics for years. For example, in a 2016 article published in Scientific Reports, Alais and his colleagues drew on previous research involving rapid sequences of faces, which demonstrated that the perception of facial identity, as well as attractiveness, is biased toward recently seen faces. They designed a binary task mimicking the selection interface of online dating websites and apps (such as Tinder), in which users swipe left or right depending on whether they judge the profile photos of potential partners attractive or unattractive. Alais et al. found that many stimulus attributes, including facial orientation, expression, attractiveness, and the perceived thinness of online dating profiles, are consistently biased toward recent past experience.

This was followed by a 2019 article in the Journal of Vision, which extended the experimental approach to our appreciation of art. Alais and his co-authors found that we don’t evaluate every painting we see in a museum or gallery on its own merits. Nor are we prone to a “contrast effect,” that is, perceiving a painting as more attractive if the work we saw just before it was less aesthetically appealing. Instead, the study found that our appreciation of art shows the same systematic bias of “serial dependence”: we judge paintings to be more attractive if we view them after seeing another attractive painting, and less attractive if the preceding painting was less aesthetically appealing.

The next step was to examine the specific brain mechanisms behind how we “read” social information from other people’s faces. The phenomenon of facial pareidolia struck Alais as relevant. “A striking feature of these objects is that they not only look like faces but can even convey a sense of personality with social significance,” he said, such as a sliced bell pepper that appears to be scowling or a towel dispenser that appears to be smiling.

Facial perception involves more than the characteristics common to all human faces, such as the placement of the mouth, nose, and eyes. Our brains may be in tune with these universal patterns, but reading social information requires being able to determine if someone is happy, angry, or sad, or if they are paying attention to us. Alais’s group designed a sensory adaptation experiment and determined that we do indeed treat facial pareidolia the same way we treat real faces, according to an article published last year in the journal Psychological Science.

This latest study admittedly has a small sample size: 17 college students, all of whom completed practice trials with eight real faces and eight pareidolia images before the experiments. (Data from the practice trials were not recorded.) The actual experiments used 40 real faces and 40 pareidolia images, selected to include expressions ranging from anger to happiness and falling into four categories: very angry, weakly angry, weakly happy, and very happy. During the experiments, each image was briefly shown to the subjects, who then rated its emotional expression on an anger/happiness rating scale.

The first experiment was designed to test for serial effects. Subjects completed a sequence of 320 trials, with each image shown eight times in random order. Half of the subjects rated the real faces first and the pareidolia images second; the other half did the opposite. The second experiment was similar, except that the real faces and the pareidolia images were randomly intermixed across trials. Each participant rated a given image eight times, and those ratings were averaged into a mean expression estimate for that image.
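To make that design concrete, here is a minimal Python sketch of the trial structure described above. The image names, the randomization, and the stand-in ratings are purely illustrative assumptions, not the authors’ actual stimuli or code.

```python
import random
from collections import defaultdict

# Hypothetical trial structure: 40 images, each shown 8 times in random
# order (320 trials), with each image's 8 ratings averaged afterwards.
images = [f"img_{i:02d}" for i in range(40)]
trials = images * 8          # 320 trials
random.shuffle(trials)

ratings = defaultdict(list)
for image in trials:
    # Stand-in for a subject's response on the anger/happiness scale
    # (-1 = very angry, +1 = very happy); here it is just a random number.
    ratings[image].append(random.uniform(-1.0, 1.0))

# Mean expression estimate per image, as in the averaging step above.
mean_expression = {img: sum(r) / len(r) for img, r in ratings.items()}
```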

“What we found is that, in fact, these pareidolia images are processed by the same mechanism that would normally process the emotions of a real face,” Alais told The Guardian. “You’re somehow unable to totally turn off that facial response and that emotional response and see it as an object. It remains both an object and a face.”

Specifically, the results showed that subjects could reliably rate pareidolia images for facial expression. The subjects also showed the same serial dependence bias as the Tinder users and art gallery patrons: a happy or angry illusory face in an object is perceived as having an expression more similar to that of the image seen just before it. And when real faces and pareidolia images were mixed, as in the second experiment, this serial dependence was more pronounced when subjects saw the pareidolia images before the human faces. This, Alais et al. concluded, points to an underlying mechanism shared between the two, “meaning that the processing of expression is not closely related to human facial features,” they wrote.
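As an illustration of what a serial dependence analysis of this kind might look like, here is a small, hypothetical Python sketch: it regresses each trial’s rating error on the expression of the previously seen image, so a positive slope would indicate that ratings are pulled toward the preceding image’s expression. The data and variable names are invented; this is not the paper’s analysis code.

```python
import numpy as np

# Invented per-trial data: the "true" expression of each image shown
# (-1 = very angry ... +1 = very happy) and the rating the subject gave.
true_expr = np.array([0.8, -0.6, 0.2, -0.9, 0.5, -0.2, 0.9, -0.4])
rating = np.array([0.6, -0.4, 0.0, -0.7, 0.3, -0.3, 0.7, -0.5])

# Rating error: how far each response deviates from the image's own expression.
error = rating - true_expr

# Serial dependence: regress the current trial's error on the previous
# trial's expression. A positive slope means ratings are pulled toward
# the expression of the image seen just before.
slope, intercept = np.polyfit(true_expr[:-1], error[1:], 1)
print(f"serial-dependence slope: {slope:+.3f}")
```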

“This ‘crossover’ condition is important because it shows that the same underlying facial expression process is involved regardless of the type of image,” Alais said. “It means that seeing faces in clouds is more than a child’s fantasy. When objects convincingly resemble faces, it is more than an interpretation: they really are driving your brain’s face-detection network, and the facial-expression system along with it. For the brain, fake or real, faces are all treated the same.”

DOI: Proceedings of the Royal Society B, 2021. 10.1098/rspb.2021.0966

DOI: Psychological Science, 2020. 10.1177/0956797620924814 (About DOIs).


