Over the course of the next couple of months I plan to use these blog posts to explore topics dealing with how emotion influences the processes involved in person perception – the way we form impressions and make inferences about other people. In this inaugural post I want to delve into the processes guiding face perception as it relates to the perception of emotion.
Before we begin, I’ll take a moment to define a couple of terms that come up a lot when discussing the sort of research I do. These terms don’t necessarily carry the same definitions we give them colloquially, so I think they bear some clarification. First, I talk a lot about social interactions (because that’s what I study!). In the context of empirical research, a social interaction describes the process of acting and reacting to the people around us. An everyday example of a social interaction is two people having a conversation. When we try to study this sort of thing in the lab, we often need to isolate one or two very small aspects of the phenomenon we want to study. So we often operationalize aspects of a social interaction by defining the perceiver and the target. The perceiver is the person doing the looking or evaluating in a social interaction. The perceiver’s behavior happens in response to a target, which can be any sort of stimulus information about a person (e.g., a picture). Certainly, in real life the roles of perceiver and target are fluid within an interaction—it depends on whose perspective you’re talking about. In the lab we tend to isolate one from the other so we can better understand the various processes underlying a given social interaction.
Ok, now that we have some common ground, let’s get to it.
Two paths: facial identity and facial expression
Back in 1986, Bruce and Young proposed a dual-systems model of face perception. Long story short, they suggested that the perceiver uses two different routes for processing: route 1 governs processing of information about facial identity, while route 2 governs processing of information about facial expression. About 15 years later, Haxby and colleagues (2000) provided a compatible account of the neural substrates underlying face perception. On their account, one system governs invariant face information (for instance, facial identity); this processing is implicated in regions such as the fusiform gyrus (especially the fusiform face area [FFA]). The other system governs changeable face information (for instance, facial expression), which tends to activate regions of the superior temporal sulcus (STS). We’ll keep these accounts of how the brain processes face- and emotion-specific information in mind as we consider more recent research in the field of emotion and person perception.
Facial expression and memory for face identity
D’Argembeau and Van der Linden (2007) explored the influence of emotional facial expressions on a perceiver’s automatic memory for a target’s facial identity. Across two studies, they examined whether perceivers would demonstrate superior memory when instructed to attend to a target’s emotional expression (i.e., rate the intensity of the expression), personality (i.e., imagine a trait the person might have), or a specific structural feature of the face (i.e., judge the size of the nose). The results suggest that regardless of which aspect of a person’s face participants were instructed to attend to, they demonstrated better memory for the identities of faces that had expressed happiness compared to anger.
Overall, these results have some interesting implications for social interactions. First of all, they suggest that happy (compared to angry) faces are easier for a perceiver to process and subsequently remember. (So next time you want to make a lasting impression on a new acquaintance, remember to smile ☺) This pattern may arise because when a face conveys anger, the perceiver’s attentional resources are diverted to dealing with the potential threat in the environment. This could overwhelm the perceiver’s cognitive resources, preventing them from attending to and encoding the identity of the angry face.
Are there two separate systems?
These results offer some evidence against the two-systems model of facial identity and facial expression perception. In particular, the finding that a target’s emotional expression can interfere with the processing of, and memory for, facial identity suggests that the two systems may not be as independent as originally proposed.
Indeed, recent work by Van den Stock and de Gelder (2014) provides compelling evidence against the theory of two independent systems. They presented participants with compound face/body stimuli and asked them to match the identity of the target face to one of two alternative faces. Participants performed better when asked to match the identity of a neutral (compared to a happy or angry) face. Furthermore, for the emotional stimuli, performance was better when the facial expression matched the bodily expression. In other words, task-irrelevant emotional face and body expressions interfered with participants’ ability to process the identity of target faces. So the two systems originally posited for processing facial identity and facial expression may actually be more intertwined than we once thought.
References

Bruce, V., & Young, A. W. (1986). Understanding face recognition. British Journal of Psychology, 77, 305–327.

Calder, A. J., & Young, A. W. (2005). Understanding the recognition of facial identity and facial expression. Nature Reviews Neuroscience, 6(8), 641–651. doi:10.1038/nrn1724

D’Argembeau, A., & Van der Linden, M. (2007). Facial expressions of emotion influence memory for facial identity in an automatic way. Emotion, 7(3), 507–515. doi:10.1037/1528-3542.7.3.507

Van den Stock, J., & de Gelder, B. (2014). Face identity matching is influenced by emotions conveyed by face and body. Frontiers in Human Neuroscience, 8, 53. doi:10.3389/fnhum.2014.00053

Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. (2000). The distributed human neural system for face perception. Trends in Cognitive Sciences, 4(6), 223–233. doi:10.1016/S1364-6613(00)01482-0
Phelps et al. (2014) – authors of the review we read focusing on emotion effects on decision making – would not be surprised to see evidence suggesting that a dual-system approach falls short.
This post got me wondering about the role of social context. Being exposed to disembodied, static faces in a scanner isn’t exactly the same as interacting with someone whose expressions change dynamically in accordance with the ebb and flow of real conversation.