User:Btleitzke/draft article on emotion perception

Emotion perception refers to the capacities and abilities involved in recognizing and identifying emotions in others, as well as the biological and physiological processes that support them. Emotions are typically viewed as having three components: subjective experience, physical changes, and cognitive appraisal; emotion perception is the ability to make accurate judgments about another's subjective experience by interpreting their physical changes through sensory systems that convert these observed changes into mental representations. The ability to perceive emotion is believed to be both innate and subject to environmental influence, and it is a critical component of social interaction. How emotion is experienced and interpreted depends on how it is perceived; likewise, how emotion is perceived depends on past experiences and interpretations. Emotion is expressed and can be accurately perceived in humans, and while most animals are also believed to express emotion, emotional expression varies considerably across species, making the emotions of other animals difficult to perceive accurately. Emotions can be perceived visually, audibly, through smell, and through bodily sensations, and this process is believed to differ from the perception of non-emotional material.

Modes of Perception

The ability to perceive emotion operates through visual, auditory, olfactory, and physiological sensory processes. Because emotion is exclusive to living beings and is most accurately assessed in conspecifics, one needs some form of information about a social partner before making an affective judgment. This information is believed to hold special importance, and sensory systems and certain brain regions are suspected of specializing in decoding emotional information for rapid and efficient processing.

Visual

The visual system is the primary mode through which people receive emotional information. Individuals observe another person in a given situation while that person experiences a particular emotion, and use this information to make a judgment about the other's affective state. Emotional cues can take the form of facial expressions, which are in fact combinations of many distinct facial muscle groups, of bodily postures (alone or in relation to others), or of the interpretation of a situation or environment known to have particular emotional properties (e.g., a funeral, a wedding, a war zone, a dark alley). While the visual system gathers the emotional information, it is the cognitive interpretation and evaluation of this information that assigns it emotional value, recruits the appropriate cognitive resources, and then initiates a physiological response. This process is by no means exclusive to visual perception and may in fact overlap considerably with other modes of perception, suggesting an emotional sensory system composed of multiple perceptual processes that are handled through similar channels.

Facial perception

A great deal of research on emotion perception concerns how people perceive emotion in others' facial expressions. Whether the emotion in a face is classified categorically or along the dimensions of valence and arousal, the face provides reliable cues to a person's subjective emotional state. As efficient as humans are at identifying and recognizing emotion in another's face, accuracy drops considerably for most emotions, with the exception of happiness, when facial features are inverted (i.e., the mouth placed above the eyes and nose). This suggests that a primary means of facial perception is the identification of spatial features that resemble a prototypical face, with two eyes above a nose, which is above a mouth; any other arrangement of features does not immediately constitute a face and requires additional spatial manipulation to be identified as face-like.

Discrete versus Dimensional Views

Research on the classification of perceived emotions has centered on a debate between two fundamentally distinct viewpoints. One side posits that emotions are separate and discrete entities, while the other holds that emotions can be classified as values along the dimensions of valence (positive versus negative) and arousal (calm/soothing versus exciting/agitating). Psychologist Paul Ekman proposed the discrete-emotion perspective with his groundbreaking work comparing emotion perception and expression between literate and preliterate cultures [1]. Ekman concluded that the ability to produce and perceive emotions is universal and innate, and that emotions manifest categorically as basic emotions (anger, disgust, fear, happiness, sadness, surprise, and possibly contempt). Pursuing an alternative hypothesis, psychologist James Russell proposed the circumplex model of affect, describing emotions as constructs that lie along the dimensions of valence and arousal, with the combination of these values delineating a given emotion [2]. Psychologist Robert Plutchik sought to reconcile these views and proposed that certain emotions be considered "primary emotions," grouped as positive or negative, which can then be combined to form more complex emotions, sometimes considered "secondary emotions," such as remorse, guilt, and submission. Plutchik created the "wheel of emotions" to outline his theory [3].

Culture

Culture plays a significant role in emotion perception, most notably in facial perception. While the features of the face all convey important information, the upper (eyes/brow) and lower (mouth/nose) regions have distinct qualities that can provide consistent or conflicting information. Because values, etiquette, and the quality of social interactions vary across cultures, facial perception is believed to be moderated accordingly. In Western cultures, where overt emotional expression is common, emotional information is obtained primarily from the features of the mouth, the most expressive part of the face. In Eastern cultures, where overt emotional expression is less common and the mouth therefore plays a lesser role in emotional expression, emotional information is more often obtained from the upper region of the face, primarily the eyes [4]. These cultural differences suggest a strong environmental and learned component in emotion perception.

Context

While much insight has been gained from learning what emotional information facial expressions have to offer, context plays an important role both in providing additional emotional information and in modulating what emotion is actually perceived in a facial expression. Contexts fall into three categories: stimulus-based context, in which a face is physically presented with other sensory input that has informational value; perceiver-based context, in which processes within the brain or body of a perceiver shape emotion perception; and cultural contexts, which affect either the encoding or the understanding of facial actions [5].

Auditory

In addition to information obtained through visual experience, the auditory system can provide important emotional information from the environment. Voices, screams, murmurs, and even music can convey emotion, and remarkably, emotional interpretations of sounds alone tend to be quite consistent.

Voice

Traditionally, emotion perception in the voice has been studied by analyzing, via prosodic parameters such as pitch and duration, the way a speaker expresses an emotion, a process known as encoding. Conversely, a listener who attempts to identify the particular emotion intended by a speaker is said to decode it. More sophisticated methods include manipulating or synthesizing important prosodic parameters in the speech signal (e.g., pitch, duration, loudness, voice quality) in both natural and simulated affective speech [6]. Pitch and duration contribute most to emotional recognition, while loudness appears to contribute little [7].

Music

Music has long been known to have emotional qualities and is a popular strategy for emotion regulation. When asked to rate the emotions present in Western art music, music professionals were able to identify all six basic emotions, with happiness and sadness the most represented, followed in decreasing order of importance by anger, fear, surprise, and disgust [8]. In addition, happiness, sadness, fear, and peacefulness can be perceived after very brief exposure, in as little as 9–16 seconds [9], and these emotions can also be identified in short instrumental-only selections unfamiliar to the listener [10].

Olfactory

Aromas and scents have been shown to influence mood, for example through aromatherapy, and research has found that humans can extract emotional information from scents much as they can from facial expressions and emotional music. Odors may exert their effects through learning and conscious perception, such that responses typically associated with particular odors are learned through association with matched emotional experiences. Research has documented that emotions elicited by odors, both pleasant and unpleasant, affect the same physiological correlates of emotion seen with other sensory modalities [11], suggesting that "physiological effects produced by odors are therefore simply the physiological sequelae of the psychological-emotional responses elicited by the odor, and are expected due to mind-body interactions" [12].

Somatic

Theories of emotion have focused on perception, subjective experience, and appraisal. Predominant theories of emotion and emotion perception differ in what type of emotion is perceived, how emotion is perceived in the body, and at what stage of an event emotion is perceived and translated into subjective, physical experience.

James-Lange Theory of Emotion

Following the influence of René Descartes and his ideas about the split between body and mind, William James proposed in 1884 that it is not that the human body acts in response to our emotional state, as common sense might suggest, but rather that we interpret our emotions on the basis of an already present bodily state. In James's words, "we feel sad because we cry, angry because we strike, afraid because we tremble, and neither we cry, strike, nor tremble because we are sorry, angry, or fearful, as the case may be." James believed that particular and distinct physical patterns map onto specific experienced emotions. Around the same time, the physician Carl Lange arrived at the same conclusion about the experience of emotions; the idea that felt emotion is the result of perceiving specific patterns of bodily responses is therefore called the James-Lange theory of emotion [13].

In support of the James-Lange theory, Silvan Tomkins proposed the facial feedback hypothesis in 1963, suggesting that facial expressions actually trigger the experience of emotions rather than the other way around. This idea was tested in 1974 by James Laird in an experiment in which participants held a pencil either between their teeth (artificially producing a smile) or between their upper lip and their nose (artificially producing a frown) and then rated cartoons. Laird found that the cartoons were rated as funnier by participants holding the pencil between their teeth. In addition, Paul Ekman recorded extensive physiological data while participants posed his basic emotional facial expressions and found that heart rate increased for sadness, fear, and anger yet did not change at all for happiness, surprise, or disgust, and that skin temperature rose when participants posed anger but not other emotions. While contemporary psychologists still find support for the James-Lange theory, human subjective emotion is complex, and physical reactions or antecedents do not fully explain the subjective emotional experience.

Cannon-Bard Theory of Emotion

Walter Bradford Cannon and his doctoral student Philip Bard agreed that physiological responses play a crucial role in emotion, but did not believe that physiological responses alone could explain subjective emotional experience. Cannon argued that physiological responses are too slow relative to the rapid and intense subjective awareness of emotion, and that the bodily responses accompanying different emotions are often similar and therefore not readily distinguishable on such a short timescale. Cannon proposed that the mind and body operate independently in the experience of emotion, such that different brain regions (cortex versus subcortex) process information from an emotion-producing stimulus independently and simultaneously, producing both an emotional and a physical response. This is best illustrated by imagining an encounter with a grizzly bear: you would simultaneously experience fear, begin to sweat, experience an elevated heart rate, and attempt to run, all at the same time [14].

Two-factor Theory of Emotion

Stanley Schachter and his doctoral student Jerome Singer formulated their theory of emotion based on evidence that, without an actual emotion-producing stimulus, people are unable to attribute specific emotions to their bodily states. They believed that there must be a cognitive component to emotion perception beyond physical changes and subjective feelings. Schachter and Singer suggested that when someone encounters an emotion-producing stimulus, they immediately interpret their bodily symptoms (sweating and an elevated heart rate, in the case of the grizzly bear) as the emotion fear. Their theory grew out of a study in which participants were injected with either a stimulant (adrenaline), which causes elevated heart rate, sweaty palms, and shaking, or a placebo. Participants were then either told the effects of the drug or told nothing, and were placed in a room with a stranger who, according to the research plan, would either play with a hula hoop and make paper airplanes (euphoric condition) or ask the participant intimate, personal questions (angry condition). Participants who knew the effects of the drug attributed their physical state to it, whereas those with no knowledge of the drug attributed their physical state to the situation with the other person in the room. These results led to the conclusion that physiological reactions contribute to emotional experience by facilitating a focused cognitive appraisal of a physiologically arousing event, and that this appraisal defines the subjective emotional experience. Emotions are thus the result of a two-stage process: first, physiological arousal in response to an evoking stimulus, and second, cognitive elaboration of the context in which the stimulus occurred [15].

Neural Bases

Emotion perception is primarily a cognitive process driven by particular brain regions believed to specialize in identifying emotional information and subsequently allocating appropriate cognitive resources to prepare the body to respond. The relationships among these regions remain unclear, but several key regions have been implicated in particular aspects of emotion perception and processing, including areas suspected of being involved in the processing of faces and emotional information.

Fusiform Face Area

The fusiform face area is believed by some to specialize in the identification and processing of human faces, although others argue that it supports expert discrimination among well-known object categories, such as cars and animals. Neuroimaging studies have found activation in the fusiform face area when participants view images of prototypical faces, but not scrambled or inverted faces, suggesting that this region is specialized for processing human faces. An inability to identify faces would greatly inhibit emotion perception and processing, with significant implications for social interaction and for appropriate biological responses to emotional information.

HPA axis

The hypothalamic-pituitary-adrenal (HPA) axis plays a role in emotion perception through its mediation of the physiological stress response. Hypothalamic corticotropin-releasing factor, also known as corticotropin-releasing hormone (CRH), is released from nerve terminals in the median eminence arising in the paraventricular nucleus; it stimulates the release of adrenocorticotropin (ACTH) from the anterior pituitary, which in turn induces the release of cortisol from the adrenal cortex. This cascade, culminating in the release of glucocorticoids that prepare the body to respond to environmental stimuli, is believed to be initiated by the amygdala, which evaluates the emotional significance of observed phenomena. Released glucocorticoids exert negative feedback on the system and on the hippocampus, which in turn helps shut off the biological stress response. Through this response, information is encoded as emotional and a bodily response is initiated, making the HPA axis an important component of emotion perception.

Amygdala

The amygdala appears to have a specific role in attention to emotional stimuli [16]. It is a small, almond-shaped region within the anterior part of the temporal lobe. Studies of non-human primates and of patients with amygdala lesions, together with functional neuroimaging studies, have demonstrated the importance of the amygdala in face and eye-gaze identification [17]. Other studies have emphasized its importance for identifying the emotional expressions of others, particularly threat-related emotions such as fear, but also sadness and happiness. The amygdala is also involved in responses to non-facial displays of emotion, including unpleasant auditory, olfactory, and gustatory stimuli, and in memory for emotional information [18]. The amygdala receives information from both the thalamus and the cortex; information from the thalamus is rough in detail but arrives very quickly, whereas information from the cortex is much more detailed but arrives more slowly [19]. In addition, the amygdala's role in modulating attention toward emotion-specific stimuli may occur via projections from its central nucleus to cholinergic neurons, which lower cortical neuronal activation thresholds and potentiate cortical information processing [20].

Disordered Emotion Perception

There are considerable individual differences in emotion perception, and certain groups of people display abnormal processing. Some disorders are partly characterized by maladaptive and abnormal emotion perception, while others, such as mood disorders, exhibit mood-congruent emotional processing. Whether abnormal processing exacerbates certain disorders or results from them remains unclear, but difficulties or deficits in emotion perception are common across various disorders.

Autism

Research on individuals with autism has found that autism is associated with impaired face discrimination and recognition, as well as atypical processing strategies: people with autism employ a piecemeal rather than configural strategy and show reduced attention to the eyes relative to individuals without autism. These impairments have been found in children with autism as early as 3 years of age [21]. Typically developing individuals show a processing-speed advantage for faces over non-facial stimuli, as faces provide nonverbal information important for survival [22]; however, research on emotional face perception in autistic individuals has documented no such advantage, along with abnormal cortical specialization for face processing. As symptoms of autism include deficits in social functioning, the early identification of irregular face processing may be one of the earliest identifiers of autism [23]. Despite these deficits, individuals with autism have also been found to have better memory for the lower half of a face and increased ability to identify partly obscured faces as well as isolated facial features [24] [25]. It has been hypothesized that people with autism have a deficit in social motivation [26], leading to reduced social experience, including experience with faces, which in turn may lead to abnormal cortical specialization for faces and decreased processing efficiency.

Schizophrenia

Individuals with schizophrenia have difficulty with all types of facial emotion perception [27] and, indeed, with facial perception more generally [28], which affects quality of life and psychosocial functioning. People with schizophrenia do not scan faces in the same manner as people without schizophrenia; most notably, they tend to ignore key facial features containing important emotional information and have trouble both identifying emotional faces and discriminating between two emotional faces [29]. One of the primary deficits in emotion perception in people with schizophrenia is difficulty incorporating contextual information into affective decisions, leading to the loss of important emotional information [30]. While the majority of patients with schizophrenia display these deficits, individuals with paranoid-type schizophrenia have been shown to be less impaired than those with other subtypes. Preliminary evidence also suggests greater impairment in emotion perception among individuals with later-onset schizophrenia compared with earlier onset. Neuropathological and structural neuroimaging studies of these patients have demonstrated abnormal neuronal cell integrity and volume reductions in the amygdala, insula, thalamus, and hippocampus [31], and studies employing functional neuroimaging techniques have demonstrated a failure to activate limbic regions in response to emotive stimuli [32].

Depression

In patients with major depressive disorder, studies have demonstrated either generalized or specific impairments in the identification of emotional facial expressions, or a bias towards the identification of expressions as sad [33]. Neuropathological and structural neuroimaging studies in patients with major depressive disorder have indicated abnormalities within the subgenual anterior cingulate gyrus and volume reductions within the hippocampus, ventral striatal regions, and amygdala [34].

Anxiety

Anxiety has commonly been associated with perceiving threat where none is in fact present [35] and with orienting more quickly to threatening cues than to other cues [36]. While the ability to perceive threatening information in one's environment is typically considered adaptive, anxious individuals' tendency to recognize threat in others quickly and to devote excessive attention to it means their anxiety escalates rapidly and they have difficulty regulating their emotions. The time course of anxiety's impact on the detection of threatening stimuli has long been debated: some research demonstrates enhanced early orienting toward threat [37], other work documents late-stage maintenance of attention on threat [38], and still other research attempts to reconcile these views with what is termed vigilance-avoidance, an early-stage enhanced orienting toward threat followed by later-stage avoidance of such material [39].

Posttraumatic stress disorder

Post-traumatic stress disorder (PTSD), a form of anxiety disorder, has been linked with abnormal attention toward threatening information, in particular threatening stimuli related to the personally relevant trauma; such a bias is appropriate in that context but maladaptive out of context [40]. Such processing of emotion can also alter an individual's ability to accurately assess others' emotions.

Adverse Life Experience

Child maltreatment and child abuse have been associated with emotion-processing biases, most notably toward the experience-specific emotion of anger [41]. For children who have suffered abuse, attending to angry emotion is believed to be adaptive, as anger may be a precursor to danger and harm, and quick identification of even mild anger cues can help a child escape the situation [42]. Some research has found that abused children exhibit attention biases toward angry faces [43] [44], tend to interpret even ambiguous faces as angry rather than as other emotions [45], and have difficulty disengaging from such expressions [46], while other research has found that abused children demonstrate attentional avoidance of angry faces [47]. These biases are adaptive only in anger-specific situations; when anger is over-identified they are considered maladaptive, and psychopathology may result from such disordered attention.

Research Methods

Researchers employ several methods designed to examine biases toward emotional stimuli, including the salience of particular emotional stimuli, population differences in emotion perception, and attentional biases toward or away from emotional material. Tasks commonly used in research include the modified Stroop task, the dot probe task, visual search tasks, and spatial cueing tasks.

Modified Stroop task

The modified Stroop task displays different types of words (e.g., threatening and neutral) in varying colors. The participant is asked to identify the color of each word while ignoring its semantic content. Increased response time when naming the color of threat words relative to neutral words suggests an attentional bias toward threat (Stroop, 1935). The Stroop task, however, poses some interpretational difficulties: a delayed response to threat words could reflect either enhanced attention to threat or a general threat-induced slowdown [48], and the task does not allow measurement of the spatial allocation of attention [49].
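
The interference effect can be summarized as the difference in mean color-naming response time between threat and neutral words. A minimal sketch in Python, using entirely hypothetical response times, illustrates the arithmetic:

```python
# Emotional Stroop interference: mean color-naming RT for threat words
# minus mean RT for neutral words. A positive difference is taken to
# suggest attentional capture by threatening content.
# All response times below are hypothetical, for illustration only.
from statistics import mean

threat_word_rts = [712, 745, 698, 730, 755]    # ms, color-naming of threat words
neutral_word_rts = [655, 670, 662, 648, 681]   # ms, color-naming of neutral words

interference = mean(threat_word_rts) - mean(neutral_word_rts)
print(f"Emotional Stroop interference: {interference:.1f} ms")
```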

Dot Probe

The dot probe task addresses some of the limitations of the Stroop task. It presents two words or pictures on a computer screen (one above the other, or side by side). The stimuli are displayed briefly, often for less than 1000 ms; after they disappear, a probe appears in the location of one of the stimuli and participants press a button to indicate the probe's location. Differences in response time between trials in which the probe replaces the target (e.g., threat) stimulus and trials in which it replaces the neutral stimulus indicate attentional biases, with shorter response times when the probe appears in the place of the target stimulus indicating an attentional bias toward that type of information [50].
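
An attentional bias index is commonly derived by subtracting mean response times on trials where the probe replaces the threat stimulus (congruent) from those where it replaces the neutral stimulus (incongruent). The following Python sketch, with hypothetical trial data, shows one way such an index might be computed:

```python
# Dot-probe bias index: RT when the probe replaces the neutral stimulus
# (incongruent) minus RT when it replaces the threat stimulus (congruent).
# A positive value is read as an attentional bias toward threat.
# All trial data are hypothetical.
from statistics import mean

trials = [
    # (probe_replaced_threat, reaction_time_ms)
    (True, 480), (True, 465), (True, 472),
    (False, 510), (False, 525), (False, 498),
]

congruent = [rt for replaced_threat, rt in trials if replaced_threat]
incongruent = [rt for replaced_threat, rt in trials if not replaced_threat]

bias_index = mean(incongruent) - mean(congruent)
print(f"Attentional bias toward threat: {bias_index:.1f} ms")
```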

Visual search task

The visual search task also examines spatial attentional allocation by asking participants to detect a target stimulus embedded in a matrix of distractors (e.g., an angry face among several neutral or other emotional faces). Conversely, a neutral target can be embedded among emotional stimuli serving as distractors. Faster detection of emotional targets among neutral stimuli, or slower detection of neutral targets among emotional distractors, indicates an attentional bias toward such stimuli [51] [52].
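
The inference rests on comparing detection times across search conditions. A brief, hypothetical Python sketch of that comparison:

```python
# Visual search: faster detection of an emotional target among neutral
# distractors (or slower detection of a neutral target among emotional
# distractors) is taken to indicate an attentional bias.
# All detection times are hypothetical.
from statistics import mean

emotional_target_rts = [820, 790, 805]   # ms, angry face among neutral faces
neutral_target_rts = [950, 910, 935]     # ms, neutral face among angry faces

advantage = mean(neutral_target_rts) - mean(emotional_target_rts)
print(f"Emotional targets detected {advantage:.0f} ms faster than neutral targets")
```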

Spatial cueing task

The spatial cueing task asks participants to fixate on a point between two rectangles, at which point a cue is presented, either as one of the rectangles lighting up or as an emotional stimulus appearing within one of the rectangles. Participants then press a button indicating the location of a target stimulus. Some trials are valid, in which the cue draws attention to the actual location of the target, while others are invalid, directing attention to the location opposite where the target will appear. Faster response times to validly cued targets following target-related (e.g., threatening) cues relative to neutral cues indicate an attentional bias toward such stimuli, as do slower response times on invalid trials with target-related cues [53] [54].
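
In analysis, a validity effect (invalid-trial minus valid-trial response time) is typically computed separately for emotional and neutral cues; a larger effect for emotional cues is read as faster orienting to, or slower disengagement from, such stimuli. A hypothetical Python sketch:

```python
# Spatial cueing: compute the validity effect (invalid RT minus valid RT)
# separately for threat and neutral cues. A larger effect for threat cues
# suggests an attentional bias toward threatening stimuli.
# All response times are hypothetical.
from statistics import mean

rts_ms = {
    ("threat", "valid"):    [410, 402, 418],
    ("threat", "invalid"):  [495, 503, 488],
    ("neutral", "valid"):   [420, 415, 425],
    ("neutral", "invalid"): [462, 470, 455],
}

for cue in ("threat", "neutral"):
    validity_effect = mean(rts_ms[(cue, "invalid")]) - mean(rts_ms[(cue, "valid")])
    print(f"{cue} cue validity effect: {validity_effect:.1f} ms")
```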

See also

References

  1. ^ Ekman, P. (1993). Facial expression of emotion. American Psychologist, 48, 384–392.
  2. ^ Russell, J.A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39, 1161–1178.
  3. ^ Plutchik, R. (2002). Nature of emotions. American Scientist, 89, 349.
  4. ^ Yuki, M., Maddux, W. W., & Masuda, T. (2007). Are the windows to the soul the same in the East and West? Cultural differences in using the eyes and mouth as cues to recognize emotions in Japan and the United States. Journal of Experimental Social Psychology, 43(2), 303–311. doi:10.1016/j.jesp.2006.02.004
  5. ^ Barrett, L. F., Mesquita, B., & Gendron, M. (2011). Context in Emotion Perception. Current Directions in Psychological Science, 20(5), 286–290. doi:10.1177/0963721411422522.
  6. ^ Cummings, K.E., & Clements, M.A. (1995). Analysis of the glottal excitation of emotionally styled and stressed speech. Journal of the Acoustical Society of America, 98, 88–98.
  7. ^ Frick, R.W. (1985). Communicating emotion: the role of prosodic features. Psychological Bulletin, 97, 412–429.
  8. ^ Kallinen, K. (2005). Emotional ratings of music excerpts in the Western art music repertoire and their self-organization in the Kohonen Neural Network. Psychology of Music, 33(4), 373–393.
  9. ^ Vieillard, S., Peretz, I., Gosselin, N., Khalfa, S., Gagnon, L., & Bouchard, B. (2008). Happy, sad, scary and peaceful musical excerpts for research on emotions. Cognition & Emotion, 22(4), 720–752.
  10. ^ Mohn, C., Argstatter, H., & Wilker, F. W. (2011). Perception of six basic emotions in music. Psychology of Music, 39(4), 503–517. doi:10.1177/0305735610378183.
  11. ^ Alaoui-Ismaili, O., Robin, O., Rada, H., Dittmar, A., & Vernet-Maury, E. (1997). Basic emotions evoked by odorants: Comparison between autonomic responses and self-evaluation. Physiology & Behavior, 62, 713–720.
  12. ^ Herz, R. S. (2009). Aromatherapy Facts and Fictions: A Scientific Analysis of Olfactory Effects on Mood, Physiology and Behavior. International Journal of Neuroscience, 119(2), 263–290. doi:10.1080/00207450802333953
  13. ^ Cannon, Walter B. (1927). "The James-Lange theory of emotion: A critical examination and an alternative theory". The American Journal of Psychology. 39: 106–124.
  14. ^ Cannon, Walter B. (1929). "Organization for Physiological Homeostasis". Physiological Reviews. 9 (3): 399–421.
  15. ^ Schachter, S., & Singer, J. (1962). Cognitive, Social, and Physiological Determinants of Emotional State. Psychological Review, 69, pp. 379–399
  16. ^ Davis M, Whalen PJ (2001): The amygdala: Vigilance and emotion. Mol Psychiatry 6:3–34.
  17. ^ Davis M, Whalen PJ (2001): The amygdala: Vigilance and emotion. Mol Psychiatry 6:3–34.
  18. ^ Calder AJ, Lawrence AD, Young AW (2001): Neuropsychology of fear and loathing. Nat Rev Neurosci 2:352–363.
  19. ^ LeDoux, J. E. (1995). Emotion: Clues from the brain. Annual Review of Psychology, 46(1), 209–235.
  20. ^ Whalen PJ (1998): Fear, vigilance, and ambiguity: Initial neuroimaging studies of the human amygdala. Curr Directions Psychol Sci 7:177–188.
  21. ^ Dawson, G., Webb, S. J., & McPartland, J. (2005). Understanding the nature of face processing impairment in autism: insights from behavioral and electrophysiological studies. Developmental Neuropsychology, 27(3), 403–424.
  22. ^ Darwin, C. (1965). The expression of the emotions in man and animals. London: Murray. (Original work published 1872)
  23. ^ Osterling, J., & Dawson, G. (1994). Early recognition of children with autism: A study of first birthday home videotapes. Journal of Autism & Developmental Disorders, 24, 247–257.
  24. ^ Tantam, D., Monoghan, L., Nicholson, H., & Stirling, J. (1989). Autistic children’s ability to interpret faces: A research note. Journal of Child Psychology and Psychiatry, 30, 623–630.
  25. ^ Langdell, T. (1978). Recognition of faces: An approach to the study of autism. Journal of Child Psychology and Psychiatry and Allied Disciplines, 19, 255–268.
  26. ^ Dawson, G., Carver, L., Meltzoff, A., Panagiotides, H., McPartland, J., & Webb, S. (2002). Neural correlates of face and object recognition in young children with autism spectrum disorder, developmental delay, and typical development. Child Development, 73, 700–717.
  27. ^ Kohler, C. G., Walker, J. B., Martin, E. A., Healey, K. M., & Moberg, P. J. (2010). Facial Emotion Perception in Schizophrenia: A Meta-analytic Review. Schizophrenia Bulletin, 36(5), 1009–1019.
  28. ^ Kerr, S. L., & Neale, J. M. (1993). Emotion perception in schizophrenia: Specific deficit or further evidence of generalized poor performance? Journal of Abnormal Psychology, 102, 312–318.
  29. ^ Kohler, C. G., Walker, J. B., Martin, E. A., Healey, K. M., & Moberg, P. J. (2010). Facial Emotion Perception in Schizophrenia: A Meta-analytic Review. Schizophrenia Bulletin, 36(5), 1009–1019.
  30. ^ Kring, A. M., & Campellone, T. R. (2012). Emotion Perception in Schizophrenia: Context Matters. Emotion Review, 4(2), 182–186. doi:10.1177/1754073911430140
  31. ^ Wright, I. C., Rabe-Hesketh, S., Woodruff, P. W. R., et al (2000) Regional brain structure in schizophrenia: a meta-analysis of volumetric MRI studies. American Journal of Psychiatry, 157, 16–25.
  32. ^ Crespo-Facorro, B., Paradiso, S., Andreasen, N. C., et al (2001) Neural mechanisms of anhedonia in schizophrenia. JAMA, 286, 427–435.
  33. ^ Gur, R. C., Erwin, R. J., Gur, R. E., et al (1992) Facial emotion discrimination. II: Behavioral findings in depression. Psychiatry Research, 42, 241–251.
  34. ^ Phillips, M. L. (2003). Understanding the neurobiology of emotion perception: implications for psychiatry. The British Journal of Psychiatry, 182(3), 190–192. doi:10.1192/bjp.02.185
  35. ^ Richards, A., French, C. C., Calder, A. J., Webb, B., Fox, R., & Young, A. W. (2002). Anxiety-related bias in the classification of emotionally ambiguous facial expressions. Emotion, 2, 273–287.
  36. ^ Bar-Haim, Y., Lamy, D., Pergamin, L., Bakermans-Kranenburg, M. J., & van IJzendoorn, M. H. (2007). Threat-related attentional bias in anxious and nonanxious individuals: A meta-analytic study. Psychological Bulletin, 133, 1–24. doi:10.1037/0033-2909.133.1.1
  37. ^ Williams, J. M. G., Watts, F. N., MacLeod, C., & Matthews, A. (1988). Cognitive psychology and emotional disorders. Chichester, England: Wiley.
  38. ^ Foa, E. B., & Kozak, M. J. (1986). Emotional processing of fear: Exposure to corrective information. Psychological Bulletin, 99, 20–35.
  39. ^ Amir, N., Foa, E. B., & Coles, M. E. (1998). Automatic activation and strategic avoidance of threat-relevant information in social phobia. Journal of Abnormal Psychology, 107, 285–290.
  40. ^ Buckley, T. C., Blanchard, E. B., & Neill, W. T. (2000). Information processing and PTSD: A review of the empirical literature. Clinical Psychology Review, 28(8), 1041–1065.
  41. ^ Cicchetti, D., Toth, S. L., & Maughan, A. (2000). An ecological- transactional model of child maltreatment. In A. J. Sameroff, M. Lewis, & S. M. Miller (Eds.), Handbook of developmental psychopathology (2nd ed., pp. 689-722). New York: Kluwer Academic/Plenum.
  42. ^ Pollak, S. D. (2003). Experience-dependent affective learning and risk for psychopathology in children. In J. A. King, C. F. Ferris, & I. I. Lederhendler (Eds.), Roots of mental illness in children (pp. 102-111). New York: Annals of the New York Academy of Sciences.
  43. ^ Pine, D. S., Mogg, K., Bradley, B. P., Montgomery, L., Monk, C. S., McClure, E., et al. (2005). Attention bias to threat in maltreated children: Implications for vulnerability to stress-related psychopathology. American Journal of Psychiatry, 162, 291-296.
  44. ^ Pollak, S. D., & Tolley-Schell, S. A. (2003). Selective attention to facial emotion in physically abused children. Journal of Abnormal Psychology, 112, 323-338.
  45. ^ Pollak, S. D., & Kistler, D. J. (2002). Early experience is associated with the development of categorical representations for facial expressions of emotion. Proceedings of the National Academy of Sciences, 99, 9072-9076.
  46. ^ Pollak, S. D. (2003). Experience-dependent affective learning and risk for psychopathology in children. In J. A. King, C. F. Ferris, & I. I. Lederhendler (Eds.), Roots of mental illness in children (pp. 102-111). New York: Annals of the New York Academy of Sciences.
  47. ^ Pine, D. S., Mogg, K., Bradley, B. P., Montgomery, L., Monk, C. S., McClure, E., et al. (2005). Attention bias to threat in maltreated children: Implications for vulnerability to stress-related psychopathology. American Journal of Psychiatry, 162, 291-296.
  48. ^ Algom, D., Chajut, E., & Lev, S. (2004). A rational look at the emotional Stroop paradigm: A generic slowdown, not a Stroop effect. Journal of Experimental Psychology: General, 133, 323−338.
  49. ^ MacLeod, C., Mathews, A., & Tata, P. (1986). Attentional bias in the emotional disorders. Journal of Abnormal Psychology, 95, 15–20.
  50. ^ MacLeod, C., Mathews, A., & Tata, P. (1986). Attentional bias in the emotional disorders. Journal of Abnormal Psychology, 95, 15–20.
  51. ^ Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466−478.
  52. ^ Rinck, M., Becker, E. S., Kellermann, J., & Roth, W. T. (2003). Selective attention in anxiety: Distraction and enhancement in visual search. Depression and Anxiety, 18, 18−28.
  53. ^ Fox, E., Russo, R., Bowles, R., & Dutton, K. (2001). Do threatening stimuli draw or hold visual attention in subclinical anxiety? Journal of Experimental Psychology: General, 130, 681−700.
  54. ^ Posner, M. I. (1980). Orienting of attention. Quarterly Journal of Experimental Psychology, 32, 3−25.

External links