
User:Redtotebag/Social cue

From Wikipedia, the free encyclopedia

Article Draft

Lead

Article Plan:

  1. Reorganize the article by switching the order of the mechanisms and nonverbal cues sections.
  2. Add a missing citation in the facial cues section about gaze cues and overgeneralization.
  3. Expand the discussion of why we use nonverbal cues, such as gaze cues, where the article could be more detailed.
  4. Add a few more examples of motion cues, with descriptions of each, to provide more detail.

Mechanisms

Recent work in the field of social cues has found that the perception of social cues is best described as the combination of multiple cues and processing streams, a process referred to as cue integration. Stimuli are processed through experience sharing and mentalizing, and the likelihood of the other person's internal state is inferred through Bayesian logic. Experience sharing is a person's tendency to take on another person's facial expressions, posture, and internal state, and is often related to empathy. A perceptually salient stimulus can automatically trigger bottom-up processing, whereas cognitive intentions or goals drive controlled, calculated top-down processing. In spatial cueing studies, a peripheral cue is one that does not give away information about a target's location. Naturally, only the most relevant contextual cues are processed, and this occurs extremely fast (approximately 100–200 milliseconds). This type of fast, automatic processing is often referred to as intuition and allows us to integrate complex, multi-dimensional cues and generate suitable behaviour in real time.
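The Bayesian combination of cues described above can be sketched in a few lines of code. This is an illustrative toy model, not an implementation from the literature: the cue names, likelihood values, and two-state hypothesis space are all hypothetical, chosen only to show how independent cues multiply into a posterior over another person's internal state.

```python
# Toy sketch of Bayesian cue integration: two conditionally independent
# social cues (e.g. a facial-expression reading and a posture reading)
# are combined to infer the probability of another person's internal state.
# All numbers are hypothetical and chosen only to illustrate the arithmetic.

def integrate_cues(prior, likelihoods):
    """Return the posterior P(state | cues), assuming independent cues.

    prior       -- dict mapping state -> P(state)
    likelihoods -- list of dicts, each mapping state -> P(cue | state)
    """
    posterior = {}
    for state, p in prior.items():
        for lik in likelihoods:
            p *= lik[state]          # multiply in each cue's likelihood
        posterior[state] = p
    total = sum(posterior.values())  # normalize so probabilities sum to 1
    return {state: p / total for state, p in posterior.items()}

prior = {"happy": 0.5, "sad": 0.5}
face = {"happy": 0.8, "sad": 0.3}      # hypothetical P(smile | state)
posture = {"happy": 0.6, "sad": 0.2}   # hypothetical P(upright | state)

result = integrate_cues(prior, [face, posture])
```

With these made-up numbers, both cues favour "happy", so the posterior concentrates there; either cue alone would produce a weaker inference, which is the sense in which integration of multiple cues sharpens the estimate.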

Cognitive learning models illustrate how people associate cues with particular outcomes or responses. Learning can strengthen associations between predictive cues and outcomes and weaken the link between nonpredictive cues and outcomes. Collins et al. have focused on two learning phenomena from the EXIT model. The first is blocking, which occurs when a new cue is introduced alongside a cue that already has meaning. The second is highlighting, which occurs when an individual pays close attention to a new cue that changes the meaning of a cue they already know. When a new cue is added alongside a previous one, individuals are said to focus on the new cue to gain a better understanding of what is going on.
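The blocking phenomenon can be illustrated with a simple error-driven (Rescorla–Wagner-style) weight update. This is a deliberate simplification of the attention-based EXIT model, not an implementation of it, and the cue labels, learning rate, and trial counts are all made up for illustration.

```python
# Simplified error-driven cue-outcome learning, illustrating blocking.
# A toy stand-in for the EXIT model: one associative weight per cue,
# updated in proportion to a shared prediction error.

def train(trials, w=None, lr=0.2, n_passes=50):
    """Learn cue weights from (cues, outcome) trials, starting from w."""
    w = dict(w or {})
    for _ in range(n_passes):
        for cues, outcome in trials:
            pred = sum(w.get(c, 0.0) for c in cues)
            error = outcome - pred            # shared prediction error
            for c in cues:
                w[c] = w.get(c, 0.0) + lr * error
    return w

# Phase 1: cue A alone fully predicts the outcome.
w1 = train([(["A"], 1.0)])

# Phase 2: a new cue B appears together with A and the same outcome.
# Because A already predicts the outcome, the error is near zero and
# B acquires almost no weight -- the blocking effect.
w2 = train([(["A", "B"], 1.0)], w1)
```

Highlighting is not captured by this sketch, since it depends on the attention-shifting machinery that the full EXIT model adds on top of this kind of error-driven update.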

Brain regions involved in processing

Benjamin Straube, Antonia Green, Andreas Jansen, Anjan Chatterjee, and Tilo Kircher found that social cues influence the neural processing of speech-gesture utterances. Past studies have treated mentalizing as part of the perception of social cues, a process believed to rely on a dedicated neural system.

When people focus on things in a social context, the medial prefrontal cortex and precuneus are activated; when people focus on a non-social context, these areas are not activated. Straube et al. hypothesized that the brain areas involved in mentalizing are primarily responsible for social cue processing. They expected the left temporal and occipital regions to be activated by iconic gestures, the temporal poles by emblematic gestures, and the left frontal gyrus by abstract speech and gestures. After conducting an experiment on how body position, speech, and gesture affected activation in different brain areas, Straube et al. came to the following conclusions:

  1. When an actor faced the viewer head on, the occipital, inferior frontal, medial frontal, right anterior temporal, and left hemispheric parietal cortices were activated.
  2. When participants watched an actor deliver a speech about another person, an extended network of bilateral temporal and frontal regions was activated.
  3. When participants watched an actor talk about objects while making iconic gestures, occipito-temporal and parietal brain areas were activated. Straube et al. concluded that the processing of speech-gesture information is affected by context-dependent social cues.

The amygdala, fusiform gyrus, insula, and superior and middle temporal regions have been identified as brain areas that play a role in processing visual emotional cues. Greater activation was found in the bilateral anterior superior temporal gyrus and bilateral fusiform gyrus in response to emotional stimuli. The amygdala has been connected with the automatic evaluation of threat, facial valence information, and the trustworthiness of faces.

When it comes to visual cues, individuals follow the gaze of others to find out what they are looking at. This response is thought to be evolutionarily adaptive because it can alert others to happenings in the environment. In spatial cueing studies, peripheral cues often indicate the target's correct location only about half the time, making them uninformative. Studies have shown that directed gaze affects attentional orienting in a seemingly automatic manner: the part of the brain involved when another person averts their gaze is also involved in attentional orienting. Past researchers have found that arrow cues are linked to fronto-parietal areas, whereas both arrow and gaze cues are linked to occipito-temporal areas; gaze cues may therefore rely on automatic processes more than arrow cues do. Eye gaze appears to have grown in importance over evolutionary time.

Higher-level visual regions such as the fusiform gyrus, extrastriate cortex, and superior temporal sulcus (STS) have been linked to the perceptual processing of social and biological stimuli. Behavioral studies have found a left visual field advantage for face and gaze stimuli, consistent with right-hemisphere processing. Researchers believe the right STS is also involved in using gaze to understand the intentions of others. When social and nonsocial cues are compared, higher activity has been found in the bilateral extrastriate cortices for gaze cues than for peripheral cues. A study of two split-brain patients examined each hemisphere's involvement in gaze cueing. The results suggest that gaze cues produce a stronger effect in the face-recognition hemisphere of the brain than nonsocial cues do. Greene and Zaidel's results suggest that information in the two visual fields is processed independently and that the right hemisphere shows greater orienting.

Regarding emotional expression, the superior temporal cortex has been shown to be active in studies of facial perception, whereas the inferior temporal and fusiform cortices are active for face identity. During facial processing, the amygdala and fusiform gyrus show a strong functional connection. Face identification can be impaired by damage to the orbitofrontal cortex (OFC). The amygdala is active during facial expressions and improves long-term memory for emotional stimuli; face-responsive neurons have also been found in the amygdala. The connections between the amygdala, OFC, and other medial temporal lobe structures suggest that they play an important role in working memory for social cues. Systems critical to perceptually identifying and processing emotion and identity must cooperate to maintain representations of social cues.

The hippocampus and orbitofrontal cortex may be crucial for monitoring the changing facial expressions of individuals and thus for guiding real-world social behavior in social gatherings. The hippocampus may help use social cues to link the numerous appearances of the same person over short delays. Because the orbitofrontal cortex is important in processing social cues, researchers believe it works with the hippocampus to create, maintain, and retrieve working-memory representations of the same individual seen with multiple facial expressions. After encountering the same person multiple times with different social cues, the right lateral orbitofrontal cortex and hippocampus are more strongly engaged and show a stronger functional connection when disambiguating each encounter with that individual. In an fMRI study, the lateral orbitofrontal cortex, hippocampus, and bilateral fusiform gyrus were activated on meeting the same person again after having previously seen two different social cues, suggesting that these brain areas help retrieve correct information about one's last encounter with that person. The ability to separate encounters with different people seen with different social cues is thought to permit suitable social interactions. Ross, LoPresti and Schon propose that the orbitofrontal cortex and hippocampus contribute to both working memory and long-term memory, permitting flexibility in encoding separate representations of an individual across the varying social contexts in which we encounter them.

Oxytocin has been called "the social hormone". Research on rats provides strong evidence that social contact raises oxytocin levels in the brain, which then sets the stage for social bonds. In recent years it has been found that inhaling oxytocin through the nasal passage increases trust toward strangers and increases a person's ability to perceive social cues. In women, oxytocin was found to increase face-induced amygdala activation. Oxytocin has also been found to increase attention shifts to the eye region of a face, suggesting that it alters the brain's readiness for socially meaningful stimuli. Dopamine neurons in the ventral tegmental area code the salience of social as well as nonsocial stimuli. Bartz et al. found that the effects of oxytocin are person-dependent: each individual is affected differently, especially those who have trouble in social situations. Research by Groppe et al. supports the view that oxytocin enhances the motivational salience of social cues. Oxytocin has also been found to increase attention to socially relevant cues.

Examples of social cues

Nonverbal cues

Roles of nonverbal cues

Nonverbal communication is any communication based on facial expressions, body language, or vocal expression that does not use words. Nonverbal cues consist of anything one does with the face, body, or nonlinguistic voice that others can perceive and may respond to.[1] The main role of nonverbal cues is communication. These cues help us connect with others and communicate emotions, moods, instructions, and many other things through facial cues, motion cues, and body language.[2] Understanding and using nonverbal cues can also help people not only in day-to-day life but in situations such as interviews, leadership roles, service roles, and educational roles.[2]

Facial cues

Facial expressions are signals we make by moving our facial muscles. They generally signify an emotional state, and each emotional state or state of mind tends to have a specific facial expression, many of which are used universally around the world. Without seeing someone's facial expression, one would not be able to tell whether the other person is crying, happy, angry, etc. Furthermore, facial expressions help us comprehend what is going on in situations that are difficult or confusing.

Facial cues refer not only to explicit expressions but also to facial appearance. People gather a wealth of information from a person's face in the blink of an eye, such as gender, emotion, physical attractiveness, competence, threat level, and trustworthiness. Facial perception is one of the most highly developed human skills. The face is one of the greatest representations of a person, and it allows others to gain information that is helpful in social interaction. The fusiform face area of the human brain plays a large role in face perception and recognition; however, it does not appear to support emotion recognition, perception of emotional tone, shared attention, impulsive activation of person knowledge, or trait inferences based on facial appearance.

The fallacy of inferring people's personality traits from their facial appearance is referred to as the overgeneralization effect.[3] For instance, babyface overgeneralization produces the biased perception that people whose facial features resemble those of children have childlike traits (e.g. weakness, honesty, a need to be protected), and an attractive face leads to judgements of positive attributes such as social competence, intelligence, and health. It is mainly facial features that signal low fitness (anomalous face overgeneralization), age (babyface overgeneralization), emotion (emotion face overgeneralization), or a particular identity (familiar face overgeneralization) that affect impression formation, and even a trace of these qualities can trigger such a response. These effects persist despite a general awareness that such impressions most likely do not represent a person's true character.

The eyes are an important tool for communication in social interactions. Gaze cues are among the most informative social stimuli, as they convey basic emotions (e.g. sadness, fear) and reveal a great deal about a person's social attention. Infants as young as 12 months respond to the gaze of adults, which indicates that the eyes are an important way to communicate even before spoken language develops.[4] Eye gaze direction conveys a person's social attention, and eye contact can guide and capture attention as well as act as a signal of attraction. To utilize and follow gaze cues, people must first detect and orient to others' eyes. Real-world examples show that the degree to which we seek and follow gaze cues may change depending on how closely an experimental setting resembles a real social interaction. People may also follow gaze in order to avoid social interactions. Past experiments have found that eye contact was more likely, and lasted longer, when a speaker's face was visible. Individuals seek and follow gaze cues when information is not provided verbally, but do not seek gaze cues when they are unavailable or when spoken instructions contain all of the relevant information.

Motion cues

See also: Posture (psychology), Gesture, and Gestures in language acquisition

Body language and body posture are other social cues that we use to interpret how someone else is feeling. Apart from facial expressions, they are the main nonverbal social cues that we use. For instance, body language can be used to establish personal space, the amount of space a person needs in order to feel comfortable. Taking a step back can therefore be a social cue indicating a violation of personal space.

People pay attention to motion cues even when other visual cues (e.g. facial expressions) are present. Even brief displays of body motion can influence social judgements or inferences about a person's personality, mating behaviour, and attractiveness. For example, high-amplitude motion might indicate extraversion, and vertical movements might create an impression of aggression.

Gestures are specific motions made with the hands to further communicate a message. Certain gestures, such as pointing, gazes, and nods, can help direct people's focus to important events around them. Using gestures not only helps the speaker better process what they are saying, but also helps the listener better comprehend what the speaker is saying.


References

  1. ^ Hall, Judith A.; Horgan, Terrence G.; Murphy, Nora A. (2019-01-04). "Nonverbal Communication". Annual Review of Psychology. 70 (1): 271–294. doi:10.1146/annurev-psych-010418-103145. ISSN 0066-4308.
  2. ^ a b Carmichael, Cheryl L.; Mizrahi, Moran (2023-10-01). "Connecting cues: The role of nonverbal cues in perceived responsiveness". Current Opinion in Psychology. 53: 101663. doi:10.1016/j.copsyc.2023.101663. ISSN 2352-250X.
  3. ^ Zebrowitz, Leslie A.; Rhodes, Gillian (2004-09-01). "Sensitivity to "Bad Genes" and the Anomalous Face Overgeneralization Effect: Cue Validity, Cue Utilization, and Accuracy in Judging Intelligence and Health". Journal of Nonverbal Behavior. 28 (3): 167–185. doi:10.1023/B:JONB.0000039648.30935.1b. ISSN 1573-3653.
  4. ^ Brooks, Rechele; Meltzoff, Andrew N. (November 2002). "The Importance of Eyes: How Infants Interpret Adult Looking Behavior". Developmental Psychology. 38 (6): 958–966. doi:10.1037//0012-1649.38.6.958. ISSN 0012-1649. PMC 1351351. PMID 12428707.