Social cues are verbal or non-verbal signals expressed through the face, body, voice, and motion (among other channels) that guide conversations and other social interactions by influencing our impressions of and responses to others. These percepts are important communicative tools: they convey social and contextual information and thereby facilitate social understanding.
A few examples of social cues include:
- facial expressions
- eye gaze
- vocal tone
- gestures
- body language and posture
Social cues are part of social cognition and serve several purposes in navigating the social world. As social animals, humans rely heavily on the ability to understand other people's mental states and to forecast their behaviour. From an evolutionary perspective, this ability is critical for detecting potential threats or advantageous opportunities, and for forming and maintaining the relationships that fulfil safety and basic physiological needs. Social cues allow us to infer other people's meanings and intentions so that we can respond in an efficient and adaptive manner, and to anticipate how others might respond to our own choices. For instance, people have been found to behave more prosocially in economic games when they are being watched, which signals potential reputational risk (the watching-eye effect).
The ability to perceive social signals and integrate them into judgements about others' intentional mental states (e.g., beliefs, desires, emotions, knowledge) is often referred to as theory of mind or mentalization, and is evident from around 18 months of age.
Processing and decoding social cues is an important part of everyday human interaction (e.g. turn-taking in conversation) and therefore a critical skill for communication and social understanding. Taking into account other people's internal states, such as thoughts or emotions, is a critical part of forming and maintaining relationships. The social monitoring system attunes individuals to external information regarding social approval and disapproval by increasing interpersonal sensitivity, the “attention to and accuracy in decoding interpersonal social cues” relevant to gaining inclusion. Accurately detecting both positive and negative cues allows people to behave adaptively and avoid future rejection, which in turn produces greater social inclusion. A heightened need for social inclusion due to situational events (e.g. rejection) activates greater social monitoring, and individuals who generally experience stronger belonging needs tend to show greater interpersonal sensitivity. However, this mechanism should not be confused with rejection sensitivity, a bias toward decoding ambiguous social cues as signs of rejection.
Under-developed awareness of social cues can make interaction in social situations challenging. There are various mental disorders like schizophrenia that impair this ability and therefore make effective communication as well as forming relationships with others difficult for the affected person. Additionally, research shows that older adults have difficulties in extracting and decoding social cues from the environment, especially about human agency and intentionality. Children rely more on social cues than adults as they use them in order to comprehend and learn about their surroundings.
Nonverbal cues
Facial cues
Facial expressions are signals that we make by moving the muscles of the face. They generally signify an emotional state, and each emotional state and/or state of mind has a characteristic facial expression, many of which are used universally around the world. Without seeing someone's facial expression, one could not tell whether the other person is crying, happy, angry, etc. Facial expressions also help us comprehend what is going on in situations that are difficult or confusing.
Facial cues refer not only to explicit expressions but also to facial appearance. People gather a wealth of information from a face in the blink of an eye, such as gender, emotion, physical attractiveness, competence, threat level and trustworthiness. Facial perception is one of the most highly developed human skills, and the face is one of the greatest representations of a person: it allows others to gain information that is helpful in social interaction. The fusiform face area of the human brain plays a large role in face perception and recognition; however, it does not appear to be involved in processing emotion recognition, emotional tone, shared attention, impulsive activation of person knowledge, or trait implications based on facial appearance.
The fallacy of inferring people's personality traits from their facial appearance is referred to as the overgeneralisation effect. For instance, baby-face overgeneralisation produces the biased perception that people whose facial features resemble those of children have childlike traits (e.g. weakness, honesty, a need to be protected), while an attractive face invites judgements of positive personality traits such as being socially competent, intelligent and healthy. Facial features resembling low fitness (anomalous-face overgeneralisation), age (baby-face overgeneralisation), emotion (emotion-face overgeneralisation) or a particular identity (familiar-face overgeneralisation) chiefly affect impression formation, and even a trace of these qualities can trigger such a response. These effects persist despite a general awareness that the resulting impressions most likely do not represent a person's true character.
The eyes are an important tool for communication in social interactions. Gaze cues are among the most informative social stimuli, as they can convey basic emotions (e.g. sadness, fear) and reveal a great deal about a person's social attention. Infants as young as 12 months respond to the gaze of adults, which indicates that the eyes are an important channel of communication even before spoken language develops. Eye-gaze direction conveys a person's social attention, and eye contact can guide and capture attention as well as act as a signal of attraction. To utilize and follow gaze cues, people must first detect and orient to other people's eyes. Real-world studies suggest that the degree to which we seek and follow gaze cues changes with how closely the setting resembles a real social interaction; people may also monitor others' gaze in order to avoid social interactions. Past experiments have found that eye contact was more likely, and lasted longer, when a speaker's face was available. Individuals seek and follow gaze cues to obtain information that is not provided verbally; however, they do not seek gaze cues when none are available or when spoken instructions contain all of the relevant information.
Motion cues
Body language and body posture are other social cues that we use to interpret how someone else is feeling. Apart from facial expressions, they are the main non-verbal social cues we rely on. For instance, body language can be used to establish personal space, the amount of space a person needs in order to be comfortable. Taking a step back can therefore be a social cue indicating a violation of personal space.
People pay attention to motion cues even when other visual cues (e.g. facial expression) are present. Even brief displays of body motion can influence social judgements, including inferences about a person's personality and mating behaviour as well as judgements of attractiveness. For example, a high amplitude of motion might indicate extraversion, and vertical movements might create an impression of aggression.
Gestures are specific motions made with the hands to further communicate a message. Certain gestures, such as pointing, can help direct people's focus to what is important in their surroundings. Using gestures not only helps speakers better process what they are saying, but also helps listeners better comprehend it.
Mechanisms
Recent work in the field studying social cues has found that perception of social cues is best described as the combination of multiple cues and processing streams, also referred to as cue integration. Stimuli are processed through experience sharing and mentalizing, and the likelihood of the other person's internal state is inferred by Bayesian logic. Experience sharing is a person's tendency to take on another person's facial expressions, posture and internal state, and is often related to empathy. Attention can be captured automatically by a perceptually salient stimulus (bottom-up) or directed deliberately by cognitive intentions or goals (top-down); in experiments, spatial cuing is often measured with a peripheral cue that does not give away information about a target's location. Naturally, only the most relevant contextual cues are processed, and this occurs extremely fast (approx. 100-200 milliseconds). This type of fast, automatic processing is often referred to as intuition and allows us to integrate complex multi-dimensional cues and generate suitable behaviour in real time.
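The Bayesian inference described above can be made concrete with a small sketch. This is a hypothetical illustration, not a model from any cited study: the cue names, candidate internal states and all probabilities are invented for demonstration. Each observed cue contributes a likelihood, and the posterior over the other person's internal state is the normalized product of the prior and the likelihoods.

```python
# Minimal sketch of Bayesian cue integration: inferring another person's
# internal state from several social cues. All numbers are illustrative
# assumptions, not empirical values.

states = ["happy", "sad", "angry"]
prior = {"happy": 1 / 3, "sad": 1 / 3, "angry": 1 / 3}

# P(cue | state) for each hypothetical cue channel (face, voice, posture).
likelihood = {
    "smile":        {"happy": 0.8, "sad": 0.1, "angry": 0.1},
    "flat_voice":   {"happy": 0.2, "sad": 0.6, "angry": 0.2},
    "open_posture": {"happy": 0.6, "sad": 0.2, "angry": 0.2},
}

def integrate_cues(observed_cues):
    """Combine cues by multiplying their likelihoods with the prior,
    then normalizing (naive-Bayes-style cue integration)."""
    posterior = dict(prior)
    for cue in observed_cues:
        for s in states:
            posterior[s] *= likelihood[cue][s]
    total = sum(posterior.values())
    return {s: p / total for s, p in posterior.items()}

post = integrate_cues(["smile", "open_posture"])
best = max(post, key=post.get)  # most likely internal state given the cues
```

With a smile and an open posture observed, "happy" dominates the posterior; adding a conflicting cue such as a flat voice would pull probability mass back toward "sad", which is the sense in which multiple processing streams are integrated.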
Cognitive learning models illustrate how people connect cues with certain outcomes or responses. Learning can strengthen associations between predictive cues and outcomes and weaken the link between non-predictive cues and outcomes. Collins et al. have focused on two learning phenomena from the EXIT model. The first is blocking, which happens when a new cue is introduced alongside a cue that already has meaning: because the established cue already accounts for the outcome, little is learned about the new one. The second is highlighting, which happens when an individual pays close attention to a cue that changes the meaning of a cue they already know: when a new cue is added alongside a familiar one, individuals focus on the new cue to gain a better understanding of what is going on.
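The blocking phenomenon can be illustrated with a simple error-driven learning rule. The sketch below uses a Rescorla-Wagner-style update, a much simpler relative of the EXIT model discussed by Collins et al.; the learning rate and trial counts are arbitrary assumptions. Cue A is first trained alone to predict an outcome; when cue B is later presented together with A, the prediction error is already near zero, so B acquires almost no associative strength.

```python
# Rescorla-Wagner-style sketch of the blocking effect.
# (A simplified relative of the EXIT model; parameters are illustrative.)

LEARNING_RATE = 0.3

def train(weights, cues, outcome, trials):
    """Error-driven update: on each trial, every present cue's weight
    moves toward covering the remaining prediction error."""
    for _ in range(trials):
        prediction = sum(weights[c] for c in cues)
        error = outcome - prediction
        for c in cues:
            weights[c] += LEARNING_RATE * error
    return weights

weights = {"A": 0.0, "B": 0.0}
train(weights, ["A"], 1.0, trials=50)        # Phase 1: A alone predicts the outcome
train(weights, ["A", "B"], 1.0, trials=50)   # Phase 2: A and B presented together

# After phase 2, A has absorbed nearly all the associative strength,
# while B remains near zero: B has been "blocked" by A.
```

Running the two phases in the opposite order (B introduced while the outcome is still unpredicted) would let B learn normally, which is why blocking is taken as evidence that learning is driven by prediction error rather than by simple cue-outcome pairing.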
Brain regions involved in processing
Benjamin Straube, Antonia Green, Andreas Jansen, Anjan Chatterjee, and Tilo Kircher found that social cues influence the neural processing of speech-gesture utterances. Past studies have treated mentalizing as part of the perception of social cues, a process believed to rely on a neural system that includes:
- the medial prefrontal cortex
- the precuneus
When people focus on things in a social context, the medial prefrontal cortex and precuneus areas of the brain are activated; when people focus on a non-social context, these areas are not activated. Straube et al. hypothesized that the brain areas involved in mental processes were mainly responsible for social-cue processing: when iconic gestures are involved, the left temporal and occipital regions should be activated; when emblematic gestures are involved, the temporal poles; and for abstract speech and gestures, the left frontal gyrus. After conducting an experiment on how body position, speech and gestures affected activation in different areas of the brain, Straube et al. came to the following conclusions:
- when a person faces someone head-on, the occipital, inferior frontal, medial frontal, right anterior temporal and left-hemispheric parietal cortices were activated
- when participants watched an actor delivering a speech about another person, an extended network of bilateral temporal and frontal regions was activated
- when participants watched an actor who talked about objects and made iconic gestures, the occipito-temporal and parietal brain areas were activated. Straube et al. concluded that the processing of speech-gesture information is affected by context-dependent social cues.
The amygdala, fusiform gyrus, insula, and superior and middle temporal regions have been identified as areas in the brain that play a role in visual emotional cues. It was found that there was greater activation in the bilateral anterior superior temporal gyrus and bilateral fusiform gyrus when it came to emotional stimuli. The amygdala has been connected with the automatic evaluation of threat, facial valence information, and trustworthiness of faces.
When it comes to visual cues, individuals follow the gaze of others to find out what they are looking at. This response is thought to be evolutionarily adaptive, as it can alert others to happenings in the environment. Studies have shown that directed gaze impacts attentional orienting in a seemingly automatic manner, even when the cues predict a target's location only about 50% of the time. The part of the brain that is engaged when another person averts their gaze is also part of the attentional-orienting network. Past researchers have found that arrow cues are linked to fronto-parietal areas, whereas arrow and gaze cues were linked to occipito-temporal areas; gaze cues may therefore rely on automatic processes more than arrow cues do. Eye gaze appears to have grown in importance over evolutionary time.
Higher-level visual regions, such as the fusiform gyrus, extrastriate cortex and superior temporal sulcus (STS), are the brain areas that studies have linked to perceptual processing of social/biological stimuli. Behavioral data show a left-visual-field advantage for face and gaze stimuli, consistent with right-hemisphere involvement in their processing. Researchers believe the right STS is also involved in using gaze to understand the intentions of others. When viewing social versus nonsocial cues, a higher level of activity has been found in the bilateral extrastriate cortices for gaze cues than for peripheral cues. A study of two split-brain patients examined each hemisphere's involvement in gaze cuing; the results suggest that gaze cues produce a stronger effect in the face-recognition hemisphere of the brain than nonsocial cues do. Greene and Zaidel's results further suggest that information from the two visual fields is processed independently and that the right hemisphere shows greater orienting.
Pertaining to emotional expression, the superior temporal cortex has been shown to be active in studies of facial perception, whereas for face identity the inferior temporal and fusiform cortices are active. During facial processing, the amygdala and fusiform gyrus show a strong functional connection. Face identification can be impaired by damage to the orbitofrontal cortex (OFC). The amygdala is active during facial expressions and improves long-term memory for emotional stimuli; face-responsive neurons have also been found in the amygdala. The connections between the amygdala, OFC, and other medial temporal lobe structures suggest that they play an important role in working memory for social cues. Systems critical for perceptually identifying and processing emotion and identity must cooperate in order to maintain representations of social cues.
In order to monitor the changing facial expressions of individuals, the hippocampus and orbitofrontal cortex may be crucial in guiding real-world social behavior in social gatherings. The hippocampus may be involved in using social cues to keep track of multiple appearances of the same person over short delay periods. Because the orbitofrontal cortex is important in processing social cues, researchers believe it works with the hippocampus to create, maintain, and retrieve in working memory corresponding representations of the same individual seen with multiple facial expressions. After encountering the same person multiple times with different social cues, the right lateral orbitofrontal cortex and hippocampus are more strongly engaged and display a stronger functional connection when disambiguating each encounter with that individual. During an fMRI scan, the lateral orbitofrontal cortex, hippocampus, and bilateral fusiform gyrus showed activation when participants met the same person again after previously seeing two different social cues, suggesting that these brain areas help retrieve correct information about the last encounter with that person. The ability to separate encounters with different people seen with different social cues is thought to permit suitable social interactions. Ross, LoPresti and Schon propose that the orbitofrontal cortex and hippocampus contribute to both working memory and long-term memory, which permits flexibility in encoding separate representations of an individual across the varying social contexts in which we encounter them.
Oxytocin has been named "the social hormone". Research on rats provides strong evidence that social contact enhances oxytocin levels in the brain, which then sets the stage for social bonds. In recent years it has been found that inhaling oxytocin through the nasal passage increases trust toward strangers and increases a person's ability to perceive social cues. In women, oxytocin was found to increase face-induced amygdala activation. Oxytocin also increases the frequency of attention shifts to the eye region of a face, which suggests that it alters the brain's readiness for socially meaningful stimuli. Dopamine neurons in the ventral tegmental area code the salience of social as well as nonsocial stimuli. Bartz and colleagues found that the effects of oxytocin are person-dependent, meaning that every individual is affected differently, especially those who have trouble in social situations. Research by Groppe and colleagues supports the view that oxytocin enhances the motivational salience of social cues, increasing attention to socially relevant information.
In development
From a young age, people are taught to use the social cues of others to gain insight about the world around them. There is also evidence that reliance on social cues is a naturally occurring tendency.
Research has found that from birth, babies prefer infant-directed speech over adult-directed speech. As young as 6 months old, babies prefer someone who has previously talked to them and who speaks their native language over someone who speaks a foreign language. According to Guellai and Steri, at 9 weeks old, babies fixate more on an adult's eye region when the adult is talking to them than when the adult is silent and looking at them. Guellai and Steri concluded that from birth, babies are able to read two forms of social cues: eye gaze and voice.
Children use social cues such as eye gaze and/or engaging facial expressions to understand adults' intentions when different signs and symbols are used. Leekam, Soloman and Teoh hypothesized that children would pay more attention to a task if the adult had an engaging facial expression. They tested their hypothesis on 2- and 3-year-olds using three signs: a pointing finger, a replica and an arrow. Their first experiment supported the hypothesis: young children understood the reasons behind a symbol or sign in the presence of an engaging face, but when no face was visible performance significantly declined. Leekam, Soloman, and Teoh note that children's understanding of the pointing sign is unsurprising given their familiarity with it; children can comprehend the reference of pointing as early as 12 months, and can identify an action carried out by a bare hand as early as 7 months, earlier than they understand the intent behind an action of a gloved hand. An engaging face is thus an important social cue that helps children grasp the function of a sign or symbol, and the children in this study looked for social cues more during difficult and unfamiliar tasks.
According to studies on social referencing, infants use the emotional cues of others to guide their behavior. This was shown in a visual-cliff study conducted by Vaish and Striano, in which infants were placed on the shallow end of a plexiglass cliff with their mothers at the other end. The mothers used facial and vocal cues, facial cues only, or vocal cues only to beckon their child forward. Infants crossed over faster in response to vocal-only cues than to facial-only cues, presumably because they are accustomed to vocal-only cues from their parents.
Past studies have found that infants use social cues to help them learn new words, especially when multiple objects are present; most studies present two or more objects simultaneously to see whether infants can learn by attending to the cues provided. At 14 months old, infants follow an adult's gaze to an object, indicating that they treat the eyes as a cue for looking. Head turning is another behaviour that infants view as a referential cue. Around 18 months, social cues become beneficial to infants, though they are not always useful: younger infants rely on attentional cues, while older infants rely more on social cues to help them learn. It was found that 12-month-old infants could not use cues such as eye gaze, touching, and handling to learn labels, whereas 15-month-old infants are sensitive to adults' gaze direction and are able to use these cues correctly to link novel words to their referents.
Even as toddlers, we gain social cues from others and determine how we should behave based upon the cues we receive from adults. Smith and LaFreniere describe recursive awareness of intentionality (RAI): the understanding of how the cues one provides will influence the beliefs and actions of those receiving them. RAI is absent in children under the age of 5 but develops during middle childhood. They tested whether children aged 4, 6, and 8 were able to read the intentions of a partner in a game through nonverbal hints and facial expressions, and found that 8-year-olds were better able to read their partner's cues and based their decisions on those cues.
In school
In the classroom, a system of cues develops between teacher and students: classrooms develop their own way of talking and communicating information. Once a set of verbal and nonverbal behaviors takes place in a classroom on a consistent basis, it becomes a norm or set of rules within the classroom, signalled by nonverbal cues such as gaze and body position.
Teachers and students develop a shared understanding of how each other thinks, believes, acts and perceives things. A teacher can use eye gaze and body position to indicate where students' attention should be directed. If students are stuck in a previous discussion or cannot determine an appropriate response to the current topic, it may mean that they did not correctly perceive the cues the teacher was displaying. Both students and teachers must read cues to gather what is going on, how they are supposed to do something, and the reason behind what they are doing.
Impairments in psychological disorders
Accurately interpreting social cues is a vital part of normal social function. However, individuals with certain psychological disorders, including schizophrenia, autism, social anxiety disorder, and ADHD, tend to have difficulties in interpreting and using these cues.
Schizophrenia
According to the Diagnostic and Statistical Manual of Mental Disorders (DSM), schizophrenia is a psychological disorder whose diagnosis requires at least two of the five symptoms listed below:
- delusions
- hallucinations
- disorganized speech
- grossly disorganized or catatonic behavior
- negative symptoms: affective flattening, alogia, or avolition
People with schizophrenia find it hard to pick up on social cues. More specifically, they show deficits in emotional facial recognition, social knowledge, empathy, non-verbal cues, and emotional processing. Most of these aspects fall under the category of social cognition, and most tasks related to social cognition involve emotional processing, empathy, and knowledge of social norms. With respect to facial-expression recognition, recent research has found that people with this disorder have difficulty recognizing facial expressions that exhibit negative emotions, including fear, sadness, anger, and disgust. As a result, people with schizophrenia have trouble comprehending situations that involve different types of empathy, especially situations that require empathy for pain.
In addition, research has found that those with schizophrenia are more likely to make false positives when aspects of the task are more abstract. A false positive occurs when a participant mistakenly believes that they observed a specific social cue in the vignette shown to them; the cue they report seeing in the video was in fact nonexistent. To test whether someone can correctly identify both types of cues, researchers use the Social Cue Recognition Test (SCRT). A task is considered abstract when it contains abstract cues, which must be inferred from the social setting; these consist of actions and situations involving affect, goals, and rules. Thus, people with schizophrenia have trouble making inferences about social situations that deal with abstract aspects. On the other hand, they are better at identifying features that rely on concrete cues, cues that can be observed directly: concrete cues are more apparent, while abstract cues are more ambiguous.
Autism
Individuals with autism have trouble reading social cues correctly. Misreading social cues can lead to a person acting out, which can then result in negative interactions and social disapproval. Therefore, social cues are believed to be an important aspect of inclusion and comfort in personal, interpersonal and social environments.
The DSM states that autism is a psychological disorder with multiple symptoms that fall within three separate symptomatic categories.
- Impairment in social interactions such as:
- impairment in nonverbal behaviors that are used in normal social interactions: eye to eye gaze, facial expressions, posture, and gestures
- failure to make appropriate peer relationships
- lack of sharing interest, enjoyment, and or achievements with others.
- lack of emotional and social reciprocity
- Impairments in communication in one of the following:
- a delay in or lack of the development of spoken language
- impairment in ability to start and keep a conversation (only pertains to those who are sufficient in speech)
- language that is stereotyped and repetitive
- lack of diverse and spontaneous make-believe play (pretend), or imitative play that is appropriate for their age and developmental level
- Restricted repetitive and stereotyped models of behavior, interests, and activities in at least one of the following:
- fixation with one or more stereotyped and restricted models of interest that is atypical in either focus or intensity
- evidently inflexible devotion to particular, nonfunctional routines or rituals
- stereotyped and continuous motor mannerisms (ex. hand or finger twisting, or whole body movements)
- constant fixation with parts of objects 
The main social-cue impairments of those on the autism spectrum include interpreting facial expressions, understanding body language, and deciphering gaze direction, all of which are classified as nonverbal communication. However, prior research has found that autistic children and adults have no difficulty identifying human bodily movements or body language used in everyday activities. What autistic people have more trouble with is verbally describing the emotions connected with these types of bodily movements.
Children who are not autistic learn to relate the body movements they see to the emotions and mental states of others through face-to-face interactions with other children. Having face-to-face interactions with other people helps children increase their knowledge of what these body movements represent. After seeing these representations used multiple times, children are able to make inferences about the representations and the people making them, and can then make assumptions about a person they interact with in the future, since they already understand what the body movements or body language represent.
Social anxiety
Social anxiety disorder, also known as social phobia, is a disorder that the DSM characterizes by some of the following:
- persistent fear of one or more social or performance related situations in which the person with the fear is exposed to people that are unfamiliar
- constantly fears that he or she will be humiliated, embarrassed, and or criticized by those unfamiliar people
- when exposed to the feared situation, the person exhibits anxiety that may take the form of a panic attack, in children this could be crying out or a tantrum
- avoids the feared situation at all costs
- the avoidance of the feared situation causes distress that significantly interferes with the person's normal routine, relationships, and functioning in school or at work
People with social anxiety disorder are found to be overly concerned with the approval and disapproval of others around them. Due to this preoccupation with what others think of them, people with this disorder tend to interact with few or no people at all. As a result, they do not get an appropriate amount of social interaction, which contributes to their deficit in interpreting emotions and facial expressions. More specifically, people with social anxiety disorder tend to have a negative bias in interpreting both facial expressions and emotions, which leads them to read cues that are neutral or happy as negative. Previous research has found that, because of this negative bias, they also take longer to process and comprehend social cues that represent happiness.
ADHD
It has been found that children who have both ADHD and a learning disability also have trouble comprehending social cues, have poor social skills, have difficulty creating and/or maintaining friendships, and have trouble reacting to other people's thoughts and feelings. However, part of the reason that children with ADHD have deficits in the social realm is their lack of focus and self control, which obstructs their ability to properly interpret social cues.
More specifically, people with ADHD tend to attend to too many cues at once, which prevents them from identifying which cues are most important. Because of this, certain social situations are especially hard for people with ADHD to interpret, for example when someone is being deceptive towards them. A deceptive situation is harder for someone with this disorder because the social cues given off during deception are very subtle; since people with ADHD already have trouble interpreting social cues, subtle cues are even more difficult for them to comprehend and interpret.
However, many studies have found that people with ADHD who take stimulants or other prescribed medication for the disorder are better able to identify which social cues are most important. As a result, they interact and communicate with others more effectively, which in turn helps them make and maintain friendships and relationships.
In Internet communication
Communication on the Internet differs in important ways from face-to-face communication. McKenna and Bargh identified four main differences between face-to-face communication and communication that takes place on the Internet.
These four differences are:
- anonymity
- physical distance
- time
- physical appearance
Anonymity is a major feature of Internet communication: neither party can see the other's face when emailing or otherwise communicating online. This can be a very positive feature for people who are socially anxious or have social anxiety disorder, because it removes the prospect of public humiliation or embarrassment, which is what most socially anxious people worry about. As a result, people with social anxiety are more inclined to open up, which allows them to get closer to others and form more relationships.
On the other hand, anonymity can cause deindividuation, in which a person is no longer treated as an individual but merely as part of a group. In other words, people can feel like just one person among thousands, and therefore less noticeable. Deindividuation has been found to make some people behave more impulsively and monitor themselves less, which can lead to blunter and more aggressive behaviour towards the people they are communicating with. Blunt and aggressive responses may also simply reflect the fact that the exchange is not happening face to face. Others have suggested that whether the reduced availability of social cues results in negative behaviour depends on the situation and the individual's goals.
Unlike face-to-face communication, communication on the Internet is not constrained by physical distance or proximity. The Internet allows people from all over the world to come together and interact with each other. No matter what city or country people live in, they can communicate and interact with anyone else who is online. As a result, people can make friends and communicate with others they would otherwise never have met because of physical distance. Furthermore, people can stay in contact with family and friends who live too far away to visit regularly.
Like physical distance, time is not a constraint on Internet communication. People can send a message even when the recipient is not online at that moment, for example by email. Email also allows the sender to think about what they want to say and edit the response before sending it. Furthermore, when the recipient receives the email, they do not have to respond right away; there is no time constraint on when a reply must be sent.
Along with proximity and time, physical appearance is another factor that loses its importance on the Internet. As mentioned in the discussion of anonymity, people cannot see the physical characteristics of the person or persons they are interacting with online. This allows people to talk to others they might not have approached face to face. As a result, people can connect on a more meaningful level and create closer relationships that are not based simply on physical attraction. This is also considered a very positive aspect of the Internet.
Another positive feature of the Internet is its millions of chat rooms and blogs, which allow people to communicate with others who share their interests and values. This not only enables people to find others similar to themselves but also allows them to find emotional support. However, there are also negative consequences of the ability to connect with like-minded others online: it allows people to come together to discuss subjects such as murder, and to form hate groups.
The absence of certain social cues online can lead to more misunderstandings than face-to-face communication. When reading an email, people cannot hear the other person's voice or see their facial expression. Both are very important social cues for understanding how someone else is feeling, and without them it is easy to misinterpret what someone has said or written.
References
- Adams, Reginald B.; Albohn, Daniel N.; Kveraga, Kestutis (June 2017). "Social Vision: Applying a Social-Functional Approach to Face and Expression Perception". Current Directions in Psychological Science. 26 (3): 243–248. doi:10.1177/0963721417706392. ISSN 0963-7214. PMC 5873322. PMID 29606807.
- Freeth, Megan; Foulsham, Tom; Kingstone, Alan (2013-01-09). "What Affects Social Attention? Social Presence, Eye Contact and Autistic Traits". PLOS ONE. 8 (1): e53286. Bibcode:2013PLoSO...853286F. doi:10.1371/journal.pone.0053286. ISSN 1932-6203. PMC 3541232. PMID 23326407.
- Pickett, Cynthia; Gardner, W. L.; Knowles, M. (2004). "Getting a Cue: The Need to Belong and Enhanced Sensitivity to Social Cues". Personality and Social Psychology Bulletin. 30 (9): 1095–1107. doi:10.1177/0146167203262085. PMID 15359014.
- Woodward, James; Allman, John (July 2007). "Moral intuition: Its neural substrates and normative significance". Journal of Physiology-Paris. 101 (4–6): 179–202. doi:10.1016/j.jphysparis.2007.12.003. ISSN 0928-4257. PMID 18280713.
- Xin, Ziqiang; Liu, Youhui; Yang, Zhixu; Zhang, Hongchuan (2016-06-29). "Effects of minimal social cues on trust in the investment game". Asian Journal of Social Psychology. 19 (3): 235–243. doi:10.1111/ajsp.12143. ISSN 1367-2223.
- Frith, Uta; Frith, Christopher (2003). "Development and neurophysiology of mentalizing". Philos Trans R Soc Lond B Biol Sci. 358 (1431): 459–473. doi:10.1098/rstb.2002.1218. PMC 1693139. PMID 12689373.
- Macdonald, R.G.; Tatler, B.W. (2013). "Do as eye say: Gaze cueing and language in a real-world social interaction". Journal of Vision. 13 (4): 1–12. doi:10.1167/13.4.6. PMID 23479476.
- Neuhaus, E.; Beauchaine, T.P.; Bernier, R. (2010). "Neurobiological correlates of social functioning in autism". Clinical Psychology Review. 30 (6): 733–748. doi:10.1016/j.cpr.2010.05.007. PMID 20570622.
- Phillips, Louise H.; Bull, Rebecca; Allen, Roy; Insch, Pauline; Burr, Kirsty; Ogg, Will (2011). "Lifespan aging and belief reasoning: Influences of executive function and social cue decoding". Cognition. 120 (2): 236–247. doi:10.1016/j.cognition.2011.05.003. ISSN 0010-0277. PMID 21624567.
- Sheth, B. R.; Liu, J.; Olagbaju, O.; Varghese, L.; Mansour, R.; Reddoch, S.; Pearson, D.; Loveland, K. (2011). "Detecting Social and Non-Social Changes in Natural Scenes: Performance of Children with and Without Autism Spectrum Disorders and Typical Adults". Journal of Autism and Developmental Disorders. 41 (4): 434–446. doi:10.1007/s10803-010-1062-3. PMID 20614172.
- Ekman, P. (1993). "Facial expression and emotion". American Psychologist. 48 (4): 384–392. doi:10.1037/0003-066X.48.4.384. PMID 8512154.
- Carroll, James; Russell, J. (1996). "Do Facial Expressions Signal Specific Emotions? Judging Emotion From the Face in Context". Journal of Personality and Social Psychology. 70 (2): 205–218. doi:10.1037/0022-3514.70.2.205.
- Bryan, Roonie; Pietro Perona; Ralph Adolphs (2012). "Perspective Distortion from Interpersonal Distance is an Implicit Visual Cue for Social Judgments of Faces". PLOS ONE. 7 (9): e45301. Bibcode:2012PLoSO...745301B. doi:10.1371/journal.pone.0045301. PMC 3448657. PMID 23028918.
- Haxby, James; Gobbini, M.I. (2007). "The perception of emotion and social cues in faces". Neuropsychologia. 45 (1): 1. doi:10.1016/j.neuropsychologia.2006.11.001. PMID 17109900.
- Zebrowitz, Leslie A.; Montepare, Joann M. (May 2008). "Social Psychological Face Perception: Why Appearance Matters". Social and Personality Psychology Compass. 2 (3): 1497–1517. doi:10.1111/j.1751-9004.2008.00109.x. ISSN 1751-9004. PMC 2811283. PMID 20107613.
- Frischen, Alexandra; Bayliss, Andrew P.; Tipper, Steven P. (July 2007). "Gaze cueing of attention: Visual attention, social cognition, and individual differences". Psychological Bulletin. 133 (4): 694–724. doi:10.1037/0033-2909.133.4.694. ISSN 1939-1455. PMC 1950440. PMID 17592962.
- Centelles, Laurie; Assaiante, C.; Etchegoyhen, K.; Bouvard, M.; Schmitz, C. (May 2013). "From Action to Interaction: Exploring the Contribution of Body Motion Cues to Social Understanding in Typical Development and in Autism Spectrum Disorders". Journal of Autism and Developmental Disorders. 43 (5): 1140–1150. doi:10.1007/s10803-012-1655-0. PMID 23008056.
- Koppensteiner, Markus (November 2013). "Motion cues that make an impression". Journal of Experimental Social Psychology. 49 (6): 1137–1143. doi:10.1016/j.jesp.2013.08.002. ISSN 0022-1031. PMC 3819996. PMID 24223432.
- Langton, Stephen; Bruce, V. (2000). "You Must See the Point: Automatic Processing of Cues to the Direction of Social Attention" (PDF). Journal of Experimental Psychology. 26 (2): 747–757. doi:10.1037/0096-1523.26.2.747. hdl:1893/21050.
- Zaki, Jamil (2013). "Cue Integration: A Common Framework for Social Cognition and Physical Perception". Perspectives on Psychological Science. U.S. Sage Publications. 8 (3): 296–312. doi:10.1177/1745691613475454. PMID 26172972.
- Greene, Deanna; Zaidel, Eran (2010). "Hemispheric differences in attentional orienting by social cues". Neuropsychologia. 49 (1): 61–68. doi:10.1016/j.neuropsychologia.2010.11.007. PMID 21093465.
- Collins, E.C.; Percy, E.J.; Smith, E.R.; Kruschke, J.K. (2011). "Integrating Advice and Experience: Learning and Decision Making With Social and Nonsocial Cues". Journal of Personality and Social Psychology. 100 (6): 967–982. doi:10.1037/a0022982. PMID 21443371.
- Straube, Benjamin; Green, Antonia; Jansen, Andreas; Chatterjee, Anjan; Kircher, Tilo (2010). "Social cues, mentalizing and the neural processing of speech accompanied by gestures". Neuropsychologia. 48 (2): 382–393. doi:10.1016/j.neuropsychologia.2009.09.025. PMID 19782696.
- Robins, Diana; Hunyadi, E.; Schultz, R. T. (2009). "Superior temporal activation in response to dynamic audio-visual emotional cues". Brain and Cognition. 69 (2): 269–278. doi:10.1016/j.bandc.2008.08.007. PMC 2677198. PMID 18809234.
- Friesen, C.K.; Moore, C.; Kingstone, A. (2005). "Does gaze direction really trigger a reflexive shift of spatial attention?". Brain and Cognition. 57 (1): 66–69. doi:10.1016/j.bandc.2004.08.025. PMID 15629217.
- Driver, J; Davis, G.; Ricciardelli, P.; Kidd, P.; Maxwell, E.; Baron-Cohen, S. (1999). "Gaze perception triggers reflexive visuospatial orienting". Visual Cognition. 6 (5): 509–540. CiteSeerX 10.1.1.212.3316. doi:10.1080/135062899394920.
- Langton, S.R.H (2000). "The mutual influence of gaze and head orientation in the analysis of social attention direction". The Quarterly Journal of Experimental Psychology. 53 (3): 825–845. doi:10.1080/713755908. hdl:1893/21047. PMID 10994231.
- Hietanen, J.K.; Nummenmaa, L.; Nyman, M.J.; Parkkola, R.; Hamalainen, H. (2006). "Automatic attention orienting by social and symbolic cues activates different neural networks: an fMRI study". NeuroImage. 33 (1): 406–413. doi:10.1016/j.neuroimage.2006.06.048. PMID 16949306.
- Greene, D.J.; Mooshagian, E.; Kaplan, J.T.; Zaidel, E.; Iacoboni, M. (2009). "The neural correlates of social attention: automatic orienting to social and nonsocial cues". Psychological Research. 73 (4): 499–511. doi:10.1007/s00426-009-0233-3. PMC 2694932. PMID 19350270.
- Downing, P.E.; Jiang, Y.H.; Shuman, M.; Kanwisher, N. (2001). "A cortical area selective for visual processing of the human body". Science. 293 (5539): 2470–2473. Bibcode:2001Sci...293.2470D. CiteSeerX 10.1.1.70.6526. doi:10.1126/science.1063414. PMID 11577239.
- Ricciardelli, P; Ro, T.; Driver, J. (2002). "A left visual field advantage in perception of gaze direction". Neuropsychologia. 40 (7): 769–77. CiteSeerX 10.1.1.494.3962. doi:10.1016/s0028-3932(01)00190-7. PMID 11900727.
- Pelphrey, K.A.; Singerman, J.D.; Allison, T.; McCarthy, G. (2003). "Brain activation evoked by perception of gaze shifts: the influence of context". Neuropsychologia. 41 (2): 156–170. doi:10.1016/S0028-3932(02)00146-X. PMID 12459214.
- Tipper, C.M.; Handy, T.C.; Giesbrecht, B.; Kingstone, A. (2008). "Brain responses to biological relevance". Cognitive Neuroscience. 20 (5): 879–891. doi:10.1162/jocn.2008.20510. PMID 18201123.
- Kingstone, A.; Friesen, C.; Gazzaniga, M. (2000). "Reflexive joint attention depends on lateralized cortical connections". Psychological Science. 11 (2): 159–166. doi:10.1111/1467-9280.00232. PMID 11273424.
- Winston, J.S.; Henson, R.N.A.; Fine-Goulden, M.R.; Dolan, R.J. (2004). "fMRI-adaptation reveals dissociable neural representations of identity and expression in face perception". Journal of Neurophysiology. 92 (3): 1830–1839. doi:10.1152/jn.00155.2004. hdl:21.11116/0000-0001-9F84-7. PMID 15115795.
- Hornak, J.; Bramham, J.; Rolls, E.T.; et al. (2003). "Changes in emotion after circumscribed surgical lesions of the orbitofrontal and cingulate cortices". Brain. 126 (7): 1691–1712. doi:10.1093/brain/awg168. PMID 12805109.
- Kensinger, E.A.; Schacter, D.L. (2006). "Amygdala activity is associated with the successful encoding of item, but not source, information for positive and negative stimuli". Journal of Neuroscience. 26 (9): 2564–2570. doi:10.1523/jneurosci.5241-05.2006. PMC 6793660. PMID 16510734.
- Leonard, C.M.; Rolls, E.T.; Wilson, F.A.W.; Baylis, G.C. (1985). "Neurons in the amygdala of the monkey with responses selective for faces". Behavioural Brain Research. 15 (2): 159–176. CiteSeerX 10.1.1.64.2871. doi:10.1016/0166-4328(85)90062-2. PMID 3994832.
- Furtak, S.C.; Wei, S.M.; Agster, K.L.; Burwell, R.D. (2007). "Functional neuroanatomy of the parahippocampal region in the rat: the perirhinal and postrhinal cortices". Hippocampus. 17 (9): 709–722. doi:10.1002/hipo.20314. PMID 17604355.
- LoPresti, M.L.; Schon, K.; Tricarico, M.D.; Swisher, J.D.; Celone, K.A.; Stern, C.E. (2008). "Working Memory for Social Cues Recruits Orbitofrontal Cortex and Amygdala: A Functional Magnetic Resonance Imaging Study of Delayed Matching to Sample for Emotional Expressions". The Journal of Neuroscience. 28 (14): 3718–3728. doi:10.1523/jneurosci.0464-08.2008. PMC 2748754. PMID 18385330.
- Ross, R.S.; LoPresti, M.L.; Schon, K. (2013). "Role of the hippocampus and orbitofrontal cortex during the disambiguation of social cues in working memory". Cognitive, Affective, & Behavioral Neuroscience. 13 (4): 900–15. doi:10.3758/s13415-013-0170-x. PMC 3796192. PMID 23640112.
- Averbeck, B.B. (2010). "Oxytocin and the salience of social cues". Proceedings of the National Academy of Sciences of the United States of America. 107 (20): 9033–9034. Bibcode:2010PNAS..107.9033A. doi:10.1073/pnas.1004892107. PMC 2889126. PMID 20448196.
- Bartz, J.A.; Zaki, J.; Bolger, N.; Ochsner, K.N. (2011). "Social effects of oxytocin in humans: Context and person matter". Trends in Cognitive Sciences. 15 (7): 301–309. doi:10.1016/j.tics.2011.05.002. PMID 21696997.
- Domes, G.; Lischke, A.; Berger, C.; et al. (2010). "Effects of intranasal oxytocin on emotional face processing in women". Psychoneuroendocrinology. 35 (1): 83–93. doi:10.1016/j.psyneuen.2009.06.016. PMID 19632787.
- Gamer, M.; Zurowski, B.; Buchel, C. (2010). "Different amygdala subregions mediate valence-related and attentional effects of oxytocin in humans". Proceedings of the National Academy of Sciences. 107 (20): 9400–9405. doi:10.1073/pnas.1000985107. PMC 2889107. PMID 20421469.
- Modi, M.E.; Young, L.J. (2012). "The oxytocin system in drug discovery for autism: animal models and novel therapeutic strategies". Hormones and Behavior. 61 (3): 340–50. doi:10.1016/j.yhbeh.2011.12.010. PMC 3483080. PMID 22206823.
- Groppe, S.E.; Gossen, A.; Rademacher, L.; Hahn, A.; Westphal, L.; Grunder, G.; Spreckelmeyer, K.N. (2013). "Oxytocin Influences Processing of Socially Relevant Cues in the Ventral Tegmental Area of the Human Brain". Biological Psychiatry. 74 (3): 172–179. doi:10.1016/j.biopsych.2012.12.023. PMID 23419544.
- Guellai, Bahia; Streri, Arlette (2011). "Cues for Early Social Skills: Direct Gaze Modulates Newborns' Recognition of Talking Faces". PLOS ONE. U.S. Public Library of Science. 6 (4): e18610. Bibcode:2011PLoSO...618610G. doi:10.1371/journal.pone.0018610. PMC 3078105. PMID 21525972.
- Leekam, Susan; Tracy Solomon; Yee-San Teoh. "Adults' social cues facilitate young children's use of signs and symbols". Developmental Science.
- Camras, L.A.; Sachs, V.B. (1991). "Social referencing and caretaker expressive behavior in a day care setting". Infant Behavior and Development. 14: 27–36. doi:10.1016/0163-6383(91)90052-t.
- Vaish, Amrisha; Striano, Tricia (2004). "Is visual reference necessary? Contributions of facial versus vocal cues in 12-month-olds' social referencing behavior". Developmental Science. 7 (3): 261–269. doi:10.1111/j.1467-7687.2004.00344.x. PMID 15595366.
- Baldwin, D.A.; Meltzoff, L.J. (2001). "Links between social understanding and early word learning: Challenges to current accounts". Social Development. 10 (3): 309–329. doi:10.1111/1467-9507.00168.
- Briganti, Alicia M.; Cohen, L.B. (2011). "Examining the role of social cues in early word learning". Infant Behavior and Development. 34 (1): 211–214. doi:10.1016/j.infbeh.2010.12.012. hdl:2152/41670. PMID 21236493.
- Hollich, G.J.; Hirsh-Pasek, K.; Golinkoff, R.M.; Brand, R.J.; Brown, E.; Chung, H.L.; et al. (2000). "Breaking the language barrier: An emergentist coalition model for the origins of word learning". Monographs of the Society for Research in Child Development. 65 (3): i–vi, 1–123. doi:10.1111/1540-5834.00091. PMID 12467096.
- Smith, Rachelle; Peter LaFreniere (2009). "Development of Children's ability to infer intentions from nonverbal cues". The Journal of Social, Evolutionary, and Cultural Psychology. 3 (4): 315–327. doi:10.1037/h0099313.
- Green, Judith; Weade, Regina (1985). "Reading Between the Words: Social Cues to Lesson Participation". Theory into Practice. 24: 14–21. doi:10.1080/00405848509543141.
- American Psychiatric Association (2000). Diagnostic and statistical manual of mental disorders (4th ed., text rev.). American Psychiatric Publishing Inc.
- "Social Cues Are Difficult for People with Schizophrenia". 25 May 2011.
- Baez, Sandra (8 March 2013). "Contextual Social Cognition Impairments in Schizophrenia and Bipolar Disorder". PLOS ONE. 8 (3): e57664. Bibcode:2013PLoSO...857664B. doi:10.1371/journal.pone.0057664. PMC 3592887. PMID 23520477.
- Corrigan, Patrick; Denise Nelson (23 January 1998). "Factors that Affect Social Cue Recognition in Schizophrenia". Psychiatry Research. 78 (3): 189–196. doi:10.1016/s0165-1781(98)00013-4. PMID 9657423.
- "Acting Out Behavior - Why Misreading Social Cues Leads to Behavioral Problems". Empoweringparents.com. 1970-01-01. Retrieved 2013-10-31.
- Epstein, M.A.; Shaywitz, B.A.; Woolston, J.L. (1991). "The Boundaries of Attention Deficit Disorder". Journal of Learning Disabilities. 24 (2): 78–86. doi:10.1177/002221949102400204. PMID 2010678.
- Hall, Cathy; Peterson, A.; Webster, R.; Bolen, L.; Brown, M. (1999). "Perception of Nonverbal Social Cues by Regular Education, ADHD, and ADHD/LD Students". Psychology in the Schools. 36 (6): 505–513. doi:10.1002/(sici)1520-6807(199911)36:6<505::aid-pits6>3.3.co;2-0.
- Peterson, B.; Grahe, J. (2012). "Social Perception and Cue Utilization in Adults with ADHD". Journal of Social and Clinical Psychology. 31 (7): 663–689. doi:10.1521/jscp.2012.31.7.663.
- McKenna, Katelyn; Bargh, J. (2000). "Plan 9 From Cyberspace: The Implications of the Internet for Personality and Social Psychology". Personality and Social Psychology Review. 4 (1): 57–75. doi:10.1207/s15327957pspr0401_6.
- McKenna, K. Y. A.; Green, A. S.; Gleason, M. E. J. (2002). "Relationship formation on the Internet: What's the big attraction?". Journal of Social Issues. 58 (1): 9–31. doi:10.1111/1540-4560.00246.
- Sproull, L.; Kiesler, S. (1985). "Reducing social context cues: Electronic mail in organizational communication". Management Science. 32 (11): 1492–1512. doi:10.1287/mnsc.32.11.1492.
- Bargh, J. A.; McKenna, K. Y. A. (2004). "The internet and social life". Annual Review of Psychology. 55: 573–590. CiteSeerX 10.1.1.586.3942. doi:10.1146/annurev.psych.55.090902.141922. PMID 14744227.