
Defective comprehension of emotional faces and prosody as a result of right hemisphere stroke: Modality versus emotion-type specificity

Published online by Cambridge University Press: 25 October 2006

MICHAL HARCIAREK
Affiliation:
Institute of Psychology, University of Gdansk, Gdansk, Poland
KENNETH M. HEILMAN
Affiliation:
Department of Neurology, University of Florida College of Medicine and VAMC, Gainesville, Florida
KRZYSZTOF JODZIO
Affiliation:
Institute of Psychology, University of Gdansk, Gdansk, Poland

Abstract

Studies of patients with brain damage, as well as studies of normal subjects, have revealed that the right hemisphere is important for recognizing emotions expressed by faces and prosody. It is unclear, however, whether the knowledge needed to recognize emotional stimuli is organized by modality or by the type of emotion. Thus, the purpose of this study was to assess these alternative a priori hypotheses. The participants were 30 stroke patients with right hemisphere damage (RHD) and 31 normal controls (NC). Subjects were assessed with the Polish adaptation of Bryan's Right Hemisphere Language Battery and the Facial Affect Recognition Test based on the work of Ekman and Friesen. RHD participants were significantly impaired on both emotional tasks. Whereas on the visual-facial task the RHD subjects recognized happiness better than anger or sadness, the reverse dissociation was found on the auditory-prosody test. These results confirm prior studies demonstrating the role of the right hemisphere in understanding facial and prosodic emotional expressions. They also suggest that the representations needed to recognize these emotional stimuli are organized by modality (prosodic-echoic and facial-eidetic) and that some modality specific features are more impaired than others. (JINS, 2006, 12, 774–781.)

Research Article
© 2006 The International Neuropsychological Society

INTRODUCTION

About three decades ago, studies of patients who had sustained unilateral hemispheric injuries suggested that the recognition of emotion expressed by speech prosody (Bowers et al., 1987; Heilman et al., 1975; Ross, 1981; Tucker et al., 1977) and by emotional faces (Bowers et al., 1991; DeKosky et al., 1980) is mediated by the right hemisphere. Subsequently, multiple studies confirmed these initial observations (Davidson et al., 2004; Heilman et al., 2003; Lane et al., 1997), and functional imaging has provided converging evidence for a dominant role of the right hemisphere in processing emotional faces and prosody (Esslen et al., 2004; Wymer et al., 2002).

Although comprehension deficits can be induced by perceptual deficits, prior investigators have provided evidence that impaired comprehension of emotional prosody and faces cannot be fully accounted for by perceptual deficits. For example, Bowers and colleagues (1985) found that subjects with right hemisphere damage (RHD) performed significantly worse than patients with left hemisphere damage (LHD) and normal controls (NC) across a series of seven facial identity and facial affect tasks, and this difference persisted even when the patient groups were statistically equated on a measure of visual-perceptual ability. In addition, Blonder and coworkers (1991) asked RHD and LHD patients, as well as normal controls, to judge the emotional content of sentences describing nonverbal expressions and sentences describing emotional situations. These investigators found that RHD subjects performed normally in inferring the emotions conveyed by prosodically neutral sentences describing situations. The RHD patients, however, were impaired relative to the LHD and NC groups in judging the emotional content of sentences depicting emotional facial-gestural and prosodic expressions. These results suggest that patients with these emotional comprehension deficits have a disruption of nonverbal communicative representations (e.g., affective prosodic-echoic and facial-eidetic).

It is not known, however, how these affective echoic and eidetic representations are organized. There are at least three possibilities. One possibility is that these emotions are organized hemispherically by valence, pleasant-positive versus unpleasant-negative. This valence hypothesis suggests that the right hemisphere is dominant for perceiving and comprehending negative, unpleasant emotions, and the left hemisphere is dominant for positive, pleasant emotions (Reuter-Lorenz & Davidson, 1981). Although there has been some support for the hemispheric valence hypothesis in emotional experience, overall the support for the valence hypothesis in the recognition of emotional faces and prosody has been weak, and most investigators have shown that right hemisphere damage induces recognition deficits for emotional faces and prosody independent of valence (Borod, 2000; Heilman et al., 2003; Kucharska-Pietura et al., 2003). The "specific emotion type" hypothesis suggests that within the right hemisphere emotions are organized by type (e.g., anger, fear, happiness, and sadness) (Davidson et al., 1990). The third hypothesis suggests that emotional echoic and eidetic representations are organized not by type but by modality (Adolphs et al., 2002; Habib, 1986). Adolphs and coworkers (2001) studied recognition of visually and auditorily presented emotions in subjects with temporal lobectomy. They found that these patients performed poorly in the perception of emotional prosody, whereas their ability to correctly identify facial expressions was more variable. By comparison, Habib (1986) showed that patients with RHD may present with deficits of emotional processing in the visual modality while their ability to process emotional prosody is preserved. Although some of the previous research on emotional perceptual-recognition impairments after right hemisphere damage assessed the comprehension of affective prosody and facial affect simultaneously (e.g., Blonder et al., 1991; Borod et al., 1998; Schmitt et al., 1997), most studies on this subject limited their assessment to one modality (e.g., Barrett et al., 1999; Bowers et al., 1985; Buck & Duffy, 1980; Heilman et al., 1975; Mandal et al., 1999). It therefore remains unclear whether deficits in the identification of visually and auditorily presented emotions in patients with right hemisphere lesions involve all types of emotions equally or whether there is modality specific impairment of emotional recognition. Thus, the purpose of this study was to assess and characterize the impairments of emotional perception in patients with right hemisphere cerebral infarctions.

METHODS

Participants

Thirty stroke patients with infarctions limited to the right hemisphere (RHD) (16 women and 14 men) and 31 demographically matched normal control subjects (NC) (17 women and 14 men) participated in this study. Both RHD and NC subjects were recruited from the Department of Neurology at the Medical University of Gdansk and from the Hospital of the Gdansk Rehabilitation Center in Dzierzazno. Before testing, informed consent was obtained from each participant. All RHD subjects underwent a complete neurological evaluation including CT and/or MRI. None of the subjects had a history of cerebral disease or disorder prior to their stroke, mental retardation, psychiatric disorders, psychoactive drug treatment, or habitual drug or alcohol abuse. Fulfillment of these criteria was determined by interviewing both patients and relatives and by reviewing patients' medical records. Most of the RHD subjects were examined about 10 months after their stroke (mean time from onset, 306 days). Twenty-six of the 30 RHD patients had left hemiparesis, whereas the 4 remaining subjects had no physical impairment. None of the RHD patients had propositional speech comprehension deficits or other aphasic disorders, but 13 of the 30 RHD subjects did have evidence of hemispatial neglect, as assessed by the Mesulam Cancellation Test (Lezak, 1995). Statistical analysis revealed no significant group differences for sex, age, or years of education. The cognitive state of all individuals was assessed by the Mini-Mental State Examination (MMSE) and did not differ between groups. Means and standard deviations (SD) for demographics as well as for the MMSE are listed in Table 1. Subjects with clinically relevant visual or hearing difficulties were not included. All individuals were right-handed native speakers of Polish. NCs were examined in the same way as the RHD patients. They had no neurological history and were referred to the rehabilitation center only because of physical disabilities related to disorders of the peripheral nervous system.

Table 1. Demographic characteristics

Apparatus and Procedures

The Polish adaptation of the Emotional Prosody Test (Lojek, in press; Lojek et al., 2000) from the Right Hemisphere Language Battery (RHLB) by Bryan (1995) was administered to all 61 participants. The Prosody Test is based on the work of Pell (1996) and Pell and Baum (1997). The Production of Emphatic Stress subtest from the original English RHLB was not appropriate for the Polish language. In the Polish version of the RHLB (Lojek, in press; Lojek et al., 2000), the test consists of 16 nonsense sentences recorded on a compact disc, spoken by a professional speaker with three emotional intonations: happiness (6 sentences), sadness (5 sentences), and anger (5 sentences). Because the Emotional Prosody Test contained 6 trials in which a happy intoned sentence was presented but only 5 trials for each of the other emotions, we made the scales comparable by excluding the first prosody trial for happiness. The order in which these emotions were presented was randomized, but all subjects received the same order. Each participant was given a 21 × 27-cm card with the written names of the three emotions. After listening to each sentence, the subject pointed to the word that best described the emotion they thought was being expressed. There was a 5-second pause between sentences. Before the test items were presented, each subject had 3 practice trials during which feedback was given. After the practice trials, the test items were presented and the subject received one point for each correct response; during the test trials the examiner provided no feedback. The reliability coefficient (r) for the Polish adaptation of the Emotional Prosody Test is .691 for healthy elderly subjects and .674 for subjects with right hemisphere damage (Lojek, in press). Additionally, the test's validity was verified using factor analysis and between-group comparisons (for review, see Lojek, in press).

The ability to recognize facial emotions was assessed by presenting subjects with 15 black-and-white pictures of human faces selected from the Facial Affect Recognition Test (Kucharska-Pietura & Klimkowski, 2002), which was designed on the basis of the Pictures of Facial Affect by Ekman and Friesen (1976). The pictures selected for our study were those with the highest Cronbach's alpha coefficients (i.e., the most reliable; median = .96). Each photograph expressed 1 of 3 basic emotions: sadness (five pictures), happiness (five pictures), or anger (five pictures). The order in which these emotions were presented was randomized, but the pictures were presented in the same order to all 61 subjects. There was no time limit for stimulus exposure, and each trial ended after the subject responded. To exclude subjects with prosopagnosia and severe visual perceptual disorders, participants were asked to identify two additional neutral faces of well-known, prominent persons; no subject in this study failed to identify these faces. Participants named or pointed to the correct answer on a 21 × 27-cm card. To control for possible unilateral neglect, the card was centrally placed with all three emotions listed vertically. As in the Emotional Prosody Test, the subject received one point for each emotional face correctly recognized.

We acknowledge that all human data included in this manuscript was obtained in compliance with regulations of our institutions, and human research was completed in accordance with the guidelines of the Helsinki Declaration (http://www.wma.net/e/policy/17-c_e.html).

RESULTS

The scores obtained by the RHD patients and NCs on the emotional prosody and facial affect tests were compared using the Mann-Whitney test. This analysis revealed significant group differences (see Table 2); as posited, the RHD subjects' mean scores were lower than those of the NC participants on both tasks. However, no significant difference was found between RHD patients with and without hemispatial neglect on either of the affect tests.

Table 2. Group differences on emotional tasks (mean scores and SD)
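The paper does not specify the statistical software used. For readers who wish to reproduce the between-group comparison above, a minimal sketch in Python with SciPy follows; the score arrays are illustrative placeholders, not the study's data.

```python
# Minimal sketch of the Mann-Whitney group comparison (illustrative data only).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
rhd_scores = rng.integers(5, 13, size=30)   # hypothetical RHD totals (max 15)
nc_scores = rng.integers(12, 16, size=31)   # hypothetical NC totals (max 15)

# Two-sided Mann-Whitney U test for the two independent groups.
u_stat, p_value = mannwhitneyu(rhd_scores, nc_scores, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```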

To investigate the relationship between emotional perception of faces and prosody, a Spearman rank order correlation was computed between the scores obtained on the auditory and visual tasks. A significant modality-channel relationship was found for the RHD group (ρ = .52; p < .01). For the NC subjects, a correlation could not be computed because of a ceiling effect on both the Facial Affect Recognition Test and the Emotional Prosody Test.
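The cross-modality correlation can be sketched the same way; again the per-patient totals below are hypothetical.

```python
# Sketch of the Spearman rank correlation between modalities (RHD group).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
# Hypothetical per-patient totals on the two tasks for 30 RHD patients.
prosody_totals = rng.integers(5, 13, size=30)
face_totals = prosody_totals + rng.integers(-2, 3, size=30)  # loosely related

rho, p_value = spearmanr(prosody_totals, face_totals)
print(f"rho = {rho:.2f}, p = {p_value:.4f}")
```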

The next part of the analysis aimed to identify the frequency of disturbances of affective prosody and facial affect recognition after right hemisphere stroke. Standardized scores (z-scores) for each patient were calculated from the raw test scores and then compared to the Sten Scale (1–10). This transformation was based on the means and SD values of the NC group. Z-scores that exceeded the threshold level of the seventh sten (1 SD) were defined as diagnostic. Based on this procedure, we found that all 30 RHD patients were impaired on the tests of prosodic and facial emotion recognition. These results indicate that deficits in emotional perception are frequently observed after right hemisphere strokes.
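The standardization step reduces to simple arithmetic; a sketch under the assumption that impairment corresponds to a score more than 1 SD below the NC mean (the paper's seventh-sten cutoff), with placeholder data:

```python
# Sketch of the z-score / sten-cutoff classification (illustrative data only).
import numpy as np

rng = np.random.default_rng(2)
nc_scores = rng.integers(12, 16, size=31).astype(float)   # hypothetical NC totals
rhd_scores = rng.integers(4, 14, size=30).astype(float)   # hypothetical RHD totals

# Standardize each patient's raw score against the NC distribution.
nc_mean, nc_sd = nc_scores.mean(), nc_scores.std(ddof=1)
z = (rhd_scores - nc_mean) / nc_sd

# Impairment here means a low score, so the 1 SD cutoff is applied downward.
impaired = z < -1.0
print(f"{impaired.sum()} of {len(z)} patients flagged as impaired")
```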

To determine whether there are modality differences (visual-facial vs. auditory-prosodic) in the recognition of specific emotions, non-parametric analyses for related samples were performed. Within-group differences between correctly identified emotions were assessed using the Friedman test, and post hoc testing was done using the Wilcoxon signed-ranks test (Table 3).

Table 3. Auditory and visual recognition of different emotions in the NC and RHD groups (mean number correct and SD)
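A minimal sketch of this within-group procedure in Python with SciPy, using hypothetical per-patient counts of correct responses (0–5 per emotion), not the study's data:

```python
# Omnibus Friedman test across three related samples, then pairwise
# Wilcoxon signed-rank post hoc comparisons (illustrative data only).
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(3)
happy = rng.integers(0, 6, size=30)   # hypothetical correct counts per patient
sad = rng.integers(2, 6, size=30)
anger = rng.integers(0, 4, size=30)

chi2, p = friedmanchisquare(happy, sad, anger)
print(f"Friedman chi2 = {chi2:.2f}, p = {p:.4f}")

# Pairwise post hoc tests on the same related samples.
for name, a, b in [("happy vs sad", happy, sad),
                   ("happy vs anger", happy, anger),
                   ("sad vs anger", sad, anger)]:
    stat, pw = wilcoxon(a, b)
    print(f"{name}: W = {stat:.1f}, p = {pw:.4f}")
```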

The response profile of the RHD patients on the Emotional Prosody Test and the Facial Affect Recognition Test revealed a crossover interaction (Fig. 1). Two simple effects were found in the auditory-prosody task: the RHD subjects were significantly better at correctly detecting sad prosody than happy prosody (z = −3.06; p < .001). In contrast, in the visual-facial task, RHD subjects were significantly better at detecting happy than sad faces (z = −3.87; p < .001). In addition, the RHD subjects were more likely to correctly detect happiness in the visual than in the auditory modality (z = −4.31; p < .001), but more likely to correctly identify sadness in the auditory than in the visual modality (z = −2.90; p < .01). No modality effect for anger, however, was observed in the RHD group (z = −.031; n.s.).

Fig. 1. Crossover interaction in the RHD group (response profile on the Emotional Prosody Test and the Facial Affect Recognition Test).

Similar to the RHD group, NC subjects were better at identifying sad prosody than happy (z = −2.95; p < .01) or angry (z = −2.36; p < .05) prosody. However, no significant difference between correctly recognized happiness and sadness in the visual modality was observed in the NC group (z = −1.92; n.s.). Moreover, for NCs, only the modality effect for happiness emerged (z = −2.11; p < .05) (see Fig. 2).

Fig. 2. Response profile of NCs on the Emotional Prosody Test and the Facial Affect Recognition Test.

We wished to further explore group and modality effects using analysis of variance (ANOVA), but the assumptions of ANOVA were not met because the raw dependent variables were not normally distributed and could not be transformed. Therefore, each of the variables from both emotional tasks was ranked across all 61 participants (for review, see Akritas & Arnold, 1994). A 2 × 2 repeated measures ANOVA was then conducted on these ranks, with test modality (auditory for prosody vs. visual for facial affect) as the within-subject factor and group (RHD vs. NC) as the between-subject factor. The dependent variable was the number of the subjects' correct responses (recognitions) for each emotion, and these ANOVAs were performed separately for each of the three emotions (happiness, sadness, and anger). Additionally, post hoc testing of between-subjects effects was conducted with the Mann-Whitney test, whereas the Wilcoxon signed-ranks test was used for post hoc within-group comparisons.
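One way to sketch this rank-transform mixed design is with the third-party pingouin package (the paper does not name its software, so both the library choice and the data below are assumptions):

```python
# Rank-transform 2 (group) x 2 (modality) mixed ANOVA, following the spirit
# of Akritas & Arnold (1994): rank raw scores across all observations, then
# run the mixed ANOVA on the ranks. Data are illustrative placeholders.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(4)
rows = []
for i in range(61):
    group = "RHD" if i < 30 else "NC"
    for modality in ("prosody", "faces"):
        rows.append({"subject": i, "group": group, "modality": modality,
                     "score": int(rng.integers(0, 6))})
df = pd.DataFrame(rows)

# Rank the raw scores across all 122 observations, then analyze the ranks.
df["rank"] = df["score"].rank()
aov = pg.mixed_anova(data=df, dv="rank", within="modality",
                     subject="subject", between="group")
print(aov[["Source", "F", "p-unc"]])
```

This rank-then-ANOVA approach preserves the factorial structure while avoiding the normality assumption that the raw scores violated.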

For the recognition of sadness, a main effect of group emerged: overall, the NC group had a higher number of correct responses in both the prosodic and facial channels (F = 62.28; p < .001). For anger identification, the analysis also revealed a significant main effect of group, with a higher number of correct responses in both the auditory-prosodic and visual-facial channels observed in the NC group (F = 162.44; p < .001). The analysis for happiness revealed a significant interaction (F = 23.88; p < .001): the influence of modality on the correct recognition of happiness was seen only in the RHD group, with stroke patients being significantly less accurate in identifying auditorily than visually presented expressions of happiness.

Post hoc testing confirmed the findings obtained by the ANOVAs. The RHD group, in comparison to the NCs, was severely impaired on all emotional measures (p < .001) except recognizing happiness in the visual modality (z = −1.80; n.s.). Additionally, RHD within-group analyses revealed significant discrepancies in the recognition of auditory and visual happiness and sadness: RHD patients were more likely to correctly identify happy faces than happy prosody (z = −3.14; p < .01), whereas their ability to detect sadness was better preserved in the auditory than in the visual channel (z = −2.01; p < .05). For the NC group, however, no modality effects were observed (n.s.).

Lastly, we investigated whether some patients had modality specific deficits. We calculated the difference between the number of correctly recognized emotional stimuli in the auditory and visual tasks for each group separately. Next, based on the means and SD values of the NCs, the raw difference for each RHD patient was transformed into a z-score and then compared to the Sten Scale. Patients whose difference between performance on the visual and auditory tests, expressed as a z-score, exceeded the threshold level of the seventh sten (1 SD) were defined as having modality specific deficits. Based on this procedure, and using Wilcoxon signed-ranks tests, we found that 15 patients performed relatively better in the visual than in the auditory modality (z = −3.45; p < .001). Nine RHD subjects presented with exactly the opposite pattern of performance (better auditory-prosodic than visual-facial recognition: z = −2.75; p < .01), and the 6 remaining patients did not present with modality specific deficits (z = −1.73; n.s.). The lesion sites of these individuals are described in Table 4.

Table 4. Lesion location for RHD patients with and without modality specific deficits
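A minimal sketch of this discrepancy classification, assuming a visual-minus-auditory difference score standardized against the NC difference distribution and a symmetric 1 SD cutoff (all data below are placeholders):

```python
# Classify RHD patients by standardized modality-discrepancy score
# (illustrative data only).
import numpy as np

rng = np.random.default_rng(5)
nc_diff = rng.normal(0.0, 1.0, size=31)    # hypothetical NC visual - auditory
rhd_diff = rng.normal(1.0, 2.5, size=30)   # hypothetical RHD differences

# Standardize each RHD difference against the NC difference distribution.
z = (rhd_diff - nc_diff.mean()) / nc_diff.std(ddof=1)

visual_spared = int((z > 1.0).sum())     # relatively better visual recognition
auditory_spared = int((z < -1.0).sum())  # relatively better auditory recognition
neither = len(z) - visual_spared - auditory_spared
print(visual_spared, auditory_spared, neither)
```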

DISCUSSION

This study was designed to better understand the mechanisms that might account for the emotional comprehension impairments observed in patients with right hemisphere strokes. Overall, our results confirm earlier studies demonstrating that deficits in processing emotional faces and prosody are observed in patients who have suffered right hemisphere cerebral infarctions (Borod, 2000; Heilman et al., 2003). Relative to NC subjects, the patients who had suffered a right hemisphere stroke performed significantly worse on both the emotional auditory-prosody and the emotional visual-facial tasks. Hence, these results are consistent with previous findings of impaired emotional recognition across different modalities in subjects with right hemisphere lesions (Borod et al., 1998, 2000; Kucharska-Pietura et al., 2003; Zgaljardic et al., 2002).

Because we did not test subjects with left hemisphere damage, we could not fully assess the "valence" hypothesis, which posits that positive emotions are mediated by the left hemisphere and negative emotions by the right hemisphere (Reuter-Lorenz & Davidson, 1981). On the visual-facial task, however, both the RHD and NC subjects were most accurate in identifying happiness, whereas the facial recognition of anger turned out to be the most difficult for both groups. These findings support previous research showing that neither normal controls nor RHD patients have much difficulty recognizing the facial expression of happiness (Kucharska-Pietura et al., 2003). High accuracy on the happiness items of the Facial Affect Recognition Test might be attributed to the fact that happiness may be the easiest facial expression to reproduce voluntarily and might be the most frequently used emotional facial expression (Buck, 1984; Ekman & Friesen, 1976; Etcoff, 1986; Gainotti, 2000). In addition, Posamentier and Abdi (2003) reported that infants can produce their first smile somewhere between the 2nd and 12th hour after birth, and behaviors present at such an early age might be innate and mediated by subcortical structures. Richardson and coworkers (2000) studied the facial movements associated with the expression of emotions by digitizing real-time video signals to examine movement asymmetries across the face during emotional and non-emotional expressions. They found that the expression of happiness tended to be associated with more right-sided movements. Studies of normal subjects have also reported hemisphere-specific valence effects. For instance, Reuter-Lorenz and Davidson (1981), in their tachistoscopic study, found that happy faces were responded to more quickly in the right visual field, whereas sad faces were responded to more quickly in the left visual field. In a study using positron emission tomography, Pourtois and coworkers (2005) showed that recognition of simultaneously presented visual and auditory happy stimuli activates brain areas situated mainly anteriorly in the left hemisphere, whereas the same regions in the right hemisphere were active during the recognition of fear. These prior studies suggest that the left hemisphere might be dominant for the expression and perception of happiness; therefore, it is also possible that our subjects with RHD did best at recognizing happy facial expressions because the recognition of happy faces is mediated by the left hemisphere.

Whereas our findings in the visual-facial modality are compatible with the valence hypothesis, the finding that, in the auditory-prosodic modality, our subjects' recognition of happy prosody was more impaired than their recognition of sadness or anger is contrary to this hypothesis; this result is compatible with many other studies that have also failed to support the valence hypothesis.

In our study, the RHD patients, when compared with control subjects, not only performed worse throughout the entire Emotional Prosody Test but were also less accurate in recognizing specific emotions presented in the auditory modality. As mentioned, in contrast to the Facial Affect Recognition Test, the RHD patients had only minor difficulties identifying the prosody of sadness, whereas recognition of happiness and anger in the auditory modality was severely impaired. The observation that there are differences in the recognition of specific emotions across modalities is inconsistent with the "specific emotion" hypothesis, which suggests that within the right hemisphere emotions are organized by type (e.g., happiness, sadness, and anger). The reason why some subjects did better with some emotions in one modality than in another might be that some emotions require a different type of perceptual analysis than others, and a disability in one type of perceptual analysis might interfere with the comprehension of some emotions more than others. For example, the two cerebral hemispheres are specialized for processing different acoustic parameters, temporal properties being preferentially processed by the left hemisphere and spectral-pitch properties primarily by the right hemisphere (Robin et al., 1990; Van Lancker & Sidtis, 1992; Zatorre, 1997; Zatorre et al., 1992). Thus, the relatively well preserved identification of sad prosody might be partly attributed to the specific acoustic profile of this emotion, which is associated with a slow speaking rate (Schmidt et al., 2004) that can be detected by the intact left hemisphere. The recognition of happiness, however, might be more dependent on alterations of pitch, which is mediated primarily by the right hemisphere.

One of the hypotheses that this study was designed to test suggests that emotional echoic and eidetic representations are organized not by type but by modality. Support for this hypothesis would come from the observation that some subjects have modality specific deficits. In our study, half of the patients were less impaired in recognizing visual-facial emotional expressions than auditory-prosodic emotional stimuli, whereas the opposite pattern was observed in nearly one-third of all RHD subjects. Because of insufficient CT and MRI data, we were unable to fully investigate the differences in lesion sites between these individuals. Although additional research is needed to determine the specific neuroanatomical substrates of these modality specific deficits, previous studies have shown that the correct recognition of emotions from visually presented facial expressions requires right visual and somatosensory-related cortices (Adolphs et al., 2000). By comparison, comprehension of affective prosody is most likely bound to distinct right-hemisphere regions, encompassing the posterior superior temporal sulcus (Brodmann Area [BA] 22) as well as the dorsolateral (BA 44/45) and orbitobasal (BA 47) frontal areas (Wildgruber et al., 2005). However, some brain structures, such as the amygdala and basal ganglia, seem to contribute to the identification of both affective prosody and emotional facial expressions (Cancelliere & Kertesz, 1990; Heilman & Gilmore, 1998; Sander et al., 2005). Therefore, functional imaging studies investigating the neuroanatomical correlates of emotional perception may be able to establish which neural systems are most critical for the recognition of emotional stimuli.

In our study, a positive correlation between the facial and prosodic affect tests was obtained for the RHD subjects. This finding is consistent with previous results suggesting an association between facial and prosodic communication channels in RHD patients (Borod et al., 1990; Zgaljardic et al., 2002). The association between modalities revealed in this study gives further support to the hypothesis that a general emotional processor mediates the perception of facial and prosodic affect.

This study has several limitations that need to be considered. Although the cognitive state of all participants was assessed with the MMSE, we cannot fully rule out the possibility that subtle perceptual/cognitive deficits contributed, at least in part, to the emotion-specific deficits. We did not test a control group with left hemisphere damage, and therefore we cannot conclude with certainty that our findings are caused by specific effects of right hemisphere stroke as opposed to non-specific effects of central nervous system injury. Another weakness of our study is that some of the results may have been caused by the tests not being completely matched in overall difficulty level. Unfortunately, this limitation may be a factor in many previous studies as well, particularly with respect to the prosodic channel (for review, see Borod, 1992). Additionally, we cannot fully rule out the possibility that the differences between happiness and sadness obtained in our study are due to the particular stimuli used.

Taking into account all the limitations mentioned above, future studies may benefit not only from more comprehensive analysis of neuroimaging data but also from more specific neuropsychological testing and from including an additional brain-damaged control group, particularly with LHD, to unequivocally explain the nature of emotional perception impairment after RHD. This study, however, does support the postulate that right hemisphere cerebral infarction leads to emotional perception-comprehension impairments in the visual-facial and auditory-prosodic channels. Our results also indicate that there is a significant relationship between the facial and prosodic channels, but the deficits in identification of emotions do not seem to be equally severe for different types of emotions across these channels, suggesting some modality specificity. These results suggest that, although a general emotional perception processor might exist, the identification of specific emotions depends mainly on the characteristic features of the emotional stimuli.

ACKNOWLEDGMENTS

A portion of this study was presented at the 33rd annual meeting of the International Neuropsychological Society in St. Louis, Missouri, February 2005. The authors thank Dr. Kathleen Haaland and three anonymous reviewers for their helpful comments on an earlier version of this manuscript. The authors acknowledge that this manuscript is original and is not currently under review by any other publication. This manuscript has never been published, either electronically or in print, and there is no financial relationship with any institution that could be perceived as a potential conflict of interest.

REFERENCES

Adolphs, R., Damasio, H., & Tranel, D. (2002). Neural systems for recognition of emotional prosody: A 3-D lesion study. Emotion, 2, 23–51.
Adolphs, R., Damasio, H., Tranel, D., Cooper, G., & Damasio, A.R. (2000). A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping. Journal of Neuroscience, 20, 2683–2690.
Adolphs, R., Tranel, D., & Damasio, H. (2001). Emotion recognition from faces and prosody following temporal lobectomy. Neuropsychology, 15, 396–404.
Akritas, M.G. & Arnold, S.F. (1994). Fully non-parametric hypotheses for factorial design. Part I: Multivariate repeated measures design. Journal of the American Statistical Association, 89, 336–343.
Barrett, A.M., Crucian, G.P., Raymer, A.M., & Heilman, K.M. (1999). Spared comprehension of emotional prosody in a patient with global aphasia. Neuropsychiatry, Neuropsychology and Behavioral Neurology, 12, 117–120.
Blonder, L.X., Bowers, D., & Heilman, K.M. (1991). The role of the right hemisphere in emotional communication. Brain, 114, 1115–1127.
Borod, J.C. (1992). Interhemispheric and intrahemispheric control of emotion: A focus on unilateral brain damage. Journal of Consulting and Clinical Psychology, 60, 339–348.
Borod, J.C. (Ed.). (2000). Neuropsychology of Emotion. New York–Oxford: Oxford University Press.
Borod, J.C., Obler, L.K., Erhan, H.M., Grunwald, I.S., Cicero, B.A., Welkowitz, J., Santschi, C., Agosti, R.M., & Whalen, J.R. (1998). Right hemisphere emotional perception: Evidence across multiple channels. Neuropsychology, 12, 446–458.
Borod, J.C., Pick, L.H., Hall, S., Sliwinski, M., Madigan, N., Obler, L.K., Welkowitz, J., Canino, E., Erhan, H.M., Goral, M., Morrison, C., & Tabert, M. (2000). Relationships among facial, prosodic, and lexical channels of emotional perceptual processing. Cognition and Emotion, 14, 193–211.
Borod, J.C., Welkowitz, J., Alpert, M., Brozgold, A.Z., Martin, C., Peselow, E., & Diller, L. (1990). Parameters of emotional processing in neuropsychiatric disorders: Conceptual issues and a battery of tests. Journal of Communication Disorders, 23, 247–271.
Bowers, D., Bauer, R.M., Coslett, H.B., & Heilman, K.M. (1985). Processing of faces by patients with unilateral hemisphere lesions. I. Dissociation between judgments of facial affect and facial identity. Brain and Cognition, 4, 258–272.
Bowers, D., Blonder, L., Feinberg, T., & Heilman, K.M. (1991). Differential impact of right and left hemisphere lesions on facial emotion versus object imagery. Brain, 114, 2593–2609.
Bowers, D., Coslett, H.B., Bauer, R.M., Speedie, L.J., & Heilman, K.M. (1987). Comprehension of emotional prosody following unilateral hemispheric lesions: Processing defect versus distraction defect. Neuropsychologia, 25, 317–328.
Bryan, K.L. (1995). The Right Hemisphere Language Battery (2nd ed.). London: Whurr Publishers.
Buck, R. (1984). Communication of Emotion. New York: Guilford Press.
Buck, R. & Duffy, R.J. (1980). Nonverbal communication of affect in brain-damaged patients. Cortex, 16, 351–362.
Cancelliere, A.E. & Kertesz, A. (1990). Lesion localization in acquired deficits of emotional expression and comprehension. Brain and Cognition, 13, 133–147.
Davidson, R.J., Ekman, P., Saron, C.D., Senulis, J.A., & Friesen, W.V. (1990). Approach-withdrawal and cerebral asymmetry: Emotional expression and brain physiology: I. Journal of Personality and Social Psychology, 58, 330–341.
Davidson, R.J., Shackman, A.J., & Maxwell, J.S. (2004). Asymmetries in face and brain related to emotion. Trends in Cognitive Sciences, 8, 389–391.
DeKosky, S.T., Heilman, K.M., Bowers, D., & Valenstein, E. (1980). Recognition and discrimination of emotional faces and pictures. Brain and Language, 9, 206–214.
Ekman, P. & Friesen, W.V. (1976). Measuring facial movements. Journal of Environmental Psychology, 1, 56–75.
Esslen, M., Pascual-Marqui, R.D., Hell, D., Kochi, K., & Lehmann, D. (2004). Brain areas and time course of emotional processing. Neuroimage, 21, 1189–1203.
Etcoff, N. (1986). The neuropsychology of emotional expression. In G. Goldstein & R.E. Tarter (Eds.), Advances in Clinical Neuropsychology. New York: Plenum Press.
Gainotti, G. (2000). Neuropsychological theories of emotion. In J.C. Borod (Ed.), Neuropsychology of Emotion. New York–Oxford: Oxford University Press.
Habib, M. (1986). Visual hypoemotionality and prosopagnosia associated with right temporal lobe isolation. Neuropsychologia, 24, 577–582.
Heilman, K.M. & Gilmore, R.L. (1998). Cortical influences in emotion. Journal of Clinical Neurophysiology, 15, 409–423.
Heilman, K.M., Scholes, R., & Watson, R.T. (1975). Auditory affective agnosia. Disturbed comprehension of affective speech. Journal of Neurology, Neurosurgery and Psychiatry, 38, 69–72.
Heilman, K.M., Blonder, L.X., Bowers, D., & Valenstein, E. (2003). Emotional disorders associated with neurological diseases. In K.M. Heilman & E. Valenstein (Eds.), Clinical Neuropsychology. New York–Oxford: Oxford University Press.
Kucharska-Pietura, K. & Klimkowski, M. (2002). Clinical Aspects of Emotions in Both Healthy and Injured Brain. Cracow: Medical Publisher (in Polish).
Kucharska-Pietura, K., Phillips, M.L., Gernand, W., & David, A.S. (2003). Perception of emotions from faces and voices following unilateral brain damage. Neuropsychologia, 41, 1082–1090.
Lane, R.D., Reiman, E.M., Ahern, G.L., Schwartz, G.E., & Davidson, R.J. (1997). Neuroanatomical correlates of happiness, sadness, and disgust. American Journal of Psychiatry, 154, 926–933.
Lezak, M. (1995). Neuropsychological Assessment. New York–Oxford: Oxford University Press.
Lojek, E. (in press). The Right Hemisphere Language Battery—Polish version (RHLB-PL). Manual. Warsaw: Psychological Test Laboratory of the Polish Psychological Association (in Polish).
Lojek, E., Skotnicka, M., & Bryan, K.L. (2000). A battery of neuropsychological tests for the assessment of language disorders in right brain damaged patients: Preliminary results. Polish Psychological Bulletin, 31, 279–289.
Mandal, M.K., Borod, J.C., Asthana, H.S., Mohanty, A., Mohanty, S., & Koff, E. (1999). Effects of lesion variables and emotion type on the perception of facial emotion. The Journal of Nervous and Mental Disease, 187, 603–609.
Pell, M.D. (1996). On the receptive prosodic loss in Parkinson's disease. Cortex, 32, 693–704.
Pell, M.D. & Baum, S.R. (1997). The ability to perceive and comprehend intonation in linguistic and affective contexts by brain-damaged adults. Brain and Language, 57, 80–99.
Posamentier, M.T. & Abdi, H. (2003). Processing faces and facial expressions. Neuropsychology Review, 13, 113–143.
Pourtois, G., de Gelder, B., Bol, A., & Crommelinck, M. (2005). Perception of facial expressions and voices and of their combination in the human brain. Cortex, 41, 49–59.
Reuter-Lorenz, P. & Davidson, R.J. (1981). Differential contributions of the two cerebral hemispheres to the perception of happy and sad faces. Neuropsychologia, 19, 609–613.
Richardson, C.K., Bowers, D., Bauer, R.M., Heilman, K.M., & Leonard, C.M. (2000). Digitizing the moving face during dynamic displays of emotion. Neuropsychologia, 38, 1028–1039.
Robin, D.A., Tranel, D., & Damasio, H. (1990). Auditory perception of temporal and spectral events in patients with focal left and right cerebral lesions. Brain and Language, 39, 539–555.
Ross, E.D. (1981). The aprosodias. Functional-anatomic organization of the affective components of language in the right hemisphere. Archives of Neurology, 38, 561–569.
Sander, D., Grandjean, D., Pourtois, G., Schwartz, S., Seghier, M.L., Scherer, K.R., & Vuilleumier, P. (2005). Emotion and attention interactions in social cognition: Brain regions involved in processing anger prosody. Neuroimage, 28, 848–858.
Schmidt, J., Borod, J., Foldi, N.S., & Sheth, S.J. (2004). Acoustical analysis and human ratings of emotional prosody in unilateral stroke (abstract). In 32nd INS Annual Meeting Program and Abstracts, p. 152. Baltimore.
Schmitt, J.J., Hartje, W., & Willmes, K. (1997). Hemispheric asymmetry in the recognition of emotional attitude conveyed by facial expression, prosody and propositional speech. Cortex, 33, 65–81.
Tucker, D.M., Watson, R.T., & Heilman, K.M. (1977). Discrimination and evocation of affectively intoned speech in patients with right parietal disease. Neurology, 27, 947–950.
Van Lancker, D. & Sidtis, J.J. (1992). The identification of affective-prosodic stimuli by left- and right-hemisphere-damaged subjects: All errors are not created equal. Journal of Speech and Hearing Research, 35, 963–970.
Wildgruber, D., Riecker, A., Hertrich, I., Erb, M., Grodd, W., Ethofer, T., & Ackermann, H. (2005). Identification of emotional intonation evaluated by fMRI. Neuroimage, 24, 1233–1241.
Wymer, J.H., Lindan, L.S., & Booksh, R.L. (2002). A neuropsychological perspective of aprosody: Features, function, assessment, and treatment. Applied Neuropsychology, 9, 37–47.
Zatorre, R.J. (1997). Functional neuroimaging in the study of the human auditory cortex. AJNR: American Journal of Neuroradiology, 18, 621–623.
Zatorre, R.J., Evans, A.C., Meyer, E., & Gjedde, A. (1992). Lateralization of phonetic and pitch discrimination in speech processing. Science, 256, 846–849.
Zgaljardic, D.J., Borod, J.C., & Sliwinski, M. (2002). Emotional perception in unilateral stroke patients: Recovery, test stability, and interchannel relationships. Applied Neuropsychology, 9, 159–172.