INTRODUCTION
Traditionally, the focus of cognitive neuropsychology has been on the functions subserved by cortical gray matter. However, more recent models of brain function recognize that the connectivity provided by white matter is critical for information transfer between distributed neural networks, and consequently, it is also central to the efficient operation of higher brain function. Indeed, Filley (2005) has argued that white matter neuropathology is a primary or prominent feature of more than 100 clinical disorders, and may also be one mechanism underpinning age-related change in cognitive and emotional processing. Recent empirical studies are consistent with this possibility, showing that white matter pathology may be relevant to understanding age-related deficits in working memory (Nordahl et al., 2006), face perception (Thomas et al., 2008), and facial affect recognition (Ruffman et al., 2008).
White matter pathology has the potential to disrupt any facet of cognitive function, but appears to particularly affect operations subserved by frontal-subcortical networks, such as executive control, memory retrieval, information processing speed, working memory, and sustained attention (Beers et al., 1994; Filley, 2005; Stuss & Gow, 1992). This profile of cognitive impairment has been attributed to both the predilection white matter disease shows for frontal white matter, and the dense connectivity between frontal and other regions, with damaged white matter leading to slowed communication between distributed neural networks (Filley, 2005).
White matter pathology may also affect important aspects of emotion understanding, such as facial affect recognition or theory of mind (ToM). A wide range of neural systems is involved in recognizing facial expressions of emotion, with frontal and temporal systems particularly implicated (see Ruffman et al., 2008; Sprengelmeyer et al., 1998). Adolphs et al. (2000) have argued that the integrity of the white matter structures connecting these neuroanatomically distributed cortices is critical for decoding facial expressions of affect. Consistent with this possibility, Green et al. (2004) showed that individuals with traumatic brain injury (a group in whom diffuse axonal injury is often observed) were impaired on a measure of facial affect recognition, irrespective of whether they had sustained focal damage to the specific brain regions considered key for facial affect recognition. Deficits in ToM, the ability to understand others' thoughts and feelings, may also be anticipated in the context of white matter pathology, because ToM relies on widespread cortical neural networks that again include frontal and temporal systems (Apperly et al., 2004; Decety & Jackson, 2004). It is of note that functional and anatomical neural underconnectivity has been identified in autism, a disorder for which ToM impairment is one of the defining features (see Just et al., 2007).
It is therefore surprising that, although multiple sclerosis (MS) is a prototype white matter disorder, and specifically, “the most important demyelinative disease of the central nervous system” (Filley, 2005, p. 163), no study to date has assessed how MS affects capacity for important emotion understanding skills such as facial affect recognition or ToM. This omission is made yet more surprising given that MS is associated with neurocognitive difficulties that are typical of white matter disease, with deficits most consistently observed on measures sensitive to memory retrieval, processing speed, and executive control (Arnett et al., 1997; Beatty et al., 1989; Bobholz & Rao, 2003; Bobholz et al., 2006; Henry & Beatty, 2006). Furthermore, in the only study to date that assessed the ability to comprehend emotional information, Beatty et al. (2003) found that individuals with MS were impaired at identifying emotional states from prosodic cues, and suggested that, “For some patients with MS, deficits in comprehending emotional information may contribute to their difficulties in maintaining effective social interactions” (p. 148). The primary aim of the present study is therefore to test whether capacity for facial affect recognition and ToM is disrupted in the context of MS.
In service of this goal, participants’ ability to identify basic emotions from photographs of faces was assessed, along with their ability to detect subtle differences in mental states from photographs of eyes. The former measure involves decoding basic emotions (facial affect recognition); the latter involves more complex emotions or thoughts that often concern social interaction (for example, distinctions include attraction or repulsion, friendly or hostile, and noticing you or ignoring you; the Mind in the Eyes test). Baron-Cohen et al. (1997) describe the Mind in the Eyes test as a test of mindreading.
Importantly, despite their apparent similarity (both measures involve being able to identify others’ emotions from facial information), there is evidence that these two measures tap dissociable abilities. Thus, although some studies have suggested that performance on ToM tasks is related to the ability to identify facial expressions of emotion (e.g., Buitelaar et al., 1999), in a study of normal adult aging, Phillips et al. (2002) found that older adults were significantly impaired on the Mind in the Eyes test, but not facial affect recognition, with performance on the two measures uncorrelated (r = .08). Furthermore, Henry et al. (2006) found that although individuals who had sustained traumatic brain injury exhibited a comparable level of impairment on both measures, only for the control group was performance on the two tasks correlated.
The second aim is to assess whether any observed difficulties in affect recognition and ToM are related to more general neurocognitive impairment. In particular, the ability to understand others’ thoughts, motivations, and emotions (i.e., ToM) imposes substantial demands on cognitive control processes, such as inhibition and mental flexibility (Apperly et al., 2004; Bailey & Henry, 2008; Decety & Jackson, 2004). Emotion labeling tasks have also been shown to place a high load on effortful strategic processing, and specifically, working-memory updating (Phillips et al., 2008). In the context of MS, therefore, any observed deficits in ToM and facial affect recognition may relate to more general cognitive difficulties. The present study will also test this possibility. Emotion understanding skills such as facial affect recognition and ToM are critical for interpersonal functioning, with difficulties understanding others’ emotions linked to various types of social function impairment, including reduced social competence, poor communication, reduced quality of life, and inappropriate social behaviour (Carton et al., 1999; Feldman et al., 1991; Spell & Frank, 2000). It is therefore important to understand how these processes are affected in MS, as well as the correlates of any observed impairment.
METHODS
Participants
For the MS sample, individuals were recruited by two different means: advertising in a newsletter sent to individuals registered with the MS Society of New South Wales and sending individualized invitation letters on behalf of the research team to 100 registered individuals who lived within close proximity to the testing sites, but who otherwise were randomly selected. There was a 25% return rate for the latter recruitment method. Registration with the MS Society of New South Wales requires that the applicant have a confirmed diagnosis of MS as determined by a state-registered neurologist.
The final sample included 27 participants with MS. The mean number of years since diagnosis was 7.0 (SD = 6.08). The mean age of symptom onset was 28.2 years (SD = 14.21), and the mean age at diagnosis was 39.6 years (SD = 10.55). Participants’ mean score on the Disease Steps (Hohol et al., 1995), which ranges from 0 (normal) to 6 (confined to wheelchair), was 1.9 (SD = 1.98), indicative of moderate mobility disability. The Disease Steps correlates strongly (r = .96) with the physician-encoded disability rating scale, the Kurtzke Expanded Disability Status Scale (EDSS; Hohol et al., 1995).
Control participants (n = 30) matched for age (±3 years) and education were recruited from the general community via advertisements and word of mouth. The mean age in years of the two groups did not differ significantly, t(55) = 0.99, p = .33. The MS group ranged in age from 26 to 64 years (M = 47.0, SD = 11.01) and the control group from 27 to 60 years (M = 44.3, SD = 9.55). Both the MS and control groups were predominantly women (66.7% and 63.3%, respectively), and did not significantly differ in the number of years of education [M = 15.0, SD = 3.44 and M = 14.8, SD = 2.57, respectively; t(55) = 0.32, p = .75]. Overall, the characteristics of the MS participants were broadly comparable with those from a recent database sample including 2618 respondents with a diagnosis of MS in New South Wales (see Tribe et al., 2006).
Exclusion criteria for both groups were: (a) a history of neurological disease (other than MS for the MS participants), (b) a history of major psychiatric illness, (c) a premorbid history of alcohol or drug abuse, and (d) motor disturbances that would interfere with testing. Because both measures of emotion understanding used in the present study also impose demands on perceptual function, vision testing was conducted for the MS participants binocularly (with both eyes open) using standard Snellen acuity letter charts. Tests for ophthalmoplegia were also conducted. None of the MS participants included in the study had visual disturbances that would interfere with testing. Participants were paid AUD $30 (approximately USD $24). All participants provided informed consent, and the study was approved by the Royal Rehabilitation Centre Sydney Ethics Committee and was in compliance with the regulations of the University of New South Wales.
Procedure and Measures
All participants first provided demographic information, then completed a self-report measure of depression, and several cognitive measures. This was followed by measures of facial affect recognition and ToM, described in detail later.
The first cognitive measure used was the Screening Examination for Cognitive Impairment (SEFCI; Beatty et al., 1995). The SEFCI has high reliability and is sensitive to the presence of cognitive impairment in MS (Beatty et al., 1995). It was included in the present study to clarify the magnitude of any impairment exhibited by the MS group on the primary dependent measures of interest (facial affect recognition and the Mind in the Eyes test), relative to that observed on more routinely assessed cognitive domains. The SEFCI includes the following four cognitive measures:
1. Delayed recall. Participants were read a list of 10 unrelated nouns and asked to recall as many as possible. Three such learning trials were given. About 10 minutes after the last learning trial, a single delayed-recall trial was administered; performance on this delayed-recall trial (SEFCI short delay) was the dependent measure of interest from this component of the SEFCI.
2. Vocabulary. The 40-item multiple-choice vocabulary test from the Shipley Institute of Living Scale (SILS; Zachary, 1986) was used to index vocabulary. For each of the 40 target words, participants were asked to identify which of four words was closest in meaning. This measure is self-administered with a 10-minute time limit.
3. Abstraction. The 20-item serial completion test of verbal abstraction from the Shipley Institute of Living Scale (SILS; Zachary, 1986) was used to index abstracting ability. For each of the 20 items, participants are required to infer the missing final element of a sequence (e.g., given “escape scape cape ___”, the participant must abstract that the missing element is “ape”). The items become progressively harder. This measure is self-administered with a 10-minute time limit.
4. Information processing speed. The Symbol Digit Modalities Test (SDMT; Smith, 1982) requires participants to orally substitute digits for geometric symbols shown in a key at the top of the page. The dependent variable is the number of digits correctly substituted for symbols in 90 seconds. The SDMT provides an index of information processing speed.
Each participant also completed measures of phonemic and semantic fluency (using the probes F, A, and S, and animals, fruits, and vegetables, respectively). Tests of verbal fluency are among the most validated indicators of executive dysfunction, but they also impose substantial demands on information processing speed and language skills (Crawford & Henry, 2005; Henry & Crawford, 2004). In the context of MS specifically, measures of verbal fluency are among the most sensitive markers of cognitive dysfunction (Henry & Beatty, 2006). For each fluency probe, participants were given one minute to produce as many exemplars as possible. Participants’ responses were recorded on an audio-cassette recorder. The dependent measure was the total number of responses, minus repetitions and inappropriate responses.
The Geriatric Depression Scale (GDS; Yesavage et al., 1983) was used to measure depressive symptoms. Scores on this measure range from 0 to 30, with higher scores indicative of greater depression. The advantage of using the GDS in this population is that it minimizes reference to somatic symptoms that may be indicative of physical disability (rather than psychological distress); see Chalfont et al. (2004) and Mohr et al. (1997) for further details relating to use of the GDS in the context of MS specifically.
To index facial affect recognition, participants were presented with a sequence of 48 photographs from the black-and-white Ekman and Friesen (1976) stimulus set, eight each of anger, happiness, fear, disgust, sadness, and surprise. Participants are required to choose the label that best describes the emotion displayed in each face. These are the most widely used facial affect stimuli, and have been extensively validated. Percentage accuracy for each of the six target emotions was the dependent variable of interest.
The revised Mind in the Eyes test (Baron-Cohen et al., 2001) was used to index ToM. This measure requires participants to select which of four words best describes the thoughts or feelings expressed in 36 pictures of eyes. The Mind in the Eyes test differs from identifying basic facial expressions of emotion in that the distinctions made involve more complex emotional terms and often concern social interaction (e.g., distinctions include attraction or repulsion, friendly or hostile, and noticing you or ignoring you). The Mind in the Eyes test is sensitive to the presence of autism (Baron-Cohen et al., 1997; Kaland et al., 2008) and to localized amygdala damage (Adolphs et al., 2002).
RESULTS
In Table 1, the means and standard deviations and the results of inferential statistical tests comparing participants with MS and controls are presented for the SEFCI, GDS, Mind in the Eyes test, and measures of verbal fluency. Applying a Bonferroni correction to the eight pair-wise comparisons yields an adjusted critical p value of .006. Using this criterion, significant MS deficits were observed for SEFCI short delay, phonemic fluency, and the Mind in the Eyes test. None of the other group contrasts attained significance.
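For clarity, the adjusted criterion follows from dividing the nominal alpha level by the number of comparisons; this is stated here only as a worked check of the reported value:

α_adjusted = α / m = .05 / 8 = .00625 ≈ .006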
Table 1. Performance of control and multiple sclerosis (MS) participants on the measures of cognitive function, depression, and theory of mind (values reported are raw scores)
Note. SEFCI refers to the Screening Exam for Cognitive Impairment; SDMT refers to the Symbol Digit Modalities Test; GDS refers to the Geriatric Depression Scale.
Figure 1 shows the percentage correct for MS and control participants on the measure of facial affect recognition. These data were analyzed with a 2 × 6 mixed ANOVA with the between-subjects variable of group (MS, control) and the within-subjects variable of emotion type (anger, disgust, fear, sadness, surprise, happiness). There was no main effect of group, F(1, 55) = 2.19, p = .144, η² = .04, but there was an interaction between group and emotion type, F(5, 275) = 3.44, p = .005, η² = .06. To pursue the locus of this effect, and to identify which specific emotions differed between the two groups, a series of t tests was carried out. These revealed that recognition of anger (p = .024) and of fear (p = .031) was disrupted in the MS group, but that there were no group differences in the recognition of surprise (p = .51), sadness (p = .54), disgust (p = .29), or happiness (p = .15).
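As an illustration only, and not the authors’ analysis code, the following minimal sketch shows how such a 2 × 6 mixed ANOVA and the follow-up t tests could be computed from long-format data; the file name and the id, group, emotion, and accuracy columns are hypothetical.

```python
# Minimal sketch, assuming long-format data with one row per participant per
# emotion: columns 'id', 'group' ('MS' / 'control'), 'emotion', 'accuracy'.
import pandas as pd
import pingouin as pg
from scipy import stats

df = pd.read_csv("affect_recognition_long.csv")  # hypothetical file

# Group (between-subjects) x emotion type (within-subjects) mixed ANOVA
aov = pg.mixed_anova(data=df, dv="accuracy", within="emotion",
                     subject="id", between="group")
print(aov[["Source", "F", "p-unc", "np2"]])

# Follow-up independent-samples t tests comparing the groups for each emotion
for emotion, sub in df.groupby("emotion"):
    ms = sub.loc[sub["group"] == "MS", "accuracy"]
    ctrl = sub.loc[sub["group"] == "control", "accuracy"]
    t, p = stats.ttest_ind(ms, ctrl)
    print(f"{emotion}: t = {t:.2f}, p = {p:.3f}")
```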
Fig. 1. Percentage of correct identification for each of the six basic emotions for the control and multiple sclerosis (MS) participants.
Pearson product-moment correlations were then computed to assess whether performance on the measures of facial affect recognition and ToM were related—these indicated that performance on the two tasks was significantly correlated for both groups (rs = .47 and .61 for the control and MS groups, respectively). The scatterplots (with regression line included) depicting these relationships are presented as Figures 2a and 2b for the control and MS groups, respectively. Thus, there appears to be considerable overlap in the abilities tapped by these tasks, particularly for the MS group. However, although both of these correlations are substantial, the majority of variance tapped by each measure constitutes unique variance (i.e., over 75% and 60%, respectively). Consequently, for the final set of analyses reported below, the distinction between these two measures was retained.
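As a worked check of the unique-variance figures reported above (standard arithmetic rather than an additional analysis), the variance shared by two measures is given by r², so the unique variance is 1 − r²: 1 − .47² ≈ .78 for the control group and 1 − .61² ≈ .63 for the MS group.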
Fig. 2. Scatterplots (with regression lines included) depicting the relationships between performance on the measures of facial affect recognition and theory of mind for control and multiple sclerosis (MS) participants, respectively.
Finally, it was assessed whether performance on either of the emotion understanding measures was related to more general cognitive function or mood. These data are presented in Table 2, and show that for both groups, the most consistent associations were with measures that impose demands on executive control (abstraction and fluency) or processing speed (SDMT). In the MS group, poorer fluency scores were significantly associated with poorer Mind in the Eyes test performance, whereas processing speed was correlated with facial affect recognition. For the control group, poorer abstraction and fluency scores were related to performance on both tests, whereas processing speed correlated only with Mind in the Eyes test performance. Several of these relationships are depicted graphically as scatterplots in Figure 3, with regression lines included.
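Similarly, a minimal sketch (again, not the authors’ code) of how the group-wise correlations summarized in Table 2 could be computed is given below; all column names are hypothetical.

```python
# Minimal sketch, assuming wide-format data with one row per participant and
# hypothetical column names for each measure.
import pandas as pd
from scipy import stats

df = pd.read_csv("ms_study_scores.csv")  # hypothetical file

cognitive_measures = ["sefci_short_delay", "vocabulary", "abstraction",
                      "sdmt", "composite_fluency", "gds"]

for group, sub in df.groupby("group"):            # 'MS' vs. 'control'
    for outcome in ["eyes_test", "facial_affect"]:
        for measure in cognitive_measures:
            r, p = stats.pearsonr(sub[outcome], sub[measure])
            print(f"{group}: {outcome} x {measure}: r = {r:.2f}, p = {p:.3f}")
```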
Fig. 3. Scatterplots (with regression line included) depicting the relationships between performance on the following: (a) facial affect recognition and Symbol Digit Modalities Test in the multiple sclerosis (MS) group; Mind in the Eyes test and composite fluency in (b) the control group, and (c) the MS group; facial affect recognition and composite fluency in (d) the control group, and (e) the MS group.
Table 2. Correlations between Mind in the Eyes test and facial affect recognition (FAR) test performance with the other cognitive measures for the control and multiple sclerosis (MS) groups
Note. SEFCI = Screening Exam for Cognitive Impairment; SDMT = Symbol Digit Modalities Test; GDS = Geriatric Depression Scale.
† p = .08; *p < .05; **p < .01.
1. Because phonemic and semantic fluency were highly correlated (r = .51, p < .001), a composite fluency measure was created to reduce the number of analyses.
DISCUSSION
The present study assessed capacity for facial affect recognition and ToM in MS and provides clear evidence for increased difficulties in both these aspects of emotion understanding. Thus, relative to controls, participants with MS had greater difficulty detecting subtle differences in mental states from pictures of eyes, and although no overall group difference in facial affect recognition was identified, specific difficulties were observed in decoding two of the facial emotions (anger and fear). These data are therefore consistent with the broader literature showing that white matter pathology has the potential to disrupt not only cognitive function parameters, but also important social perceptual skills.
Furthermore, whilst cognitive abnormalities appear to be related to the elevated incidence of affective disturbance in MS (Arnett et al., 1999; Beeney & Arnett, 2008), these data are the first to show that cognitive ability also correlates with important aspects of emotion understanding. In the MS group, facial affect recognition was significantly correlated with the SDMT, and both measures of emotion understanding were related to verbal fluency. Because measures of verbal fluency are among the most validated measures of executive control (see, e.g., Crawford & Henry, 2005; Henry & Crawford, 2004), and the SDMT is sensitive to information processing speed, these data indicate that deficits in executive functioning and in the ability to process information quickly are related to emotion understanding difficulties in MS. Longitudinal work would represent an important supplement to the current findings, to assess whether these covariances reflect shared white matter disease in brain regions involved in both emotion understanding and cognitive processing, or whether cognitive difficulties themselves contribute to emotion understanding difficulties in this population.
The results of the present study also support the possibility that MS may particularly disrupt decoding of specific facial emotions. In the neuropsychological literature, considerable emphasis has been placed on the potential role of dissociable neural substrates in recognizing specific emotions (Adolphs et al., 1994; Calder et al., 2004), and there is evidence that different types of neural dysfunction can differentially influence the ability to recognize specific emotions. For example, Parkinson’s disease can cause specific impairment in labeling disgusted facial expressions, but relatively intact labeling of other emotions (Suzuki et al., 2006), whereas early Alzheimer’s disease can impair labeling of all emotions except disgust (Henry et al., 2008). Consequently, one possibility is that deficits in recognizing anger and fear may reflect specific neural involvement of the brain regions thought to be implicated in decoding these facial emotions in the participants sampled. Because the focality of MS lesions is highly variable, and lesions can affect any region, other MS groups may present with a different profile of emotion recognition difficulties. This possibility could be tested by directly mapping underlying MS-related neuropathology (and specifically, the focality of MS lesions) onto affect recognition performance in this group.
Limitations and Future Directions
A clear limitation of the present study was the absence of specific information relating to neuropathological change in the MS participants tested. Consequently, future research should directly identify possible links between underlying MS-related neuropathology, not only with facial affect recognition, but also with ToM. This seems particularly important given that, although white matter pathology is the primary neurological feature of MS, demyelination in cortical gray matter is also observed to a variable extent, and the focality of demyelinating lesions in MS can vary widely (Filley, 2005; Tedeschi et al., 2005).
These data also highlight the need for further research to explore the association between ToM and specific executive control processes that are not liable to be strongly tapped by measures of verbal fluency. In particular, reduced inhibitory control has been linked to ToM difficulties in other populations (e.g., Bailey & Henry, 2008; Samson et al., 2005), and concurrent engagement in an inhibitory task has been shown to cause specific problems decoding ToM as indexed by the Mind in the Eyes task (Bull et al., 2008). It would therefore be of considerable interest to assess whether this relationship is also observed in the context of MS.
ToM is a multifaceted construct, and Tager-Flusberg (2001) in particular distinguishes between social-perceptual and social-cognitive components. Whereas the former refers to the immediate online judgment of a person’s mental state based on perceptual features (i.e., facial and bodily cues), the social-cognitive component involves more complex cognitive inferences relating to the content of the mental state. The Mind in the Eyes test used in the present study provided a sensitive indicator of the social-perceptual component, involving immediate attribution of mental state, but did not impose demands on the social-cognitive component (i.e., successful task performance was not contingent on the ability to make inferences about why a particular mental state was experienced). While social-perceptual mental state attribution is an important component of ToM (see, e.g., Baron-Cohen et al., 2001; Tager-Flusberg, 2001), it would be of considerable interest in future research to assess whether more complex social-cognitive aspects of ToM are also disrupted, such as coordination of multiple perspectives or the integration of mental states with behaviors.
The present results suggest that task difficulty may be relevant to understanding at least some MS-related difficulties in decoding facial expressions of emotion and ToM, as performance on both tasks was related to measures of cognitive ability. Consequently, it may be anticipated that more subtle facial expressions of different emotions (as are liable to occur in everyday interpersonal interactions) would prove even more difficult for participants with MS, and would increase the magnitude of MS effects. Analogously, MS-related deficits may be greater on more complex social-cognitive aspects of ToM, relative to the social-perceptual aspects assessed in the present study. In future research that assesses the social-perceptual component of ToM, it would also be important to use a measure of this construct that does not require the ability to identify others’ emotions from facial information; as noted, in the present study, performance on the two measures of emotion understanding was substantially related, particularly in the MS group. Disentangling these components would permit assessment of whether there are MS-related ToM deficits that are not simply secondary to difficulties with facial affect recognition. Given the importance of facial affect recognition and other emotion understanding skills for the development and maintenance of close interpersonal relationships (Carstensen et al., 1998), the present data clearly highlight the need for future research that more fully delineates the extent and functional implications of MS-related deficits in this domain.
ACKNOWLEDGMENTS
This work is dedicated in memory of Professor William W. Beatty. The research was carried out with the financial support of University of New South Wales Faculty Research and Australian Research Council grants. The authors thank all of the participants who took part in this study and the staff at MS Australia ACT/NSW/VIC. The authors have no financial or other relationships that could be interpreted as a conflict of interest affecting this article.