
Relational mentalizing after any representation

Published online by Cambridge University Press:  19 November 2021

Eliane Deschrijver*
Affiliation:
Department of Experimental Psychology, Ghent University, Henri Dunantlaan 2, 9000, Ghent, Belgium School of Psychology, University of New South Wales (UNSW), Library Walk, Kensington, NSW2033, Australia. e.deschrijver@unsw.edu.au; www.elianedeschrijver.com

Abstract

Autistic, developmental, and nonhuman primate populations fail tasks that are thought to involve attributing beliefs, but not those thought to reflect the representation of knowledge. Instead of knowledge representations being more basic than belief representations, relational mentalizing may explain these observations: The tasks referred to as reflecting “belief” representation, but not the “knowledge” representation tasks, are social conflict designs. They involve mental conflict monitoring after another's mental state is represented – with effects that need to be accounted for.

Type
Open Peer Commentary
Copyright
Copyright © The Author(s), 2021. Published by Cambridge University Press

The ability to represent others' mental states has been thought of as social cognition's primary building block for over 40 years (Apperly, Reference Apperly2010; Premack & Woodruff, Reference Premack and Woodruff1978). Phillips et al. take this one step further in their impressive study of interdisciplinary thought: They argue that knowledge representation is a fundamentally different and more basic process than belief representation. This claim is based, in large part, on the observation that nonhuman primates (sect. 4.1 in Phillips et al.) as well as developmental (sect. 4.2) and clinical (sect. 4.4) populations show atypical behavior in false-belief tasks, despite their intact performance in tasks that may tap into the representation of knowledge.

A basic premise underlying the authors' argument is that atypical performance in a false-belief task arises from a lack of ability to represent another's belief (Dennett, Reference Dennett1978). But is this necessarily the case? We recently pointed out in our newly developed relational mentalizing framework (Deschrijver & Palmer, Reference Deschrijver and Palmer2020) that although passing a false-belief task is sufficient to conclude that a belief representation ability is present, failing one does not provide evidence that it is absent. False-belief tasks are social conflict designs: The other typically holds a mental state that is manipulated to be irreconcilable with one's own (e.g., they may think that the ball is in the basket, whereas you think it is in the box). To resolve the mental conflict that arises from having to represent both mental states, a neural mechanism may need to inhibit one of the competing representations before focusing on the other (i.e., mental conflict monitoring; Deschrijver & Palmer, Reference Deschrijver and Palmer2020). If this mechanism fails to inhibit one's own misaligned mental state, for example in a young child, this may manifest as an inability to verbalize or show sensitivity to the other's false belief, despite an intact ability to represent both states. Even neurotypical adults, who undoubtedly represent others' mental states, can make errors in a false-belief task if they fail to suppress their own mental representation, suggesting that the mechanism indeed exists (Keysar, Lin, & Barr, Reference Keysar, Lin and Barr2003; for other evidence, see Deschrijver & Palmer, Reference Deschrijver and Palmer2020).
Ineffective mental conflict monitoring may also be reflected in more interference by the other's belief if the task requires you to inhibit the representation of the other's mental state to focus on your own, as reported in some adults on the spectrum (Deschrijver, Bardi, Wiersema, & Brass, Reference Deschrijver, Bardi, Wiersema and Brass2016). Developmental (e.g., Kovács et al., Reference Kovács, Téglás and Endress2010), nonhuman primate (e.g., Martin & Santos, Reference Martin and Santos2014), and autistic (e.g., Deschrijver et al., Reference Deschrijver, Bardi, Wiersema and Brass2016) populations may thus show atypical results in false-belief tasks even while representing the other's belief.

The tasks identified by Phillips et al. as showing intact “knowledge” representation in young, autistic, and nonhuman primate populations broadly follow two methodological designs: First, they assess the understanding of another's ignorance (e.g., Flombaum & Santos, Reference Flombaum and Santos2005; Luo & Johnson, Reference Luo and Johnson2009; Santos, Nissen, & Ferrugia, Reference Santos, Nissen and Ferrugia2006), meaning that the other does not have knowledge about the object of interest. Second, they investigate whether one understands that the other has knowledge (a mental state that is true) about the object's location, with the observer either having or not having access to this other's knowledge (e.g., Behne, Liszkowski, Carpenter, & Tomasello, Reference Behne, Liszkowski, Carpenter and Tomasello2012; Luo & Johnson, Reference Luo and Johnson2009). To solve these tasks successfully, there is no need for the brain to engage in mental conflict monitoring regarding the object's location: Either there is no other-related mental state to be represented (and, therefore, it cannot clash with one's own), or the other's mental state aligns with one's own. Populations that do represent others' mental states, but are unable to deal with mental conflict, should hence not experience any difficulties.
Consistent with this, difficulties arise if the other's (unknown) knowledge starts misaligning with one's own understanding of the world (e.g., Krachun, Carpenter, Call, & Tomasello, Reference Krachun, Carpenter, Call and Tomasello2009), and when the design temporarily involves a manipulation of conflict between the other's and the participant's understanding of reality (e.g., where the object is ostentatiously relocated without the other seeing it, but then put back; Fabricius, Boyer, Weimer, & Carroll, Reference Fabricius, Boyer, Weimer and Carroll2010; Gettier, Reference Gettier1963; Horschler, Santos, & MacLean, Reference Horschler, Santos and MacLean2019; Kaminski, Call, & Tomasello, Reference Kaminski, Call and Tomasello2008). What Phillips et al. consider to be a true belief design, with representations that do not qualify as “knowledge,” thus involves a short-term manipulation of mental conflict as well. This means that atypical results in such “true belief” tasks may be attributable to mental conflict monitoring rather than belief representation issues in these populations, too. To show that the distinction between representing another's belief versus knowledge consists of more than semantics, the field may thus need to show that populations such as young children, nonhuman primates, and individuals on the spectrum continue to perform well in “knowledge” tasks that do contain mental conflict (e.g., Samson et al., Reference Samson, Apperly, Braithwaite, Andrews and Bodley Scott2010; see Table 3 in Deschrijver & Palmer, Reference Deschrijver and Palmer2020, for optimal dependent measures), and fail in (true and false) “belief” tasks that don't.
Mental conflict monitoring is also likely effortful, using cognitive resources and showing relationships with executive functions (Carlson, Reference Carlson2010; Carlson, Mandell, & Williams, Reference Carlson, Mandell and Williams2004a; Carlson & Moses, Reference Carlson and Moses2001; Carlson, Moses, & Breton, Reference Carlson, Moses and Breton2002; Carlson, Moses, & Claxton, Reference Carlson, Moses and Claxton2004b). This may result in seemingly less automatic or slower responses in false-belief designs versus those knowledge designs that do not involve mental conflict (see sects. 4.3 and 5.3 in Phillips et al.). From all these arguments, it follows that differential performance of autistic, developmental, and nonhuman primate populations in “belief” versus “knowledge” tasks could be seen not as a consequence of these tasks tapping into two different types of representations that are differentially affected (i.e., “knowledge” versus “beliefs”), but rather as an indication that these populations have issues with solving mental conflict instead of with attributing any representation to others per se.

Representing another's belief versus knowledge is thus not as easily dissociable as the authors may want to portray: If the two representation mechanisms are truly distinct, shouldn't one always be able to assess the truthfulness of another's mental state before representing it? This seems challenging in the real world, where others' mental states are more complex than the ones typically used in the mentalizing domain. Regardless of which party holds the facts, however, mental conflict may arise after representing another's position if it is misaligned. In sum, when taking into account mental conflict monitoring, the idea that the representation of beliefs and knowledge are two fundamentally different things, with one being more basic than the other, may be on shaky ground.

Financial support

The author received funding from the Research Foundation Flanders – FWO (postdoctoral fellowship).

Conflict of interest

None.

References

Apperly, I. A. (2010). Mindreaders: The cognitive basis of “theory of mind.” Psychology Press. http://doi.org/10.4324/9780203833926
Behne, T., Liszkowski, U., Carpenter, M., & Tomasello, M. (2012). Twelve-month-olds’ comprehension and production of pointing. British Journal of Developmental Psychology, 30, 359–375.
Carlson, S. M. (2010). Developmentally sensitive measures of executive function in preschool children. Developmental Neuropsychology, 28, 37–41. http://dx.doi.org/10.1207/s15326942dn2802_3
Carlson, S. M., Mandell, D. J., & Williams, L. (2004a). Executive function and theory of mind: Stability and prediction from ages 2 to 3. Developmental Psychology, 40, 1105–1122. http://dx.doi.org/10.1037/0012-1649.40.6.1105
Carlson, S. M., & Moses, L. J. (2001). Individual differences in inhibitory control and children's theory of mind. Child Development, 72, 1032–1053. http://dx.doi.org/10.1111/1467-8624.00333
Carlson, S. M., Moses, L. J., & Breton, C. (2002). How specific is the relation between executive function and theory of mind? Contributions of inhibitory control and working memory. Infant and Child Development, 11, 73–92. http://dx.doi.org/10.1002/icd.298
Carlson, S. M., Moses, L. J., & Claxton, L. J. (2004b). Individual differences in executive functioning and theory of mind: An investigation of inhibitory control and planning ability. Journal of Experimental Child Psychology, 87, 299–319. http://dx.doi.org/10.1016/j.jecp.2004.01.002
Dennett, D. C. (1978). Beliefs about beliefs. Behavioral and Brain Sciences, 1, 568–570. http://dx.doi.org/10.1017/S0140525X00076664
Deschrijver, E., Bardi, L., Wiersema, J. R., & Brass, M. (2016). Behavioral measures of implicit theory of mind in adults with high functioning autism. Cognitive Neuroscience, 7, 192–202. http://dx.doi.org/10.1080/17588928.2015.1085375
Deschrijver, E., & Palmer, C. (2020). Reframing social cognition: Relational versus representational mentalizing. Psychological Bulletin, 146(11), 941–969. https://doi.org/10.1037/bul0000302
Fabricius, W. V., Boyer, T. W., Weimer, A. A., & Carroll, K. (2010). True or false: Do 5-year-olds understand belief? Developmental Psychology, 46(6), 1402.
Flombaum, J. I., & Santos, L. R. (2005). Rhesus monkeys attribute perceptions to others. Current Biology, 15, 447–452.
Gettier, E. (1963). Is justified true belief knowledge? Analysis, 23(6), 121–123.
Horschler, D. J., Santos, L. R., & MacLean, E. L. (2019). Do non-human primates really represent others’ ignorance? A test of the awareness relations hypothesis. Cognition, 190, 72–80.
Kaminski, J., Call, J., & Tomasello, M. (2008). Chimpanzees know what others know, but not what they believe. Cognition, 109, 224–234.
Keysar, B., Lin, S., & Barr, D. J. (2003). Limits on theory of mind use in adults. Cognition, 89, 25–41. http://dx.doi.org/10.1016/S0010-0277(03)00064-7
Kovács, Á. M., Téglás, E., & Endress, A. D. (2010). The social sense: Susceptibility to others' beliefs in human infants and adults. Science, 330, 1830–1834.
Krachun, C., Carpenter, M., Call, J., & Tomasello, M. (2009). A competitive nonverbal false belief task for children and apes. Developmental Science, 12(4), 521–535.
Luo, Y., & Johnson, S. C. (2009). Recognizing the role of perception in action at 6 months. Developmental Science, 12(1), 142–149.
Martin, A., & Santos, L. R. (2014). The origins of belief representation: Monkeys fail to automatically represent others’ beliefs. Cognition, 130, 300–308. http://dx.doi.org/10.1016/j.cognition.2013.11.016
Premack, D., & Woodruff, G. (1978). Does the chimpanzee have a theory of mind? Behavioral and Brain Sciences, 1, 515–526. http://dx.doi.org/10.1017/S0140525X00076512
Samson, D., Apperly, I. A., Braithwaite, J. J., Andrews, B. J., & Bodley Scott, S. E. (2010). Seeing it their way: Evidence for rapid and involuntary computation of what other people see. Journal of Experimental Psychology: Human Perception and Performance, 36(5), 1255–1266. doi: 10.1037/a0018729
Santos, L. R., Nissen, A. G., & Ferrugia, J. (2006). Rhesus monkeys (Macaca mulatta) know what others can and cannot hear. Animal Behaviour, 71, 1175–1181.