Matter becomes mind when sculpted into a computer. To understand what our minds are doing, then, we need to understand the computations they perform. The target article by Pietraszewski illustrates the clarity won by computational theories, using the example of group conflict. I provide a complementary example using group cooperation.
As Pietraszewski points out, researchers often get caught up in the definitions that their methods make obvious. Consider research on group cooperation that uses the public goods game. The game reveals the tension between choosing what's good for your group and choosing what's good for you. Players start with a stake of real money that they can contribute to a collective fund. Contributions are multiplied; for instance, every $1 contributed becomes $2 in the fund. The fund is then divided equally among the players regardless of whether a player contributed. This simulates how people create greater benefits when they cooperate and how they can exploit one another: Contributing everything to the fund is collectively best, but each individual is even better off free riding on others' contributions. Cooperation won't last long if free riders persist, so people need to catch them and change their behavior or exclude them from the group.
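To make the incentive structure concrete, here is a minimal sketch of the game's payoff arithmetic (the four players, $10 stakes, and doubling multiplier are illustrative assumptions, not parameters from any particular study):

```python
# Minimal sketch of public goods game payoffs. The specific numbers
# (stakes, multiplier, group size) are illustrative assumptions.

def payoffs(stakes, contributions, multiplier=2.0):
    """Each player's payoff: what they kept plus an equal share
    of the multiplied collective fund."""
    fund = multiplier * sum(contributions)
    share = fund / len(contributions)
    return [s - c + share for s, c in zip(stakes, contributions)]

# Four players, $10 each. Full cooperation doubles everyone's money...
print(payoffs([10] * 4, [10, 10, 10, 10]))  # [20.0, 20.0, 20.0, 20.0]
# ...but a lone free rider does best of all, at the contributors' expense.
print(payoffs([10] * 4, [0, 10, 10, 10]))   # [25.0, 15.0, 15.0, 15.0]
```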
When researchers use this game, they define a free rider as a player who (a) contributes less than others while (b) taking collective benefits. Using a computational lens, I've shown that neither (a) nor (b) is necessary or sufficient for the mind to categorize a person as a free rider; the mind's computations are more subtle (Delton, Cosmides, Guemo, Robertson, & Tooby, 2012; Delton, Nemirow, Robertson, Cimino, & Cosmides, 2013; Delton & Robertson, 2012; Delton & Sell, 2014). First, contributing less does not, in and of itself, matter; someone who contributes less by accident is still viewed as a cooperator. Second, a person can be categorized as a free rider even when contributing equally – so long as they wanted to exploit the group. Third, a person need not actually take collective benefits to be a free rider; the mere possibility that they might later exploit the group is enough. Fourth, some cooperation functions as mutual aid. In relationships of mutual aid, people can take collective benefits without contributing, such as during illness or injury (Gurven, 2004; Sugiyama, 2004); treating these people as free riders would defeat the purpose of social insurance. The mind's definition of free rider does not neatly map onto definitions based on typical games. The computational theories Pietraszewski champions reveal what the mind is really up to.
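The contrast between the game-theoretic definition and the mind's definition can be summarized as decision logic. The sketch below is my own schematic rendering of the four findings above, not a model from the cited papers; every field name is a hypothetical placeholder:

```python
from dataclasses import dataclass

# Schematic rendering of the four findings above (my construction, not
# the cited papers' model). All field names are hypothetical placeholders.

@dataclass
class Person:
    contributed_less: bool = False
    shortfall_was_accidental: bool = False
    intends_to_exploit: bool = False
    in_mutual_aid_relationship: bool = False
    unable_to_contribute: bool = False

def categorized_as_free_rider(p: Person) -> bool:
    # Mutual aid: taking benefits while unable to contribute (illness,
    # injury) is insured against, not punished as free riding.
    if p.in_mutual_aid_relationship and p.unable_to_contribute:
        return False
    # Accidental undercontribution leaves the person a cooperator.
    if p.contributed_less and p.shortfall_was_accidental:
        return False
    # Exploitative intent suffices: it triggers the category even with
    # equal contributions and before any benefits are actually taken.
    return p.intends_to_exploit

# Equal contributor who nonetheless wanted to exploit the group:
print(categorized_as_free_rider(Person(intends_to_exploit=True)))  # True
# Accidental undercontributor: still seen as a cooperator.
print(categorized_as_free_rider(
    Person(contributed_less=True, shortfall_was_accidental=True)))  # False
```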
Despite my enthusiasm for Pietraszewski's approach, perhaps he is being too hard on past ideas of what makes a group to the mind. I agree the containment metaphor fails. But it's less obvious that obligations or interdependence are poor theories when fleshed out beyond one-word summaries (Balliet, Tybur, & Van Lange, 2017). Pietraszewski argues that these ideas rely too much on intuition. His own theory, however, rests on an intuitive idea: the concept of cost. Costs are not out there in the world; the mind must compute them. Daniel Kahneman and Amos Tversky famously pointed out the special psychology of costs and losses (e.g., Tversky & Kahneman, 1992). Even the most obvious cost – damaging the body – is not straightforward: If someone punctures your skin, isn't that a cost? Not when they're a doctor injecting a vaccine or restarting a stopped heart. True, the needle and scalpel hurt, but they are unpleasant means to life-giving ends. One clue that something is a cost is that we get angry at the person who inflicts it (for a computational theory of anger and costs, see Sell et al., 2017). A patient doesn't get angry at the doctor who saved their life, even if at knifepoint. We lack a complete theory of how the mind defines costs (or anything else!). Given our ignorance, every computational theory leans on a few black boxes. Is a black box for costs much different from failing to have a complete theory of obligations or interdependence?
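As a toy illustration of the point (my construction, not Pietraszewski's or Sell et al.'s model): whether an act registers as a cost cannot be read off surface features like tissue damage; it depends on a computed judgment of the act's net effect on welfare.

```python
# Toy illustration only (my construction): a cost is not a surface
# feature of an act but a computed judgment about net welfare effects.

def registers_as_cost(expected_welfare_change: float) -> bool:
    # Tissue damage, pain, and so on are inputs to this computation,
    # not the computation itself; only the net effect decides.
    return expected_welfare_change < 0

# A vaccine injection damages tissue yet raises expected welfare:
print(registers_as_cost(+5.0))  # False -> no anger at the doctor
# A malicious stab causes similar damage but lowers welfare:
print(registers_as_cost(-5.0))  # True  -> anger is triggered
```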
Finally, Pietraszewski wavers on whether conflict is required to make a group. Usually, he writes that he is only talking about groups-in-conflict; other times he seems to be referring to groups full stop. For instance, he lists "ancillary attributes" of groups – features that often go along with groups but do not define a group in the mind. Ancillary attributes include working together or sharing interests. I'm not sure such attributes are ancillary. What if there were groups without group conflict? Imagine a fantasy land where the only evolved function of groups is mutual aid. Even here, I would wager, the mind would still evolve the ability to see groups out in the world. If Jenny, Claire, and Raj help one another during illness – bringing food, paying bills, and so on – others would find it useful to know about this relationship. Although relationships of mutual aid have boundaries (there's that containment metaphor), their function is not to fight people outside the group but to help people inside it. Humans did evolve to compete in groups over status and resources; elsewhere I've argued, like Pietraszewski, that the mind evolved concepts that enable this competition (Cimino & Delton, 2010; Delton & Cimino, 2010; Delton, Kane, Petersen, Robertson, & Cosmides, 2021; Delton & Krasnow, 2017; Delton, Petersen, & Robertson, 2018). Human conflict often involves groups, groups often get into conflict, but groups and conflict are not identical.
Financial support
This research received no specific grant from any funding agency, commercial, or not-for-profit sectors.
Conflict of interest
None.