Causality, compositionality, and learning-to-learn – the future goals for artificial intelligence articulated by Lake et al. – are central for human learning. But these abilities alone would not be enough to avoid frostbite on King William Island in the Arctic Archipelago. You need to know how to hunt seals, make skin clothing, and manage dog sleds, and these skills are not easy to acquire from the environment alone. But if the Netsilik Inuit people taught them to you, your chances of surviving a winter would be dramatically improved (Lambert Reference Lambert2011). Similar to a human explorer, an artificial intelligence (AI) learning to play video games like Frostbite should take advantage of the rich knowledge available from other people. Access to this knowledge requires the capacity for social learning, both a critical prerequisite for language use and a gateway in itself to cumulative cultural knowledge.
Learning from other people helps you learn with fewer data. In particular, humans learn effectively even from “small data” because the social context surrounding the data is itself informative. Dramatically different inferences can result from what is ostensibly the same data in distinct social contexts or even with alternative assumptions about the same context (Shafto et al. Reference Shafto, Goodman and Frank2012). The flexibility of the social inference machinery in humans turns small signals into weighty observations: Even for young children, ambiguous word-learning events become informative through social reasoning (Frank & Goodman Reference Frank and Goodman2014), non-obvious causal action sequences become “the way you do it” when presented pedagogically (Buchsbaum et al. Reference Buchsbaum, Gopnik, Griffiths and Shafto2011), and complex machines can become single-function tools when a learner is taught just one function (Bonawitz et al. Reference Bonawitz, Shafto, Gweon, Goodman, Spelke and Schulz2011).
Learning from others comes in many forms. An expert may tolerate onlookers, a demonstrator may slow down when completing a particularly challenging part of the task, and a teacher may actively provide pedagogical examples and describe them with language (Csibra & Gergely Reference Csibra and Gergely2009; Kline Reference Kline2015). Informative demonstrations may be particularly useful for procedural learning (e.g., hunting seals, learning to play Frostbite). Language, however, is uniquely powerful in its ability to convey information that is abstract, difficult to observe, or impossible to acquire safely through direct experience, such as learning that certain plants are poisonous or how to avoid frostbite (Gelman Reference Gelman2009). Studying social learning is an important part of studying language learning (Goodman & Frank Reference Goodman and Frank2016); both should be top priorities for making AIs learn like people.
Focusing on Lake et al.'s key example, you can even learn the game Frostbite with fewer data when you learn it from other people. We recruited 20 participants from Amazon's Mechanical Turk to play Frostbite for 5 minutes. Half of the participants were given written instructions about the abstract content of the game, adapted directly from the caption of Figure 2 in the target article. The other half were not given this information. (Everybody was told that you move the agent using the arrow keys.) Learners who were told about the abstract structure of the game learned to play the game more quickly and achieved higher overall scores (M = 2440) than the group without written instructions (M = 1333) (Figure 1). The highest score for those without linguistic instructions was 3530 points, achieved after about 4 minutes of play. By comparison, the highest score achieved with linguistic instructions was 7720 points, achieved after 2 minutes of play. Indeed, another group (including some authors of the target article) recently found a similar pattern of increased performance in Frostbite as a result of social guidance (Tsividis et al. Reference Tsividis, Pouncy, Xu, Tenenbaum and Gershman2017).
Figure 1. Score trajectories for players in the game Frostbite over time. The two panels depict results with and without instructions on the abstract structure of the game.
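For readers who want to see the shape of the comparison in the paragraph above, here is a minimal analysis sketch. The data layout (one list of timestamped scores per participant), the variable names, and the numbers shown are assumptions for illustration only; they are placeholders, not the study's data.

```python
# Minimal sketch of the instructed vs. uninstructed group comparison.
# Data layout and all values are illustrative placeholders, not the study's data.
from statistics import mean

# One score trajectory per participant: a list of (seconds_elapsed, score) samples
# collected over the 5-minute session.
instructed = {
    "p01": [(0, 0), (60, 1200), (120, 4100)],   # placeholder trajectory
    "p02": [(0, 0), (90, 800), (300, 2300)],
}
uninstructed = {
    "p03": [(0, 0), (150, 400), (300, 1500)],
    "p04": [(0, 0), (200, 600), (300, 1100)],
}

def final_score(trajectory):
    """Last recorded score for one participant."""
    return trajectory[-1][1]

def summarize(group):
    """Mean and maximum of final scores across a group."""
    finals = [final_score(t) for t in group.values()]
    return mean(finals), max(finals)

for label, group in [("instructed", instructed), ("uninstructed", uninstructed)]:
    group_mean, group_best = summarize(group)
    print(f"{label}: mean final score = {group_mean:.0f}, best = {group_best}")
```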
Learning from others also does more than simply “speed up” learning about the world. Human knowledge accumulates across generations, permitting each new generation to learn in one lifetime what no generation before it could (Boyd et al. Reference Boyd, Richerson and Henrich2011; Tomasello Reference Tomasello1999). We hypothesize that language – and particularly its flexibility to refer to abstract concepts – is key to the faithful transmission of knowledge, both between individuals and across generations. Human intelligence is so difficult to match because we stand on the shoulders of giants. AIs need to “ratchet” up their own learning by communicating knowledge efficiently within and across generations. Rather than being subject to a top-down hive mind, intelligent agents should retain their individual intellectual autonomy and innovate new solutions to problems based on their own experience and what they have learned from others. The important discoveries of a single AI could then be shared, and we believe language is the key to this kind of cultural transmission. Cultural knowledge could then accumulate within both AI and human networks.
In sum, learning from other people should be a high priority for AI researchers. Lake et al. hope to set priorities for future research in AI, but fail to acknowledge the importance of learning from language and social cognition. This is a mistake: The more complex the task is, the more learning to perform like a human involves learning from other people.