
Latent Semantic Analysis (LSA), a disembodied learning machine, acquires human word meaning vicariously from language alone

Published online by Cambridge University Press:  01 August 1999

Thomas K. Landauer
Affiliation:
Institute of Cognitive Science, University of Colorado at Boulder, Boulder, CO 80309 landauer@psych.colorado.edu psych-www.colorado.edu/faculty/landauer.html

Abstract


The hypothesis that perceptual mechanisms could have more representational and logical power than is usually assumed is interesting and provocative, especially with regard to brain evolution. However, the importance of embodiment and grounding is exaggerated, and the implication that there is no highly abstract representation at all, and that human-like knowledge cannot be learned or represented without human bodies, is very doubtful. A machine-learning model, Latent Semantic Analysis (LSA), that closely mimics human word and passage meaning relations is offered as a counterexample.
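The core LSA mechanism the abstract refers to, learning word-meaning relations from language alone, can be sketched as a truncated singular value decomposition of a term-passage co-occurrence matrix. The toy corpus, the dimensionality k, and all variable names below are illustrative assumptions, not material from the commentary; Landauer's actual model was trained on large text corpora with additional weighting steps.

```python
import numpy as np

# Toy corpus of short "passages" (illustrative assumption, not from the paper).
docs = [
    "human computer interaction",
    "human machine interface",
    "graph tree minors",
    "tree graph survey",
]

# Build a raw term-passage count matrix (LSA proper also applies
# entropy-based weighting before the decomposition).
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}
X = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        X[index[w], j] += 1

# Truncated SVD: keep k latent dimensions (k is a free parameter).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
word_vecs = U[:, :k] * s[:k]  # word representations in the latent space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "computer" and "machine" never co-occur in the same passage, yet both
# occur with "human", so the latent space places them close together --
# the kind of indirect, vicarious learning the abstract describes.
sim_related = cosine(word_vecs[index["computer"]], word_vecs[index["machine"]])
sim_unrelated = cosine(word_vecs[index["computer"]], word_vecs[index["graph"]])
```

Here `sim_related` comes out high and `sim_unrelated` near zero, illustrating how passage co-occurrence statistics alone can induce human-like similarity relations without any perceptual grounding.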

Type
Open Peer Commentary
Copyright
© 1999 Cambridge University Press