
Scanning movements during haptic search: similarity with fixations during visual search

Published online by Cambridge University Press:  24 May 2017

Achille Pasqualotto*
Affiliation:
Faculty of Arts and Social Sciences, Sabanci University, Tuzla 34956, Istanbul, Turkey. achille@sabanciuniv.edu

Abstract

Finding relevant objects through vision, or visual search, is a crucial function that has received considerable attention in the literature. After decades of research, data suggest that visual fixations are more crucial to understanding how visual search works than are the attributes of stimuli. This idea receives further support from the field of haptic search.

Type: Open Peer Commentary
Copyright: © Cambridge University Press 2017

Finding objects in a complex environment is a crucial skill for the survival of humans and animals. Finding suitable fruit in a cluttered forest, detecting predators camouflaged in the savanna, or simply locating a stapler on a cluttered office desk all illustrate the importance of the ability to find objects in an environment containing much irrelevant information (distractors). In most cases, and for humans in particular, this function is carried out by the visual modality, namely through visual search. Not surprisingly, visual search has been broadly investigated in humans and animals (Hickey et al. 2010; Proulx et al. 2014; Tomonaga 2007; Young & Hulleman 2013). Over the past several decades, theories of visual search have been constructed from studies focusing on the number and type of items used as distractors in visual search tasks (Duncan & Humphreys 1989; He & Nakayama 1992; Treisman 1982). In other words, the characteristics of the items were considered the central feature of visual search.

However, towards the end of the 1990s, empirical data began to suggest that another feature, rather than the characteristics of the items, might better explain how visual search works: the number of fixations produced by participants during visual search tasks (Wolfe 1998b; Wolfe et al. 2010b; Zelinsky 1996). The recent and mounting evidence in favour of a theory based on visual fixations prompted Hulleman & Olivers (H&O) to propose a novel framework for interpreting past results and designing future experiments. The authors conducted a meta-analysis showing that the number of fixations accurately predicts search times, and argued that a theory based on eye fixations takes into account the physiology and limits of the visual system. This led them to suggest that the number of fixations explains how visual search works better than the number and type of items do. In fact, fixations are a constant factor ("fixations are fixations"), whereas items can be objects, faces, letters, numbers, and so forth, making experimental results highly task-dependent. Studying fixation patterns would therefore facilitate the comparison of results across studies, and perhaps lead to a more comprehensive understanding of the mechanisms underlying visual search.

This innovative approach is, however, limited to visual search. In the real world, object finding can depend upon, and be performed through, other modalities such as the haptic modality; for example, finding keys in a pocket, or finding a torch during a blackout. The haptic modality comprises touch as well as proprioceptive and kinaesthetic cues (Gibson 1962; Heller 1984), and is very well suited to conveying information about the shapes and positions of external objects, plus "extra" information such as temperature (Lacreuse & Fragaszy 1997; Lederman & Klatzky 1987; Sann & Streri 2007). Interestingly, haptics and vision are broadly interconnected at the behavioural level (Ballesteros et al. 1998; Grabowecky et al. 2011; Newell et al. 2005; Pasqualotto et al. 2005; 2013a), and the two sensory modalities are strongly interlinked in the brain (Lacey et al. 2009; Pasqualotto et al. 2013b; Pietrini et al. 2004). It therefore seems reasonable to assume that, if visual search is based on visual fixations, haptic search might be based on hand and finger scanning movements.

Haptic search has been studied far less than visual search; nevertheless, both early and recent evidence suggests that scanning movements are as pivotal for haptics (Lederman & Klatzky 1987; Overvliet et al. 2007; Plaisier et al. 2008; 2009; Van Polanen et al. 2011) as fixations are for vision, supporting the assumption that the two modalities are strongly interlinked (see the previous paragraph). For example, in a haptic search task in which participants had to find a target object among several distractors through active touch, haptic scanning patterns resembled the patterns of eye fixations observed in visual search (Overvliet et al. 2007). A similar study reported that the strategy of haptic exploration (i.e., using one finger, one hand, or two hands), rather than the number and type of distractors, was the crucial factor influencing haptic search performance (Overvliet et al. 2008). The same study also reported systematic "zig-zag" haptic scanning patterns across different search conditions (e.g., different target objects), suggesting that scanning movements reflect a fundamental feature of haptic search. Furthermore, another study found that the "pop-out" effect well documented in the vision literature (e.g., detecting a red object amongst blue objects) also occurs in haptics, and that haptic scanning movements were the best predictor of haptic search performance (Plaisier et al. 2008). The similarity between visual and haptic search is consistent with theories such as neural reuse (Anderson 2010) and the metamodal brain (Pascual-Leone & Hamilton 2001), both of which posit substantial overlap among the brain areas processing input from different sensory modalities (Pasqualotto et al. 2016; Uesaki & Ashida 2015).

Although the novel approach to understanding the mechanisms underlying visual search proposed in the target article is supported by evidence from haptic search, the number of studies on haptic search remains very limited compared to visual search, and more research is needed to confirm these initial findings. In particular, it is critical to directly compare the patterns of visual fixations and haptic scanning movements arising within the same experimental setup, in order to achieve a more holistic understanding of visual search as well as of object search through other sensory modalities.

References

Anderson, M. L. (2010) Neural reuse: A fundamental organizational principle of the brain. Behavioral and Brain Sciences 33:245–66.
Ballesteros, S., Millar, S. & Reales, J. M. (1998) Symmetry in haptic and in visual shape perception. Perception and Psychophysics 60:389–404.
Duncan, J. & Humphreys, G. W. (1989) Visual search and stimulus similarity. Psychological Review 96:433–58. doi: 10.1037/0033-295X.96.3.433.
Gibson, J. J. (1962) Observations on active touch. Psychological Review 69:477–91.
Grabowecky, M., List, A., Iordanescu, L. & Suzuki, S. (2011) Haptic influence on visual search. i-Perception 2:824.
He, Z. J. & Nakayama, K. (1992) Surfaces versus features in visual search. Nature 359:231–33.
Heller, M. A. (1984) Active and passive touch: The influence of exploration time on form recognition. Journal of General Psychology 110:243–49.
Hickey, C., Chelazzi, L. & Theeuwes, J. (2010) Reward has a larger impact on visual search in people with reward-seeking personalities. Journal of Vision 10:255.
Lacey, S., Tal, N., Amedi, A. & Sathian, K. (2009) A putative model of multisensory object representation. Brain Topography 21:269–74.
Lacreuse, A. & Fragaszy, D. M. (1997) Manual exploratory procedures and asymmetries for a haptic search task: A comparison between capuchins (Cebus apella) and humans. Laterality: Asymmetries of Body, Brain and Cognition 2:247–66.
Lederman, S. J. & Klatzky, R. L. (1987) Hand movements: A window into haptic object recognition. Cognitive Psychology 19:342–68.
Newell, F. N., Woods, A. T., Mernagh, M. & Bülthoff, H. H. (2005) Visual, haptic, and crossmodal recognition of scenes. Experimental Brain Research 161:233–42.
Overvliet, K. E., Smeets, J. B. & Brenner, E. (2007) Haptic search with finger movements: Using more fingers does not necessarily reduce search times. Experimental Brain Research 182:427–34.
Overvliet, K. E., Smeets, J. B. & Brenner, E. (2008) The use of proprioception and tactile information in haptic search. Acta Psychologica 129:83–90.
Pascual-Leone, A. & Hamilton, R. (2001) The metamodal organization of the brain. Progress in Brain Research 134:427–45.
Pasqualotto, A., Dumitru, M. L. & Myachykov, A. (2016) Multisensory integration: Brain, body, and world. Frontiers in Psychology 6:2046.
Pasqualotto, A., Finucane, C. M. & Newell, F. N. (2005) Visual and haptic representations of scenes are updated with observer movement. Experimental Brain Research 166:481–88.
Pasqualotto, A., Finucane, C. M. & Newell, F. N. (2013a) Ambient visual information confers a context-specific, long-term benefit on memory for haptic scenes. Cognition 128:363–79.
Pasqualotto, A., Proulx, M. J. & Sereno, M. I. (2013b) Visual experience and the establishment of tactile face-maps in the brain. Perception 39:54.
Pietrini, P., Furey, M. L., Ricciardi, E., Gobbini, M. I., Wu, W. H. C., Cohen, L. G., Guazzelli, M. & Haxby, J. V. (2004) Beyond sensory images: Object-based representation in the human ventral pathway. Proceedings of the National Academy of Sciences of the United States of America 101:5658–63.
Plaisier, M., Kuling, I., Tiest, W. M. B. & Kappers, A. M. (2009) The role of item fixation in haptic search. In: World Haptics 2009 – Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Salt Lake City, UT, March 2009, pp. 417–21. IEEE. doi: 10.1109/WHC.2009.4810798.
Plaisier, M. A., Tiest, W. M. B. & Kappers, A. M. (2008) Haptic pop-out in a hand sweep. Acta Psychologica 128:368–77.
Proulx, M. J., Parker, M. O., Tahir, Y. & Brennan, C. H. (2014) Parallel mechanisms for visual search in zebrafish. PLoS ONE 9:e111540.
Sann, C. & Streri, A. (2007) Perception of object shape and texture in human newborns: Evidence from cross-modal transfer tasks. Developmental Science 10:399–410.
Tomonaga, M. (2007) Visual search for orientation of faces by a chimpanzee (Pan troglodytes): Face-specific upright superiority and the role of facial configural properties. Primates 48:1–12.
Treisman, A. (1982) Perceptual grouping and attention in visual search for features and for objects. Journal of Experimental Psychology: Human Perception and Performance 8:194–214.
Uesaki, M. & Ashida, H. (2015) Optic-flow selective cortical sensory regions associated with self-reported states of vection. Frontiers in Psychology 6:775.
Van Polanen, V., Tiest, W. M. B. & Kappers, A. M. (2011) Movement strategies in a haptic search task. In: 2011 IEEE World Haptics Conference, pp. 275–80. IEEE.
Wolfe, J. M. (1998b) What can 1 million trials tell us about visual search? Psychological Science 9:33–39. doi: 10.1111/1467-9280.00006.
Wolfe, J. M., Palmer, E. M. & Horowitz, T. S. (2010b) Reaction time distributions constrain models of visual search. Vision Research 50:1304–11.
Young, A. H. & Hulleman, J. (2013) Eye movements reveal how task difficulty moulds visual search. Journal of Experimental Psychology: Human Perception and Performance 39:168–90.
Zelinsky, G. J. (1996) Using eye saccades to assess the selectivity of search movements. Vision Research 36:2177–87.