Psychological Development and Education ›› 2019, Vol. 35 ›› Issue (4): 504-512.doi: 10.16187/j.cnki.issn1001-4918.2019.04.14


The Characteristics and Mechanisms of Contextual Cueing Effect in Real-world Scenes

LIU Xingze1, CHEN Minghui1, CUI Jiawei2, ZHAO Guang1   

  1. Brain and Cognitive Neuroscience Research Center, Liaoning Normal University, Dalian 116029;
    2. College of Psychology, Liaoning Normal University, Dalian 116029
  • Published: 2019-08-28

Abstract: The discovery of the contextual cueing effect revealed that a stable configuration among objects can be exploited to improve the efficiency of visual search. This review compares the implicit learning and attentional guidance underlying the classical contextual cueing effect with relevant theories of visual search in real-world scenes, and summarizes the nature of the learning, the mechanisms involved, and the paradigms used to study contextual cueing in real-world scenes. Further, the incoming information contained in real-world scenes can be divided, according to the cognitive processing it engages, into low-level saliency and semantic information. Although current research has addressed the basic mechanisms of contextual cueing along these two dimensions in real-world scenes, several questions remain to be examined in future research, such as the specific categories of information that contribute to the learning of the contextual cueing effect and the time course over which each operates.

Key words: visual search, contextual cueing effect, real-world scenes, low-level saliency, semantic information

CLC Number: B844