Leveraging Spiking Deep Neural Networks to Understand the Neural Mechanisms Underlying Selective Attention

Open Access
Authors
Publication date 04-2022
Journal Journal of Cognitive Neuroscience
Volume 34, Issue 4
Pages 655-674
Organisations
  • Faculty of Social and Behavioural Sciences (FMG) - Psychology Research Institute (PsyRes)
  • Faculty of Science (FNWI) - Swammerdam Institute for Life Sciences (SILS)
Abstract
Spatial attention enhances sensory processing of goal-relevant information and improves perceptual sensitivity. Yet, the specific neural mechanisms underlying the effects of spatial attention on performance are still contested. Here, we examine different attention mechanisms in spiking deep convolutional neural networks. We directly contrast the effects of precision (internal noise suppression) and two different gain modulation mechanisms on performance in a visual search task with complex real-world images. Unlike standard artificial neurons, biological neurons have saturating activation functions, permitting implementation of attentional gain as gain on a neuron's input or on its outgoing connection. We show that modulating the connection is most effective in selectively enhancing information processing by redistributing spiking activity and by introducing additional task-relevant information, as shown by representational similarity analyses. Precision produced only minor attentional effects on performance. Our results, which mirror empirical findings, show that it is possible to adjudicate between attention mechanisms using more biologically realistic models and natural stimuli.
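The distinction the abstract draws between gain on a neuron's input and gain on its outgoing connection only matters because the activation function saturates. A minimal sketch of that point, using a sigmoid as a stand-in saturating nonlinearity (the gain factor and input range here are illustrative assumptions, not values from the paper):

```python
import numpy as np

def sigmoid(x):
    """Saturating activation: bounded above by 1.0."""
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-2.0, 6.0, 9)  # hypothetical range of input drive
g = 2.0                        # hypothetical attentional gain factor

# Input gain: scale the drive before the saturating nonlinearity.
# The response can never exceed the activation ceiling.
input_gain = sigmoid(g * x)

# Connection (output) gain: scale the neuron's outgoing signal.
# The response can exceed the ceiling of the activation itself.
connection_gain = g * sigmoid(x)

print(input_gain.max())       # bounded by 1.0
print(connection_gain.max())  # approaches g for large inputs
```

For strongly driven units the two mechanisms therefore diverge: input gain compresses against the ceiling, whereas connection gain continues to amplify the transmitted signal, which is consistent with the paper's finding that modulating the outgoing connection redistributes activity more effectively.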
Document type Article
Language English
Related datasets ModelEvaluation, ModelAnalysis, ModelTraining
Published at https://doi.org/10.1162/jocn_a_01819
Other links https://www.scopus.com/pages/publications/85125883070