Issue compiled and edited by Hirohito M. Kondo, Jun-Ichiro Kawahara, Anouk M. van Loon and Brian C.J. Moore
Imagine you are walking across a big, busy square. Cars are crossing, pedestrians walk past and towards you, you hear people chatting and a taxi driver shouting, and you notice a beautifully coloured tree. Your brain is remarkably well equipped to rapidly convert such a mixture of sensory inputs – both visual and auditory – into coherent scenes, allowing you to perceive meaningful objects and guiding your navigation. This raises important questions about where and how 'scene analysis' is performed in the brain. Recent advances in both auditory and visual research suggest that the brain does not simply process incoming scene properties. Rather, top-down processes such as attention, expectation, and prior knowledge facilitate scene perception.
This special issue covers novel advances in scene-analysis research obtained using a combination of psychophysics, computational modelling, neuroimaging, and neurophysiology, and presents new empirical and theoretical approaches. Moreover, this issue bridges the gap between sensory modalities by addressing both auditory and visual scene analysis, and includes studies of different species and of individual differences in humans.
This issue is available to read online and to buy in print.
We offer discounts on bulk orders of the print version of this journal issue for educational use (£20 per copy for orders of 10 or more). Please contact our sales team for more information.