Object-level scene descriptions and attention in visual search

Exploring visual scene analysis in humans and assessing its potential for use in computer vision systems

Research Unit: 1

Project Number: 1

Example Behavior:
Individual Intelligence

Disciplines:
Computer Vision
Machine Learning
Psychology

Expected Project Duration
2019 - 2024



This project investigates object-level search processes as a means of understanding visual scene analysis in humans, and assesses their potential for real-world scene interpretation and search tasks in computer vision systems. To carry out this research, we will study an ecologically valid setting that combines free viewing, real-world dynamic scenes, and visual exploration tasks that have to be performed simultaneously. This will provide novel insights into how human observers balance attentional processing under potentially conflicting task demands. To this end, we will combine eye-tracking experiments in humans with computational modelling of fixation sequences, with algorithm development, and with the design of computer vision systems that use foveated attention and camera movements akin to eye movements.
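As a purely illustrative sketch (not the project's actual model), the Python snippet below shows one way such an object-based selection process can be simulated: each object in a toy scene carries a scalar priority, fixation targets are sampled in proportion to those priorities, and a simple inhibition-of-return rule discourages immediate refixation. All object names, bounding boxes, priority values, and decay constants here are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Toy scene: each object has a bounding box (x, y, w, h) and a base priority.
# These values are arbitrary and serve only to make the example runnable.
objects = {
    "person":     {"box": (40, 30, 20, 60),  "priority": 1.0},
    "car":        {"box": (120, 60, 50, 30), "priority": 0.8},
    "sign":       {"box": (200, 20, 15, 25), "priority": 0.4},
    "background": {"box": (0, 0, 256, 128),  "priority": 0.1},
}

def simulate_scanpath(objects, n_fixations=8, ior_decay=0.5, recovery=0.1):
    """Sample a sequence of fixated objects and fixation locations.

    Selection probability is proportional to each object's current priority.
    After a fixation, that object's priority is multiplied by `ior_decay`
    (a crude stand-in for inhibition of return) and then recovers gradually
    towards its base value.
    """
    names = list(objects)
    base = np.array([objects[n]["priority"] for n in names], dtype=float)
    priority = base.copy()
    scanpath = []
    for _ in range(n_fixations):
        p = priority / priority.sum()
        idx = rng.choice(len(names), p=p)
        x, y, w, h = objects[names[idx]]["box"]
        # The fixation lands at a random point inside the chosen object.
        fx, fy = x + rng.uniform(0, w), y + rng.uniform(0, h)
        scanpath.append((names[idx], round(fx, 1), round(fy, 1)))
        priority[idx] *= ior_decay                # inhibition of return
        priority += recovery * (base - priority)  # slow recovery
    return scanpath

for fixation in simulate_scanpath(objects):
    print(fixation)

In the project itself, object segmentations, saliency in dynamic scenes, and fitted decision parameters take the place of these hand-set boxes and priorities (see, e.g., the ScanDy and PLOS Computational Biology papers listed below).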


Wirth, L., Shurygina, O., Rolfs, M., & Ohl, S. (2022). Target-location rather than target-object specific saccadic selection in visual working memory. Perception / ECVP 2022.
Shurygina, O., Pooresmaeili, A., & Rolfs, M. (2021). Pre-saccadic attention spreads to stimuli forming a perceptual group with the saccade target. Cortex, 140, 179–198. https://doi.org/10.1016/j.cortex.2021.03.020
Shurygina, O., & Rolfs, M. (2021). Visual sensitivity and reaction time measures show no evidence for purely exogenous object-based attention. Journal of Vision / VSS 2021, 2571–2571. https://doi.org/10.1167/jov.21.9.2571
Shurygina, O., & Rolfs, M. (2022). Saccade kinematics reflect object-based attention in realistic but not in simplified stimuli. Perception / ECVP 2022. https://osf.io/x7wpj
Shurygina, O., & Rolfs, M. (2022). Eye movement characteristics reflect object-based attention. Journal of Vision / VSS 2022, 3481–3481. https://doi.org/10.1167/jov.22.14.3481
Schmittwilken, L., Matic, M., Maertens, M., & Vincent, J. (2021). BRENCH: An open-source framework for b(r)enchmarking brightness models. Journal of Vision. https://doi.org/10.1167/jov.22.3.36
Schmittwilken, L., & Maertens, M. (2022). Medium spatial frequencies mask edges most effectively. Journal of Vision. https://doi.org/10.13140/RG.2.2.11382.06726
Schmittwilken, L., & Maertens, M. (2022). Fixational eye movements enable robust edge detection. Journal of Vision. https://doi.org/10.1167/jov.22.8.5
Roth, N., Rolfs, M., & Obermayer, K. (2022). Scanpath prediction in dynamic real-world scenes based on object-based selection. Journal of Vision / VSS 2022. https://doi.org/10.1167/jov.22.14.4217
Roth, N., Rolfs, M., & Obermayer, K. (2022). ScanDy: Simulating Realistic Human Scanpaths in Dynamic Real-World Scenes. MODVIS 2022. https://docs.lib.purdue.edu/modvis/2022/session01/6/
Roth, N., Rolfs, M., Hellwich, O., & Obermayer, K. (2023). Objects guide human gaze behavior in dynamic real-world scenes. PLOS Computational Biology. https://doi.org/10.1371/journal.pcbi.1011512
Roth, N., Bideau, P., Hellwich, O., Rolfs, M., & Obermayer, K. (2021). Modeling the influence of objects on saccadic decisions in dynamic real-world scenes. Perception / ECVP 2021. https://journals.sagepub.com/doi/full/10.1177/03010066211059887
Roth, N., McLaughlin, J., Obermayer, K., & Rolfs, M. (2023). Looking for potential action: Differences in exploration behavior of static and (potentially) dynamic scenes. Journal of Vision. https://doi.org/10.1167/jov.23.9.5293
Roth, N., Bideau, P., Hellwich, O., Rolfs, M., & Obermayer, K. (2021). A modular framework for object-based saccadic decisions in dynamic scenes. CVPR EPIC Workshop / arXiv:2106.06073. https://doi.org/10.48550/arXiv.2106.06073
Muscinelli, F., Roth, N., Shurygina, O., Obermayer, K., & Rolfs, M. (2022). Object-based spread of attention affects fixation duration during free viewing. Perception / ECVP 2022.
