People

Guillermo Gallego

Principal Investigator

Computer Vision

TU Berlin

Email: guillermo.gallego@tu-berlin.de

Photo: SCIoI


Guillermo Gallego is Professor of Robotic Interactive Perception at Technische Universität Berlin and the Einstein Center Digital Future (ECDF). At SCIoI, he works on computer vision and robotics, focusing on robot perception and on optimization methods for interdisciplinary imaging and control problems. Inspired by the human visual system, he works toward improving the perception systems of artificial agents, endowing them with the intelligence to transform raw sensor data into knowledge and to act autonomously in changing environments.


Projects

Guillermo Gallego is a member of Project 36.


Publications

Reinold, T., Ghosh, S., & Gallego, G. (2025). Combined Physics and Event Camera Simulator for Slip Detection. WACV Workshop on Event-Based Vision in the Era of Generative AI.
Wischow, M., Irmisch, P., Boerner, A., & Gallego, G. (2024). Real-Time Noise Source Estimation of a Camera System from an Image and Metadata. Advanced Intelligent Systems, 2300479, 1–15. https://doi.org/10.1002/aisy.202300479
Shiba, S., Klose, Y., Aoki, Y., & Gallego, G. (2024). Secrets of Event-based Optical Flow, Depth and Ego-motion Estimation by Contrast Maximization. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1–18. https://doi.org/10.1109/TPAMI.2024.3396116
Rodriguez-Gomez, J. P., Martinez-de Dios, J. R., Ollero, A., & Gallego, G. (2024). On the Benefits of Visual Stabilization for Frame- and Event-based Perception. IEEE Robotics and Automation Letters, 9(10), 8802–8809. https://doi.org/10.1109/LRA.2024.3450290
Hamann, F., Ghosh, S., Martínez, I. J., Hart, T., Kacelnik, A., & Gallego, G. (2024). Fourier-based Action Recognition for Wildlife Behavior Quantification with Event Cameras. Advanced Intelligent Systems, 2400353. https://doi.org/10.1002/aisy.202400353
Guo, S., & Gallego, G. (2024). CMax-SLAM: Event-based Rotational-Motion Bundle Adjustment and SLAM System using Contrast Maximization. IEEE Transactions on Robotics. https://doi.org/10.1109/TRO.2024.3378443
Wang, Z., Hamann, F., Chaney, K., Jiang, W., Gallego, G., & Daniilidis, K. (2024). Event-based Continuous Color Video Decompression from Single Frames. submitted for review. https://doi.org/10.48550/arXiv.2312.00113
Niu, J., Zhong, S., Lu, X., Shen, S., Gallego, G., & Zhou, Y. (2024). ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras. arXiv. https://doi.org/10.48550/arXiv.2410.09374
Hamann, F., Gehrig, D., Febryanto, F., Daniilidis, K., & Gallego, G. (2024). Event-based Tracking of Any Point with Motion-Robust Correlation Features. arXiv. https://doi.org/10.48550/arXiv.2412.00133
Ren, Z., Liao, B., Kong, D., Li, J., Liu, P., Kneip, L., Gallego, G., & Zhou, Y. (2024). Motion and Structure from Event-based Normal Flow. European Conference on Computer Vision (ECCV), 108–125. https://doi.org/10.1007/978-3-031-72992-8_7
Hamann, F., Ghosh, S., Martínez, I. J., Hart, T., Kacelnik, A., & Gallego, G. (2024). Low-power, Continuous Remote Behavioral Localization with Event Cameras. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 18612–18621. https://doi.org/10.1109/CVPR52733.2024.01761
Hamann, F., Wang, Z., Asmanis, I., Chaney, K., Gallego, G., & Daniilidis, K. (2024). Motion-prior Contrast Maximization for Dense Continuous-Time Motion Estimation. European Conference on Computer Vision (ECCV), 18–37. https://doi.org/10.1007/978-3-031-72646-0_2
Hamann, F., Li, H., Mieske, P., Lewejohann, L., & Gallego, G. (2024). MouseSIS: A Frames-and-Events Dataset for Space-Time Instance Segmentation of Mice. European Conference on Computer Vision Workshops (ECCVW). https://doi.org/10.48550/arXiv.2409.03358
Guo, S., & Gallego, G. (2024). Event-based Mosaicing Bundle Adjustment. European Conference on Computer Vision (ECCV), 479–496. https://doi.org/10.1007/978-3-031-72624-8_27
Ghosh, S., Cavinato, V., & Gallego, G. (2024). ES-PTAM: Event-based Stereo Parallel Tracking and Mapping. European Conference on Computer Vision Workshops (ECCVW). https://doi.org/10.48550/arXiv.2408.15605
Shiba, S., Hamann, F., Aoki, Y., & Gallego, G. (2023). Event-based Background-Oriented Schlieren. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1–16. https://doi.org/10.1109/TPAMI.2023.3328188
Zhang, Z., Yezzi, A., & Gallego, G. (2022). Formulating Event-based Image Reconstruction as a Linear Inverse Problem with Deep Regularization using Optical Flow. IEEE Transactions on Pattern Analysis and Machine Intelligence. https://doi.org/10.1109/TPAMI.2022.3230727
Shiba, S., Aoki, Y., & Gallego, G. (2022). Fast Event-based Optical Flow Estimation by Triplet Matching. IEEE Signal Processing Letters, 29, 2712–2716. https://doi.org/10.1109/LSP.2023.3234800
Shiba, S., Aoki, Y., & Gallego, G. (2022). A Fast Geometric Regularizer to Mitigate Event Collapse in the Contrast Maximization Framework. Advanced Intelligent Systems, 11. https://doi.org/10.1002/aisy.202200251
Hamann, F., & Gallego, G. (2022). Stereo Co-capture System for Recording and Tracking Fish with Frame- and Event Cameras. International Conference on Pattern Recognition (ICPR), Workshop on Visual observation and analysis of Vertebrate And Insect Behavior. https://arxiv.org/abs/2207.07332
Ghosh, S., & Gallego, G. (2022). Event-based Stereo Depth Estimation from Ego-motion using Ray Density Fusion. ECCVW Ego4D 2022. https://arxiv.org/abs/2210.08927
Shiba, S., Aoki, Y., & Gallego, G. (2022). Secrets of Event-Based Optical Flow. European Conference on Computer Vision (ECCV), 628–645. https://doi.org/10.1007/978-3-031-19797-0_36
Gu, C., Learned-Miller, E., Gallego, G., Sheldon, D., & Bideau, P. (2021). The Spatio-Temporal Poisson Point Process: A Simple Model for the Alignment of Event Camera Data. International Conference on Computer Vision (ICCV), 13495–13504. https://doi.org/10.1109/ICCV48922.2021.01324

Awards

Outstanding Associate Editor (IEEE Robotics and Automation Letters, 2021)

Press

Der Tagesspiegel, November 2020: "Roboter mit Sinn für Orientierung" (Robots with a sense of orientation)
