CVPR Daily - Thursday
Highlight - Award Candidate

Data-driven Feature Tracking for Event Cameras

Event cameras differ from standard cameras in how they measure light at the image sensor. Instead of capturing frames continuously, an event camera outputs a binary signal only when there is a significant change in intensity above a predetermined threshold. Consequently, no information is obtained when an event camera is pointed at a static scene, since there are no intensity changes. However, if an object moves within the scene, the pixel locations experiencing intensity changes generate output events, primarily along the edges of the object.

Event cameras draw inspiration from biological receptors in the human eye, which also focus on detecting changes in the scene. Our brain often disregards motion, but if it deems the motion relevant or significant, it uses that information downstream. Similarly, event cameras process only changes and ignore static scenes to avoid processing redundant information.

In this paper, Nico and Mathias explore using event cameras in combination with frame cameras to achieve robust feature tracking in sequential images. Feature tracking is crucial for various applications. The Robotics and Perception Group focuses on SLAM algorithms and pose estimation using cameras placed on robots. The researchers also work with drones, involving …

Nico Messikommer and Mathias Gehrig are PhD students in the lab of Davide Scaramuzza in the Robotics and Perception Group at the University of Zurich. Carter Fang was a master's student at ETH and is now a Research Engineer at Waabi, working on autonomous driving for trucks. All three are co-authors of a fantastic paper that is a candidate to win a significant award at CVPR later this year.
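The thresholded-change mechanism described above can be illustrated with a short simulation. This is a minimal sketch, not the sensor's actual asynchronous circuitry or the authors' method: it compares two frames and emits one event per pixel whose log-intensity change exceeds a contrast threshold (the function name and threshold value are illustrative assumptions).

```python
import numpy as np

def generate_events(prev_frame, curr_frame, threshold=0.2):
    """Simulate event generation from two intensity frames.

    Emits one event per pixel whose log-intensity change exceeds
    the contrast threshold. Returns a list of (x, y, polarity)
    tuples, where polarity is +1 for a brightness increase and
    -1 for a decrease. Illustrative only: a real event camera
    works asynchronously per pixel and timestamps each event.
    """
    eps = 1e-6  # avoid log(0) on dark pixels
    diff = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    return [(int(x), int(y), 1 if diff[y, x] > 0 else -1)
            for x, y in zip(xs, ys)]

# A static scene produces no events at all ...
static = np.full((4, 4), 0.5)
print(generate_events(static, static))  # → []

# ... while a local brightness change produces events
# only at the pixels that changed.
moved = static.copy()
moved[1:3, 2] = 1.0  # an "object" brightens two pixels
print(len(generate_events(static, moved)))  # → 2
```

Note that the unchanged pixels contribute nothing, which is exactly why a static scene yields no output and a moving object yields events concentrated on its edges.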