Computer Vision News
Underrated Papers
by Mert Sabuncu

In the fast-moving field of machine learning and computer vision, as we tackle new challenges (such as robustness, fairness, interpretability, calibration, and human-in-the-loop AI), we often suffer from a recency bias: our approaches are constrained by recently popular techniques and modeling choices. In this work, led by Alan Wang, we revisited a classic paradigm, the Nadaraya-Watson (NW) estimator, and married it with modern neural network architectures. To be sure, this has been done by others, so the combination itself is not new. What we show here is that this so-called NW head offers unique advantages for interpretability and calibration, and it further promises a new approach to other important challenges we face, such as robustness and fairness. This is why I advised Ralph to start this new series of “underrated papers” with Alan’s work: it effectively demonstrates the potential of reevaluating traditional methods from a fresh perspective.

Today we are starting a new section of Computer Vision News. How many published papers are underrated? Did any of your past papers deserve a better fate? This section will give a second life to neglected papers that are worth a second look. Mert Sabuncu, Professor at Cornell University, was kind enough to be the first professor to play the game. Mert thinks this paper is underrated and has asked first author Alan Wang to tell us about the work. But first, Mert tells us why he thinks it deserves a second look! Let’s also point out that Alan, a PhD candidate at Cornell, is on the academic job market, and he’s a great catch!

Alan Wang
A Flexible Nadaraya-Watson Head Can Offer Explainable and Calibrated Classification
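For readers unfamiliar with the classic estimator Mert mentions: the Nadaraya-Watson estimator predicts a query's label as a kernel-weighted average of the labels in a support set, with weights determined by similarity in feature space. Below is a minimal NumPy sketch of that idea as a classification head; the function name, the Gaussian-style distance kernel, and the temperature parameter are our own illustrative choices, not the paper's actual implementation:

```python
import numpy as np

def nw_head(query_feat, support_feats, support_labels, num_classes, temperature=1.0):
    """Nadaraya-Watson prediction: a similarity-weighted average of
    support-set labels. (Illustrative sketch, not the paper's code.)"""
    # Negative squared Euclidean distance serves as the kernel logit.
    dists = np.sum((support_feats - query_feat) ** 2, axis=1)
    logits = -dists / temperature
    # Softmax over the support set -> kernel weights that sum to 1.
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()
    # One-hot encode the support labels and take their weighted average,
    # yielding a probability distribution over classes.
    one_hot = np.eye(num_classes)[support_labels]
    return weights @ one_hot

# Toy usage: a query lying near two class-0 support points.
feats = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
labels = np.array([0, 0, 1])
probs = nw_head(np.array([0.05, 0.0]), feats, labels, num_classes=2)
```

Because every prediction is an explicit weighted average over support examples, the weights themselves point at the training points responsible for a decision, which is the source of the interpretability Mert highlights. In the paper's setting, the features would come from a neural network backbone trained end to end.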