Talk Title: Fully Test-time Adaptation in the Era of Foundation Models
Abstract: A model must adapt itself to generalize to new and different data during testing. In this setting of fully test-time adaptation, the model has only the test data and its own parameters. We propose to adapt by test entropy minimization (tent): we optimize the model for confidence as measured by the entropy of its predictions. Tent estimates normalization statistics and optimizes channel-wise affine transformations to update online on each batch. We also explore the opportunity of dynamic defenses, which adapt both the model and the input during testing, by defensive entropy minimization (dent). Dent alters testing, but not training, for compatibility with existing models and train-time defenses. In addition, we argue for optimizing as much as possible on the target data, since target accuracy is the goal. The proposed on-target adaptation framework achieves a better speed-accuracy trade-off, which matters all the more in the era of foundation models.
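The two moves the abstract attributes to tent, re-estimating normalization statistics on the test batch and minimizing prediction entropy over only the channel-wise affine parameters, can be illustrated with a small self-contained sketch. This is not the authors' implementation: the toy model (a frozen linear classifier over normalized features) and the numerical gradient are assumptions made to keep the example dependency-free; real tent code uses a deep network with batch normalization and automatic differentiation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a frozen linear classifier over C features, K classes,
# applied to a distribution-shifted test batch of size B.
C, K, B = 4, 3, 16
W = rng.normal(size=(C, K))                       # frozen classifier weights
x = rng.normal(loc=2.0, scale=3.0, size=(B, C))   # shifted test batch

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def entropy(params):
    g, b = params[:C], params[C:]
    # Step 1 (tent): re-estimate normalization statistics on the test batch
    # instead of reusing training-time statistics.
    mu, sigma = x.mean(axis=0), x.std(axis=0) + 1e-5
    h = (x - mu) / sigma * g + b                  # normalize + channel-wise affine
    p = softmax(h @ W)
    return -(p * np.log(p + 1e-12)).sum(axis=1).mean()

# Step 2 (tent): minimize prediction entropy w.r.t. the affine
# parameters (gamma, beta) only; all other weights stay frozen.
# A central-difference gradient stands in for autograd here.
params = np.concatenate([np.ones(C), np.zeros(C)])  # gamma=1, beta=0
lr, eps = 0.1, 1e-5
before = entropy(params)
for _ in range(20):
    grad = np.zeros_like(params)
    for i in range(params.size):
        d = np.zeros_like(params)
        d[i] = eps
        grad[i] = (entropy(params + d) - entropy(params - d)) / (2 * eps)
    params -= lr * grad
after = entropy(params)
print(before, after)
```

Because only the few affine parameters are updated and the update needs just one forward/backward pass per batch, this kind of adaptation stays cheap, which is the speed-accuracy consideration the abstract raises.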
Speaker Bio: 王德泉 received his bachelor's degree from Fudan University and his Ph.D. from UC Berkeley, advised by Prof. Trevor Darrell. His research centers on computer vision and machine learning, with over 2,900 Google Scholar citations in the past five years. His doctoral work focused on the question of how to build intelligent agents that generalize, that is, how to design a new generation of adaptive, dynamic, and universal visual perception systems.