Congratulations to Sundong and Minwoo
[Title]
ESC: Erasing Space Concept for Knowledge Deletion
[Conference]
IEEE/CVF CVPR 2025 (Highlight, acceptance rate = 2.98%)
[Authors]
T.-Y. Lee†, Sundong Park†, Minwoo Jeon†, Hyoseok Hwang*, G.-M. Park*
[Code]
Link: https://github.com/KU-VGI/ESC
[Summary]
To address the growing demand for complete data removal from deep learning models, this paper introduces a new task called “Knowledge Deletion” (KD) and a corresponding metric, the “Knowledge Retention” (KR) score. We propose two novel methods: “Erasing Space Concept” (ESC), a fast, training-free approach that eliminates targeted knowledge by manipulating feature activations, and “ESC with Training” (ESC-T), which uses a learnable mask to better balance knowledge removal and model performance. Extensive experiments confirm that our methods are faster than existing unlearning baselines, achieve state-of-the-art results, and are broadly applicable.
[Key Figures]
An overview of our methods. (a) We start by extracting the principal directions (U) of the embedding features from the forgetting data using SVD. (b) ESC erases crucial directions from U, yielding the pruned principal directions U_P. (c) ESC-T enhances ESC by incorporating our forgetting loss, L_PCE, refining the erasure process to eliminate only the important elements rather than entire directions. ESC-T yields the refined principal directions U_R, which improve the trade-off between forgetting and preservation. (d) During the inference phase, we project the extracted features onto the subspace formed by U_P or U_R, depending on the method.
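The pipeline in (a), (b), and (d) can be illustrated with a minimal NumPy sketch. This is a conceptual toy, not the authors' implementation (see the GitHub link above for the official code): the function name, the number of erased directions k, and the toy feature matrices are all assumptions made for illustration.

```python
import numpy as np

def erase_space_concept(features_forget, features_test, k=4):
    """Toy sketch of ESC-style erasure and projection (not the official code).

    features_forget: (N, d) embedding features of the forgetting data
    features_test:   (M, d) features extracted at inference time
    k: number of leading principal directions to erase (hypothetical knob)
    """
    # (a) Principal directions U of the forgetting features via SVD.
    #     Rows are samples, so the right singular vectors span feature space.
    _, _, Vt = np.linalg.svd(features_forget, full_matrices=False)
    U = Vt.T                      # (d, r) principal directions, importance-ordered

    # (b) ESC: prune the k most important directions, keeping U_P.
    U_P = U[:, k:]

    # (d) Inference: project features onto the subspace spanned by U_P,
    #     removing their components along the erased directions.
    return features_test @ U_P @ U_P.T

rng = np.random.default_rng(0)
Xf = rng.normal(size=(100, 16))   # toy forgetting-data features
Xt = rng.normal(size=(10, 16))    # toy inference-time features
projected = erase_space_concept(Xf, Xt, k=4)
print(projected.shape)            # (10, 16)
```

Because U_P @ U_P.T is an orthogonal projection matrix, applying the step in (d) twice leaves the features unchanged; ESC-T replaces the hard pruning in (b) with a learnable mask optimized by L_PCE.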
[Key Results]
Accuracy, MIA, and KR performance in the KD setting are evaluated using CIFAR-10 with AllCNN, CIFAR-100 with ResNet-18, and Tiny-ImageNet with ViT. D_r indicates which methods use the remaining data during the unlearning process. The table presents the mean and standard deviation (mean ± std) across three trials. The best value is highlighted in bold, while the second-best is underlined. Additionally, the best result among methods that do not use D_r is colored in blue.