This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Defending Against Membership Inference Attacks on Iteratively Pruned Deep Neural Networks
Citations: 2 · Authors: 7 · Year: 2025
Abstract
Model pruning is a technique for compressing deep learning models, and pruning a model iteratively achieves better compression with lower utility loss. However, our analysis reveals that iterative pruning significantly increases model memorization, making the pruned models more vulnerable to membership inference attacks (MIAs). Unfortunately, the vast majority of existing defenses against MIAs are designed for original, unpruned models. In this paper, we propose a new framework, WEMEM, to weaken memorization in the iterative pruning process. Specifically, our analysis identifies two important factors that increase memorization in iterative pruning, namely data reuse and inherent memorability. We consider the individual and combined impacts of both factors, yielding three scenarios that lead to increased memorization in iteratively pruned models. We design three defense primitives based on these factors' characteristics and, by combining these primitives, propose methods tailored to each scenario to weaken memorization effectively. Comprehensive experiments under ten adaptive MIAs demonstrate the effectiveness of the proposed defenses. Moreover, our defenses outperform five existing defenses in terms of privacy-utility tradeoff and efficiency. Additionally, we enhance the proposed defenses to adjust their settings automatically for optimal defense, improving their practicability.
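For context on the pruning setting the abstract refers to: iterative pruning typically alternates between removing a fraction of the weights and fine-tuning the surviving weights on the same training data, and this repeated exposure is the "data reuse" factor named above. The sketch below is a generic illustration of iterative magnitude pruning in PyTorch, not the paper's WEMEM defense; the model, data loader, pruning fraction, and training hyperparameters are placeholders.

```python
# Generic iterative magnitude-pruning sketch (PyTorch).
# NOT the paper's WEMEM method; all names and hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def iterative_prune(model, train_loader, rounds=5, amount=0.2,
                    epochs=2, lr=1e-3):
    """Alternate pruning and fine-tuning for `rounds` iterations.

    Each round removes `amount` of the weights that are still unpruned
    (PyTorch compounds successive pruning calls on the remaining weights)
    and then fine-tunes on the training set -- the data reuse that the
    abstract identifies as one source of increased memorization.
    """
    criterion = nn.CrossEntropyLoss()
    layers = [(m, "weight") for m in model.modules()
              if isinstance(m, (nn.Linear, nn.Conv2d))]
    for _ in range(rounds):
        # Prune the lowest-magnitude remaining weights in each layer.
        for module, name in layers:
            prune.l1_unstructured(module, name=name, amount=amount)
        # Fine-tune the surviving weights on the same training data.
        optimizer = torch.optim.SGD(model.parameters(), lr=lr)
        model.train()
        for _ in range(epochs):
            for inputs, targets in train_loader:
                optimizer.zero_grad()
                loss = criterion(model(inputs), targets)
                loss.backward()
                optimizer.step()
    # Make the accumulated pruning masks permanent.
    for module, name in layers:
        prune.remove(module, name)
    return model
```

Because `amount` is applied to the remaining weights each round, five rounds at 0.2 leave roughly 0.8^5 ≈ 33% of the original weights, which is the gradual schedule that makes iterative pruning gentler on utility than one-shot pruning.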
Related Works
Rethinking the Inception Architecture for Computer Vision
2016 · 30,575 citations
MobileNetV2: Inverted Residuals and Linear Bottlenecks
2018 · 24,777 citations
CBAM: Convolutional Block Attention Module
2018 · 21,650 citations
An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
2020 · 21,438 citations
Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification
2015 · 18,625 citations