FAST: A pioneering unlearning framework integrating fine-tuning, adverse training, and student–teacher methods

Hoang Ngoc Tran, Nguyen Trung Nguyen, Nghi Vinh Nguyen, Ha Xuan Nguyen, Anh Duy Nguyen

Engineering Science and Technology, an International Journal (JESTECH), Volume 64, Article 101996, published 2025-02-17. DOI: 10.1016/j.jestch.2025.101996
Citations: 0
Abstract
In the evolving field of machine unlearning, the imperative to protect data privacy while maintaining essential information has become increasingly critical. This paper introduces a pioneering unlearning framework named FAST (Fine-tuning, Adverse Training, and Student–Teacher Methods). FAST is designed to selectively erase privacy-sensitive data while robustly safeguarding valuable information. It employs an innovative integration of the student–teacher architecture, utilizing targeted data and a sophisticated distribution structure refined by the Kullback–Leibler (KL) divergence within the loss function. Enhanced by fine-tuning and adverse training techniques, this integration amplifies beneficial knowledge from competent teachers and reduces ineffective knowledge from less capable ones, thereby enabling more effective and efficient unlearning processes. Furthermore, a new evaluation method called the Unlearning Effectiveness Score (UES) has been proposed for our unlearning model, aimed at providing a comprehensive metric to assess the effectiveness of the unlearning process. This approach rigorously evaluates the model using two sophisticated methods: the Zero Retrain Forgetting (ZRF) metric and Membership Inference Attacks (MIA). The UES is designed to not only measure the effectiveness of forgetting but also to ensure that the model is not easily susceptible to these attacks. These methods facilitate comprehensive comparisons with previous techniques such as bad teaching, Amnesiac, SCRUB, and straightforward fine-tuning. Our thorough experimental analysis, conducted across a variety of deep networks including MobileNet, ResNet, and VGG on the CIFAR and MUFAC datasets, confirms that FAST substantially surpasses existing approaches in both effectively forgetting targeted data and retaining necessary information, as demonstrated by its superior performance on UES metrics.
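As a rough illustration of the student–teacher mechanism the abstract describes, the sketch below shows a generic unlearning update in which a student model is pulled, via KL divergence, toward a competent teacher on retained data and toward an incompetent (e.g. randomly initialised) teacher on the data to be forgotten. This is a minimal sketch of that general idea only, not the authors' FAST implementation; the names (`competent_teacher`, `incompetent_teacher`, `retain_batch`, `forget_batch`) and the weighting factor `alpha` are assumptions introduced for illustration.

```python
# Illustrative sketch of a student-teacher unlearning step (NOT the FAST code).
# Assumed components: a student model, a "competent" teacher (original model),
# an "incompetent" teacher (e.g. randomly initialised), and batches of
# retained vs. to-be-forgotten data.
import torch
import torch.nn.functional as F

def distillation_kl(student_logits, teacher_logits, temperature=1.0):
    """KL divergence between the student's and teacher's softened class distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean")

def unlearning_step(student, competent_teacher, incompetent_teacher,
                    retain_batch, forget_batch, optimizer, alpha=1.0):
    """One update: imitate the competent teacher on retained data and the
    incompetent teacher on the forget set, so the latter knowledge is erased."""
    student.train()
    x_retain, _ = retain_batch
    x_forget, _ = forget_batch

    with torch.no_grad():
        t_retain = competent_teacher(x_retain)    # knowledge to preserve
        t_forget = incompetent_teacher(x_forget)  # "ignorance" to imitate

    loss = distillation_kl(student(x_retain), t_retain) \
         + alpha * distillation_kl(student(x_forget), t_forget)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In a complete pipeline along the lines the abstract sketches, such a step would be combined with fine-tuning on the retained data and adverse-training perturbations, and the resulting model would then be scored with metrics such as ZRF and MIA-based evaluations.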
About the journal:
Engineering Science and Technology, an International Journal (JESTECH) (formerly Technology) is a peer-reviewed quarterly engineering journal that publishes theoretical and experimental high-quality papers of permanent interest, not previously published elsewhere, in the field of engineering and applied science, with the aim of promoting the theory and practice of technology and engineering. In addition to peer-reviewed original research papers, the Editorial Board welcomes original research reports, state-of-the-art reviews, and communications in the broadly defined field of engineering science and technology.
The scope of JESTECH includes a wide spectrum of subjects including:
-Electrical/Electronics and Computer Engineering (Biomedical Engineering and Instrumentation; Coding, Cryptography, and Information Protection; Communications, Networks, Mobile Computing and Distributed Systems; Compilers and Operating Systems; Computer Architecture, Parallel Processing, and Dependability; Computer Vision and Robotics; Control Theory; Electromagnetic Waves, Microwave Techniques and Antennas; Embedded Systems; Integrated Circuits, VLSI Design, Testing, and CAD; Microelectromechanical Systems; Microelectronics, and Electronic Devices and Circuits; Power, Energy and Energy Conversion Systems; Signal, Image, and Speech Processing)
-Mechanical and Civil Engineering (Automotive Technologies; Biomechanics; Construction Materials; Design and Manufacturing; Dynamics and Control; Energy Generation, Utilization, Conversion, and Storage; Fluid Mechanics and Hydraulics; Heat and Mass Transfer; Micro-Nano Sciences; Renewable and Sustainable Energy Technologies; Robotics and Mechatronics; Solid Mechanics and Structure; Thermal Sciences)
-Metallurgical and Materials Engineering (Advanced Materials Science; Biomaterials; Ceramic and Inorganic Materials; Electronic-Magnetic Materials; Energy and Environment; Materials Characterization; Metallurgy; Polymers and Nanocomposites)