Adaptive Local Differential Privacy Approach for Privacy-Preserving Machine Learning

Vipin Vijayachandran, R. Suchithra

Abstract

We live in a world where data is continuously generated and collected from IoT devices, browsers, applications, and mobile devices, and the demand for data-processing techniques that preserve individual privacy grows day by day. The optimal approach to preserving privacy varies by application and requirements; common methods include anonymization, homomorphic encryption, and differential privacy. Differential privacy is considered one of the principal methods for ensuring the privacy of individuals in a dataset. This research explores the application of local differential privacy in machine learning: how to adjust the privacy vs. utility trade-off based on the importance of features in the dataset, and how a feedback mechanism from the aggregator can be used to set the privacy budget. The aim is to develop a sample model using an adaptive differential privacy method that utilizes feedback from the aggregator. We propose AdaDPriv, an approach that combines a careful feature selection process with calibrated noise addition to minimize the privacy vs. utility trade-off, and we assess its advantages in practical scenarios. The results suggest that the method provides plausible deniability for individuals in the event of a data leak, while allowing analyses to produce outputs close to those obtained from the original dataset.
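The abstract does not disclose AdaDPriv's implementation, but the core idea it describes, allocating a per-feature privacy budget by feature importance and perturbing values locally before they reach the aggregator, can be sketched in a few lines. The following is a minimal illustration, not the authors' method: the function names, the Laplace mechanism, and the proportional budget allocation are all assumptions made for clarity.

```python
import math
import random

def allocate_budget(importances, total_epsilon):
    """Split a total privacy budget epsilon across features in proportion
    to their (hypothetical) importance scores: a more important feature
    receives a larger share of epsilon and therefore less noise."""
    total = sum(importances)
    return [total_epsilon * w / total for w in importances]

def laplace_noise(scale):
    """Draw one sample from Laplace(0, scale) via inverse-transform
    sampling on a uniform draw in (-0.5, 0.5)."""
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def perturb_record(record, importances, total_epsilon, sensitivity=1.0):
    """Locally perturb one record before it leaves the client: each
    feature i gets Laplace noise with scale sensitivity / epsilon_i,
    so a larger per-feature budget yields noise closer to zero."""
    epsilons = allocate_budget(importances, total_epsilon)
    return [x + laplace_noise(sensitivity / eps)
            for x, eps in zip(record, epsilons)]
```

In an adaptive scheme such as the one the abstract outlines, the aggregator's feedback would update the importance scores (and hence the budget split) between collection rounds; that feedback loop is omitted here.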
