Enhanced Privacy and Communication Efficiency in Non-IID Federated Learning With Adaptive Quantization and Differential Privacy

dc.contributor.author: Ardic, Emre
dc.contributor.author: Genç, Yakup
dc.date.accessioned: 2025-10-29T11:15:55Z
dc.date.issued: 2025
dc.department: Faculties, Faculty of Engineering, Department of Computer Engineering
dc.description.abstract: Federated learning (FL) is a distributed machine learning method in which multiple devices collaboratively train a model under the management of a central server without sharing their underlying data. A key challenge of FL is the communication bottleneck caused by variations in connection speed and bandwidth across devices, making it essential to reduce the size of the data transmitted during training. Additionally, sensitive information may be exposed through analysis of the model or its gradients during training. To address both privacy and communication efficiency, we combine differential privacy (DP) with adaptive quantization methods. We use Laplacian-based DP to preserve privacy, which is relatively underexplored in FL and offers tighter privacy guarantees than Gaussian-based DP. We propose a simple and efficient global bit-length scheduler using round-based cosine annealing, along with a client-based scheduler that adapts dynamically to each client's contribution, estimated through dataset entropy analysis. We evaluate our approach through extensive experiments on CIFAR10, MNIST, and medical imaging datasets, using non-IID data distributions across varying client counts, bit-length schedulers, and privacy budgets. The results show that our adaptive quantization methods reduce total communicated data by up to 52.64% for MNIST, 45.06% for CIFAR10, and 31% to 37% for medical imaging datasets compared to 32-bit float training, while maintaining competitive model accuracy and ensuring robust privacy through DP.
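The abstract's pipeline, a cosine-annealed global bit-length schedule, uniform quantization of model updates, and Laplace-mechanism noise for DP, can be sketched as below. This is a minimal illustration of the general techniques named in the abstract; the function names, parameter values (e.g., the 4–16 bit range), and quantizer design are hypothetical and not taken from the authors' implementation.

```python
import math
import numpy as np

def bit_length(round_idx, total_rounds, b_min=4, b_max=16):
    """Round-based cosine annealing of the global bit length:
    starts at b_max and decays toward b_min as training progresses.
    The b_min/b_max defaults here are illustrative assumptions."""
    cos_factor = (1 + math.cos(math.pi * round_idx / total_rounds)) / 2
    return int(round(b_min + (b_max - b_min) * cos_factor))

def quantize(update, bits):
    """Uniform symmetric quantization of a model update to `bits` bits,
    then dequantization back to floats for transmission accounting."""
    scale = float(np.max(np.abs(update))) or 1.0  # avoid division by zero
    levels = 2 ** (bits - 1) - 1                  # signed integer range
    q = np.round(update / scale * levels)
    return q * scale / levels

def add_laplace_noise(update, sensitivity, epsilon, rng=None):
    """Laplace mechanism for epsilon-DP: noise scale b = sensitivity / epsilon."""
    rng = rng or np.random.default_rng()
    return update + rng.laplace(0.0, sensitivity / epsilon, size=update.shape)

# Illustrative per-round client step: add DP noise, then quantize
# with the bit length scheduled for this round.
def client_update(update, round_idx, total_rounds, sensitivity=1.0, epsilon=1.0):
    noisy = add_laplace_noise(update, sensitivity, epsilon)
    return quantize(noisy, bit_length(round_idx, total_rounds))
```

With the default 4–16 bit range, the schedule emits 16 bits at round 0, 10 bits at the midpoint, and 4 bits at the final round, which is how a cosine schedule trades early-training precision for late-training bandwidth savings.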
dc.identifier.doi: 10.1109/ACCESS.2025.3554138
dc.identifier.endpage: 54337
dc.identifier.issn: 2169-3536
dc.identifier.scopus: 2-s2.0-105002269348
dc.identifier.scopusquality: Q1
dc.identifier.startpage: 54322
dc.identifier.uri: https://doi.org/10.1109/ACCESS.2025.3554138
dc.identifier.uri: https://hdl.handle.net/20.500.14854/7340
dc.identifier.volume: 13
dc.identifier.wos: WOS:001462610000040
dc.identifier.wosquality: Q2
dc.indekslendigikaynak: Web of Science
dc.indekslendigikaynak: Scopus
dc.language.iso: en
dc.publisher: IEEE-Inst Electrical Electronics Engineers Inc
dc.relation.ispartof: IEEE Access
dc.relation.publicationcategory: Article - International Peer-Reviewed Journal - Institutional Academic Staff
dc.rights: info:eu-repo/semantics/openAccess
dc.snmz: KA_WOS_20251020
dc.subject: Quantization (signal)
dc.subject: Training
dc.subject: Privacy
dc.subject: Servers
dc.subject: Computational modeling
dc.subject: Adaptation models
dc.subject: Accuracy
dc.subject: Differential privacy
dc.subject: Entropy
dc.subject: Data models
dc.subject: Federated learning
dc.subject: adaptive quantization
dc.subject: differential privacy
dc.subject: non-IID distribution
dc.title: Enhanced Privacy and Communication Efficiency in Non-IID Federated Learning With Adaptive Quantization and Differential Privacy
dc.type: Article
