Undergraduate thesis exploring privacy-utility trade-offs in federated learning using homomorphic encryption, differential privacy, and a novel sequential hybrid approach.
This undergraduate thesis systematically evaluates software-based privacy mechanisms in federated learning. We implemented and compared five FL configurations — baseline, homomorphic encryption (HE), differential privacy (DP), standard hybrid (simultaneous HE+DP), and a novel sequential hybrid approach (DP during training, HE during aggregation) — on CIFAR-10 image classification. The sequential hybrid is our key contribution: by temporally separating privacy mechanisms, it prevents error compounding and improves learning stability over the standard simultaneous application. All experiments were containerized with Docker for reproducibility.
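The sequential hybrid's ordering can be sketched in a small numpy toy: differential privacy (clipping plus calibrated Gaussian noise) is applied to each client update during training, and FedAvg aggregation happens afterwards on the server, which is where the CKKS encryption would sit in the real pipeline. Function names and parameter values here are illustrative, not the thesis code.

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_update(update, clip_norm=1.0, noise_mult=1.1, rng=rng):
    """DP step applied during local training: clip the update's L2 norm,
    then add Gaussian noise calibrated to noise_mult * clip_norm."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / norm)
    return clipped + rng.normal(0.0, noise_mult * clip_norm, size=update.shape)

def fedavg(updates, weights):
    """Server-side FedAvg: weighted mean of client updates. In the sequential
    hybrid, these updates would arrive CKKS-encrypted and the weighted sum
    would be computed homomorphically before decryption."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    return sum(w * u for w, u in zip(weights, updates))

# Three toy clients with unequal dataset sizes.
client_updates = [dp_update(rng.normal(size=8)) for _ in range(3)]
global_update = fedavg(client_updates, weights=[100, 200, 300])
```

Because the noise is injected before aggregation and the encryption wraps only the aggregation step, the two mechanisms never perturb the same tensor at the same time, which is the error-compounding separation the thesis exploits.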
Baseline Accuracy: 68.34%
HE Accuracy: 57.90%
DP Accuracy: 35.05%
Sequential Hybrid Accuracy: 33.51%
HE non-IID Retention: 92.1%
HE Communication Reduction: 94%
Built a modular FL framework in PyTorch with a client-server architecture using FedAvg. Implemented DP via Opacus (gradient clipping + calibrated Gaussian noise) and HE via Pyfhel (CKKS scheme with parameter quantization and chunking). Evaluated across IID and non-IID (Dirichlet α=0.5, α=0.1) data distributions with 10 clients. Privacy was measured through membership inference attacks (threshold-based and shadow model approaches). Resource-overhead tracking covered computation time, communication costs, and memory usage.
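The Dirichlet-based non-IID split mentioned above is a standard construction and can be sketched as follows: for each class, a Dirichlet(α) draw decides what fraction of that class each client receives, so small α yields heavily skewed clients and large α approaches IID. This is a minimal numpy sketch under that assumption; the thesis's actual partitioning code may differ.

```python
import numpy as np

def dirichlet_partition(labels, n_clients=10, alpha=0.5, seed=0):
    """Split sample indices across clients with label skew controlled by
    Dirichlet(alpha): small alpha -> highly non-IID, large alpha -> near-IID."""
    rng = np.random.default_rng(seed)
    client_idx = [[] for _ in range(n_clients)]
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        rng.shuffle(idx)
        # Fraction of class c assigned to each client.
        props = rng.dirichlet(alpha * np.ones(n_clients))
        cuts = (np.cumsum(props)[:-1] * len(idx)).astype(int)
        for client, part in enumerate(np.split(idx, cuts)):
            client_idx[client].extend(part.tolist())
    return client_idx

labels = np.repeat(np.arange(10), 500)  # toy stand-in for CIFAR-10 labels
parts = dirichlet_partition(labels, n_clients=10, alpha=0.5)
```

Re-running with α=0.1 instead of 0.5 produces the more extreme skew used in the second non-IID setting.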
Privacy-Utility Trade-offs in Federated Learning for 6G Networks: A Systematic Evaluation of Software-based Privacy Mechanisms
IEEE conference paper presenting the systematic evaluation framework and sequential hybrid approach for privacy-preserving FL in next-generation networks.
Exploring Federated Learning and Encryption — Undergraduate Thesis
Full 33-page thesis with comprehensive background, methodology, results, and discussion on privacy-preserving federated learning.