Last updated on
18 March 2024

Publication details

F. Galli, S. Biswas, K. Jung, T. Cucinotta, C. Palamidessi. "Group Privacy for Personalized Federated Learning," in Proceedings of the 9th International Conference on Information Systems Security and Privacy (ICISSP 2023), February 22-24, 2023, Lisbon, Portugal.

Abstract

Federated learning (FL) is a type of distributed, collaborative machine learning in which participating clients process their data locally, sharing only updates of the training process. Generally, the goal is the privacy-aware optimization of a statistical model's parameters by minimizing a cost function over a collection of datasets stored locally by a set of clients. This process exposes the clients to two issues: leakage of private information and lack of personalization of the model. To mitigate the former, differential privacy and its variants serve as a standard for providing formal privacy guarantees. However, the clients often represent very heterogeneous communities and hold very diverse data. Therefore, aligned with the recent focus of the FL community on building a framework of personalized models representing the users' diversity, it is of utmost importance to protect the clients' sensitive and personal information against potential threats. To address this goal, we consider $d$-privacy (also known as metric privacy), a variant of local differential privacy that uses a metric-based obfuscation technique to preserve the topological distribution of the original data. To jointly protect the clients' privacy and allow for personalized model training, we propose a method that provides group privacy guarantees by exploiting key properties of $d$-privacy, enabling personalized models under the FL framework. We provide theoretical justification for the applicability of the method and experimental validation on real-world datasets illustrating how the proposed method works.
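As a rough illustration of the metric-based obfuscation underlying $d$-privacy (a sketch of the general technique, not the paper's exact mechanism), adding independent Laplace noise of scale $1/\epsilon$ to each coordinate of a record satisfies $\epsilon \cdot d_1$-privacy with respect to the Manhattan metric $d_1$: records that are close under the metric yield statistically similar noisy outputs. The function name and parameters below are illustrative.

```python
import numpy as np

def d_private_obfuscation(x, epsilon, rng=None):
    """Obfuscate a numeric record under d-privacy w.r.t. the L1 metric.

    Adding independent Laplace(0, 1/epsilon) noise to each coordinate
    guarantees that, for any two records x and x', the output densities
    differ by a factor of at most exp(epsilon * ||x - x'||_1). Nearby
    records thus remain nearly indistinguishable, while the overall
    topology of the dataset is approximately preserved.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    return x + rng.laplace(loc=0.0, scale=1.0 / epsilon, size=x.shape)

# In an FL setting, each client would obfuscate its local data this
# way before it influences anything shared with the server.
obfuscated = d_private_obfuscation([1.0, 2.0], epsilon=1.0)
```

Smaller values of `epsilon` inject more noise (stronger privacy, lower utility); larger values preserve the data more faithfully at the cost of weaker indistinguishability.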

Copyright by INSTICC.

See paper on publisher's website

Download paper

DOI: 10.5220/0011885000003405

BibTeX entry:

@inproceedings{Galli2023,
	doi = {10.5220/0011885000003405},
	url = {https://doi.org/10.5220/0011885000003405},
	year = 2023,
	publisher = {{SCITEPRESS} - Science and Technology Publications},
	author = {Filippo Galli and Sayan Biswas and Kangsoo Jung and Tommaso Cucinotta and Catuscia Palamidessi},
	title = {Group Privacy for Personalized Federated Learning},
	booktitle = {Proceedings of the 9th International Conference on Information Systems Security and Privacy}
}
