TY - CHAP
T1 - Privacy Preserving Multi-party Learning Framework for Enhanced Personalized Neuro-Rehabilitation
AU - Masud, Mohammad M.
AU - Rasheed, Ashika Sameem Abdul
AU - Zhu, Xiaojie
AU - Al-Rajab, Murad
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.
PY - 2024
Y1 - 2024
N2 - Many challenging problems in neuro-rehabilitation require building effective machine learning models for decision support. Examples include abnormal foot movement detection, fall detection or prediction, upper limb activity detection, and gesture recognition. The data collected at a rehabilitation facility may be used to train such models, and a better model can be built if training data from multiple rehabilitation facilities are combined. However, data sharing between facilities poses privacy issues. To address this, researchers proposed federated learning (FL), which exploits data from multiple facilities to build a global model by sharing only the locally trained models, without sharing the data themselves. However, it has recently been shown that model sharing may also lead to privacy leakage. To address this issue, fully homomorphic encryption (FHE) based FL has been proposed, where a facility (or client) encrypts its locally trained model before sending it to the FL server. The server aggregates the encrypted models using FHE and shares the encrypted aggregated model with the clients. This approach effectively prevents privacy leakage. In this work we identify several trade-offs of this approach by simulating scenarios with varying numbers of clients, encryption strategies, and base machine learning models. The results indicate higher running time and memory requirements as the number of clients and the number of encrypted model parameters grow, with little or no reduction in the precision of the learned model compared to the classical federated learning paradigm.
AB - Many challenging problems in neuro-rehabilitation require building effective machine learning models for decision support. Examples include abnormal foot movement detection, fall detection or prediction, upper limb activity detection, and gesture recognition. The data collected at a rehabilitation facility may be used to train such models, and a better model can be built if training data from multiple rehabilitation facilities are combined. However, data sharing between facilities poses privacy issues. To address this, researchers proposed federated learning (FL), which exploits data from multiple facilities to build a global model by sharing only the locally trained models, without sharing the data themselves. However, it has recently been shown that model sharing may also lead to privacy leakage. To address this issue, fully homomorphic encryption (FHE) based FL has been proposed, where a facility (or client) encrypts its locally trained model before sending it to the FL server. The server aggregates the encrypted models using FHE and shares the encrypted aggregated model with the clients. This approach effectively prevents privacy leakage. In this work we identify several trade-offs of this approach by simulating scenarios with varying numbers of clients, encryption strategies, and base machine learning models. The results indicate higher running time and memory requirements as the number of clients and the number of encrypted model parameters grow, with little or no reduction in the precision of the learned model compared to the classical federated learning paradigm.
UR - http://www.scopus.com/inward/record.url?scp=85214021873&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85214021873&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-77584-0_89
DO - 10.1007/978-3-031-77584-0_89
M3 - Chapter
AN - SCOPUS:85214021873
T3 - Biosystems and Biorobotics
SP - 456
EP - 460
BT - Biosystems and Biorobotics
PB - Springer Science and Business Media Deutschland GmbH
ER -
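
The abstract describes FHE-based federated averaging: each client encrypts its locally trained model, the server aggregates the ciphertexts, and the clients decrypt the aggregated model. Below is a minimal sketch of that flow in Python using the CKKS scheme from the TenSEAL library; the chapter does not name a specific library, encryption parameters, or model size, so the library choice, the CKKS settings, and the toy 4-parameter local models are illustrative assumptions only, not the authors' implementation.

import numpy as np
import tenseal as ts

# Shared CKKS context (illustrative parameters); in a real deployment the
# secret key would remain with the clients, not with the FL server.
ctx = ts.context(ts.SCHEME_TYPE.CKKS,
                 poly_modulus_degree=8192,
                 coeff_mod_bit_sizes=[60, 40, 40, 60])
ctx.global_scale = 2 ** 40

def client_update(local_weights):
    # Client side: encrypt the locally trained model parameters before upload.
    return ts.ckks_vector(ctx, local_weights.tolist())

def server_aggregate(encrypted_models):
    # Server side: federated averaging performed directly on ciphertexts,
    # using only homomorphic addition and plaintext scalar multiplication.
    agg = encrypted_models[0]
    for enc in encrypted_models[1:]:
        agg = agg + enc
    return agg * (1.0 / len(encrypted_models))

# Three simulated rehabilitation facilities, each with a toy 4-parameter model.
local_models = [np.random.randn(4) for _ in range(3)]
encrypted = [client_update(w) for w in local_models]
global_enc = server_aggregate(encrypted)

# Each client decrypts the aggregated model locally with the secret key.
print("plaintext average:    ", np.round(np.mean(local_models, axis=0), 4))
print("decrypted FHE average:", np.round(global_enc.decrypt(), 4))

The decrypted average should match the plaintext average up to the small approximation error inherent to CKKS, which is consistent with the abstract's observation of little or no reduction in model precision relative to classical federated learning.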