TY - GEN
T1 - Quantum Federated Learning
T2 - 17th IEEE/ACM International Conference on Utility and Cloud Computing, UCC 2024
AU - Qayyum, Tariq
AU - Khan, M. Waqas Haseeb
AU - Tariq, Asadullah
AU - Serhani, Mohamed
AU - Sallabi, Farag M.
AU - Trabelsi, Zouheir
AU - Taleb, Ikbal
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - In recent years, the growing demand for secure, efficient, and scalable machine learning has exposed significant challenges in traditional Federated Learning (FL) systems. These challenges include computational inefficiency on large-scale, high-dimensional data, privacy concerns, and the limitations of classical hardware in solving complex optimization problems. Additionally, the increasing complexity of data and the need for more robust privacy-preserving methods have pushed classical FL to its limits. To address these issues, this paper presents a detailed framework for Quantum Federated Learning (QFL), in which quantum computing and FL converge to enable distributed quantum model training with enhanced privacy preservation. The system is modeled with local quantum models on the clients and a global aggregation strategy at the server. We develop quantum gradient-based optimization with a quantum cross-entropy loss, privacy preservation through encryption, and robustness to quantum noise. A set of theorems, supported by rigorous mathematical formulations, establishes the correctness and efficiency of the model. We used IBM Qiskit for the simulations and compared the proposed QFL with classical FL. The results show that QFL outperforms classical FL in terms of accuracy, precision, recall, and F1 score.
AB - In recent years, the growing demand for secure, efficient, and scalable machine learning has exposed significant challenges in traditional Federated Learning (FL) systems. These challenges include computational inefficiency on large-scale, high-dimensional data, privacy concerns, and the limitations of classical hardware in solving complex optimization problems. Additionally, the increasing complexity of data and the need for more robust privacy-preserving methods have pushed classical FL to its limits. To address these issues, this paper presents a detailed framework for Quantum Federated Learning (QFL), in which quantum computing and FL converge to enable distributed quantum model training with enhanced privacy preservation. The system is modeled with local quantum models on the clients and a global aggregation strategy at the server. We develop quantum gradient-based optimization with a quantum cross-entropy loss, privacy preservation through encryption, and robustness to quantum noise. A set of theorems, supported by rigorous mathematical formulations, establishes the correctness and efficiency of the model. We used IBM Qiskit for the simulations and compared the proposed QFL with classical FL. The results show that QFL outperforms classical FL in terms of accuracy, precision, recall, and F1 score.
KW - Distributed Learning
KW - Privacy Preservation
KW - Quantum Federated Learning
KW - Quantum Gradient Descent
KW - Quantum Noise
UR - http://www.scopus.com/inward/record.url?scp=105004740453&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=105004740453&partnerID=8YFLogxK
U2 - 10.1109/UCC63386.2024.00053
DO - 10.1109/UCC63386.2024.00053
M3 - Conference contribution
AN - SCOPUS:105004740453
T3 - Proceedings - 2024 IEEE/ACM 17th International Conference on Utility and Cloud Computing, UCC 2024
SP - 327
EP - 335
BT - Proceedings - 2024 IEEE/ACM 17th International Conference on Utility and Cloud Computing, UCC 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 16 December 2024 through 19 December 2024
ER -
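
The abstract above describes local variational quantum models trained with quantum gradient-based optimization on a quantum cross-entropy loss and combined through server-side aggregation. Purely as an illustration of that training loop, the following is a minimal, hypothetical NumPy sketch of federated averaging over a single-qubit RY classifier. It deliberately avoids any particular Qiskit API version, substitutes an ordinary binary cross-entropy for the paper's quantum cross-entropy, and none of its names, hyperparameters, or structure come from the authors' implementation.

# Hypothetical sketch: federated averaging over a single-qubit variational
# quantum model, simulated with NumPy. Illustrative only; not the
# authors' QFL implementation.
import numpy as np

rng = np.random.default_rng(0)

def prob_one(theta, x):
    # P(|1>) after applying RY(theta[0]*x + theta[1]) to |0>: sin^2(angle/2).
    return np.sin((theta[0] * x + theta[1]) / 2.0) ** 2

def bce_loss(theta, xs, ys, eps=1e-9):
    # Binary cross-entropy between labels and measurement probabilities.
    p = np.clip(prob_one(theta, xs), eps, 1.0 - eps)
    return -np.mean(ys * np.log(p) + (1.0 - ys) * np.log(1.0 - p))

def grads(theta, xs, ys, eps=1e-9):
    # dP/d(angle) from the exact parameter-shift rule for an RY gate,
    # then the chain rule through the loss and the angle = w*x + b encoding.
    angle = theta[0] * xs + theta[1]
    p = np.clip(np.sin(angle / 2.0) ** 2, eps, 1.0 - eps)
    dp_da = (np.sin((angle + np.pi / 2) / 2.0) ** 2
             - np.sin((angle - np.pi / 2) / 2.0) ** 2) / 2.0
    dl_dp = -(ys / p - (1.0 - ys) / (1.0 - p)) / len(xs)
    dl_da = dl_dp * dp_da
    return np.array([np.sum(dl_da * xs), np.sum(dl_da)])

def local_update(theta, xs, ys, lr=0.5, steps=25):
    # Client-side gradient descent on the client's private shard.
    theta = theta.copy()
    for _ in range(steps):
        theta -= lr * grads(theta, xs, ys)
    return theta

# Synthetic private shards: label 1 whenever x > 0.5.
clients = []
for _ in range(4):
    xs = rng.uniform(0.0, 1.0, size=32)
    clients.append((xs, (xs > 0.5).astype(float)))

global_theta = np.array([0.1, 0.1])
for rnd in range(5):
    # Each client trains locally; the server averages the returned parameters (FedAvg).
    local_thetas = [local_update(global_theta, xs, ys) for xs, ys in clients]
    global_theta = np.mean(local_thetas, axis=0)
    avg_loss = np.mean([bce_loss(global_theta, xs, ys) for xs, ys in clients])
    print(f"round {rnd}: global loss = {avg_loss:.4f}")

The two-parameter model, the synthetic shards, and the round/step counts are placeholders; the only point is the split the abstract describes between client-side quantum-gradient updates and server-side parameter averaging.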