Journal
IEEE TRANSACTIONS ON COMPUTERS
Volume 72, Issue 2, Pages 548-558
Publisher
IEEE COMPUTER SOC
DOI: 10.1109/TC.2022.3169436
Keywords
Blockchains; Peer-to-peer computing; Servers; Training; Privacy; Convergence; Machine learning; Blockchain; byzantine resilience; decentralized learning; privacy preservation
This paper proposes SPDL, a blockchain-secured and privacy-preserving decentralized learning system that ensures efficient machine learning while maintaining data privacy, Byzantine fault tolerance, transparency, and traceability.
Decentralized learning involves training machine learning models over remote mobile devices, edge servers, or cloud servers while keeping data localized. Although many studies have shown the feasibility of preserving privacy, enhancing training performance, or introducing Byzantine resilience, none of them considers all three simultaneously. We therefore face the following problem: how can we efficiently coordinate the decentralized learning process while simultaneously maintaining learning security and data privacy for the entire system? To address this issue, in this paper we propose SPDL, a blockchain-secured and privacy-preserving decentralized learning system. SPDL seamlessly integrates blockchain, Byzantine Fault-Tolerant (BFT) consensus, a BFT Gradients Aggregation Rule (GAR), and differential privacy into one system, ensuring efficient machine learning while maintaining data privacy, Byzantine fault tolerance, transparency, and traceability. To validate our approach, we provide a rigorous analysis of convergence and regret in the presence of Byzantine nodes. We also build an SPDL prototype and conduct extensive experiments to demonstrate that SPDL is effective and efficient with strong security and privacy guarantees.
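The abstract does not specify which BFT Gradients Aggregation Rule or differential-privacy mechanism SPDL uses, so the following is only an illustrative sketch of how such a combination can look in general: each peer clips its gradient (bounding sensitivity) and adds Gaussian noise locally for differential privacy, and the aggregator applies a coordinate-wise median, one well-known Byzantine-resilient aggregation rule. The function name and parameters are hypothetical, not taken from the paper.

```python
import numpy as np

def dp_byzantine_aggregate(gradients, clip_norm=1.0, noise_sigma=0.1, rng=None):
    """Illustrative DP + Byzantine-resilient aggregation (not SPDL's actual rule).

    gradients : list of 1-D numpy arrays, one per peer (some may be Byzantine).
    clip_norm : L2 clipping bound, limiting each peer's sensitivity.
    noise_sigma : std. dev. of the Gaussian noise added for differential privacy.
    """
    rng = rng or np.random.default_rng(0)
    noisy = []
    for g in gradients:
        # Clip each gradient to bound its L2 norm (sensitivity control).
        norm = np.linalg.norm(g)
        g = g * min(1.0, clip_norm / (norm + 1e-12))
        # Add Gaussian noise locally before sharing (differential privacy).
        noisy.append(g + rng.normal(0.0, noise_sigma, size=g.shape))
    # Coordinate-wise median tolerates a minority of Byzantine contributions.
    return np.median(np.stack(noisy), axis=0)

# Four honest peers report gradients near [1, -1]; one Byzantine peer
# sends an adversarial update in the opposite direction.
honest = [np.array([1.0, -1.0]) for _ in range(4)]
byzantine = [np.array([-1e6, 1e6])]
agg = dp_byzantine_aggregate(honest + byzantine, noise_sigma=0.01)
```

Because the median ignores the extreme Byzantine coordinate values, the aggregate stays close to the (clipped) honest gradient direction despite the attack.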