Artificial Intelligence (AI) has entered a big-data era dominated by deep learning. Machine learning on big data has driven AI's rapid progress, but it has also introduced a series of security risks. These risks stem from the learning mechanism of deep learning itself, and arise both in the model-building (training) stage and in the inference and deployment stage. If such vulnerabilities are abused, intentionally or not, the consequences can be severe.

Federated learning is a machine learning approach featuring **privacy protection** and **local data storage and computation**.
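Many of the papers below build on Federated Averaging (FedAvg), where clients train locally and a server averages their weights. As a concrete reference point, here is a minimal sketch; the toy linear "model" and all function names are illustrative, not any real framework's API.

```python
# Minimal FedAvg sketch: clients train on private local data and send
# only model weights; the server computes a data-size-weighted average.
import numpy as np

def local_update(weights, data, targets, lr=0.1, epochs=1):
    """One client's local training step: plain linear-regression gradient descent."""
    w = weights.copy()
    for _ in range(epochs):
        grad = data.T @ (data @ w - targets) / len(targets)
        w -= lr * grad
    return w

def fedavg_round(global_w, clients):
    """One communication round: local training, then weighted averaging."""
    updates, sizes = [], []
    for data, targets in clients:
        updates.append(local_update(global_w, data, targets))
        sizes.append(len(targets))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):  # three clients, each holding private local data
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(100):
    w = fedavg_round(w, clients)
print(np.round(w, 2))  # converges toward [ 2. -1.]
```

Note that the raw data `X` never leaves its client; only the weight vectors cross the (simulated) network boundary.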
- Federated Learning Comic
- Federated Learning: Collaborative Machine Learning without Centralized Training Data
- GDPR, Data Shortage and AI (AAAI-19)
- Federated Learning: Machine Learning on Decentralized Data (Google I/O'19)
- Federated Learning White Paper V1.0
- Federated learning: distributed machine learning with data locality and privacy
- Federated Learning: Challenges, Methods, and Future Directions
- Federated Learning Systems: Vision, Hype and Reality for Data Privacy and Protection
- Federated Learning in Mobile Edge Networks: A Comprehensive Survey
- Federated Learning for Wireless Communications: Motivation, Opportunities and Challenges
- Convergence of Edge Computing and Deep Learning: A Comprehensive Survey
- Advances and Open Problems in Federated Learning
- Federated Machine Learning: Concept and Applications
- Threats to Federated Learning: A Survey
- Survey of Personalization Techniques for Federated Learning
- [LEAF: A Benchmark for Federated Settings](https://github.com/TalwalkarLab/leaf) [Recommend]
- A Performance Evaluation of Federated Learning Algorithms
- Edge AIBench: Towards Comprehensive End-to-end Edge Computing Benchmarking
- One-Shot Federated Learning
- Federated Learning with Unbiased Gradient Aggregation and Controllable Meta Updating (NIPS 2019 Workshop)
- Bayesian Nonparametric Federated Learning of Neural Networks (ICML 2019)
- Agnostic Federated Learning (ICML 2019)
- Federated Learning with Matched Averaging (ICLR 2020)
- Astraea: Self-balancing federated learning for improving classification accuracy of mobile deep learning applications
- FedPD: A Federated Learning Framework with Optimal Rates and Adaptivity to Non-IID Data
- Decentralized Learning of Generative Adversarial Networks from Non-iid Data
- Towards Class Imbalance in Federated Learning
- Communication-Efficient On-Device Machine Learning: Federated Distillation and Augmentation under Non-IID Private Data
- Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization
- Federated Adversarial Domain Adaptation
- Federated Learning with Only Positive Labels
- Federated Learning with Non-IID Data
- The Non-IID Data Quagmire of Decentralized Machine Learning
- Robust and Communication-Efficient Federated Learning from Non-IID Data [IEEE Transactions on Neural Networks and Learning Systems]
- FedMD: Heterogenous Federated Learning via Model Distillation (NIPS 2019 Workshop)
- First Analysis of Local GD on Heterogeneous Data
- SCAFFOLD: Stochastic Controlled Averaging for On-Device Federated Learning
- Federated Optimization for Heterogeneous Networks
- On the Convergence of FedAvg on Non-IID Data [OpenReview]
- Agnostic Federated Learning (ICML 2019)
- Local SGD Converges Fast and Communicates Little
- Improving Federated Learning Personalization via Model Agnostic Meta Learning (NIPS 2019 Workshop)
- Adaptive Gradient-Based Meta-Learning Methods (NIPS 2019 Workshop)
- Federated Adversarial Domain Adaptation (ICLR 2020)
- LoAdaBoost: Loss-Based AdaBoost Federated Machine Learning on Medical Data
- On Federated Learning of Deep Networks from Non-IID Data: Parameter Divergence and the Effects of Hyperparametric Methods [Rejected in ICML 2020]
- Overcoming Forgetting in Federated Learning on Non-IID Data [NIPS 2019 Workshop]
- FedMAX: Activation Entropy Maximization Targeting Effective Non-IID Federated Learning [NIPS 2019 Workshop]
- Measuring the Effects of Non-Identical Data Distribution for Federated Visual Classification [NIPS 2019 Workshop]
- Fair Resource Allocation in Federated Learning
- Communication-efficient on-device machine learning: Federated distillation and augmentation under non-iid private data
- Think Locally, Act Globally: Federated Learning with Local and Global Representations [NIPS 2019 Workshop]
- Federated Meta-Learning with Fast Convergence and Efficient Communication
- Federated Meta-Learning for Recommendation
- Adaptive Gradient-Based Meta-Learning Methods
- MOCHA: Federated Multi-Task Learning [NIPS 2017] [Slides]
- Variational Federated Multi-Task Learning
- Federated Kernelized Multi-Task Learning
- Clustered Federated Learning: Model-Agnostic Distributed Multi-Task Optimization under Privacy Constraints [NIPS 2019 Workshop]
- A Linear Speedup Analysis of Distributed Deep Learning with Sparse and Quantized Communication [NIPS 2018]
- FetchSGD: Communication-Efficient Federated Learning with Sketching
- Federated Learning for Wireless Communications: Motivation, Opportunities and Challenges
- On the Convergence of FedAvg on Non-IID Data
- SCAFFOLD: Stochastic Controlled Averaging for On-Device Federated Learning
- Federated Optimization for Heterogeneous Networks
- On the Convergence of FedAvg on Non-IID Data [OpenReview]
- Can Decentralized Algorithms Outperform Centralized Algorithms? A Case Study for Decentralized Parallel Stochastic Gradient Descent [NIPS 2017]
- Communication Efficient Decentralized Training with Multiple Local Updates
- First Analysis of Local GD on Heterogeneous Data
- MATCHA: Speeding Up Decentralized SGD via Matching Decomposition Sampling
- Local SGD Converges Fast and Communicates Little
- SlowMo: Improving Communication-Efficient Distributed SGD with Slow Momentum
- Adaptive Federated Learning in Resource Constrained Edge Computing Systems [IEEE Journal on Selected Areas in Communications, 2019]
- Parallel Restarted SGD with Faster Convergence and Less Communication: Demystifying Why Model Averaging Works for Deep Learning [AAAI 2018]
- On the Linear Speedup Analysis of Communication Efficient Momentum SGD for Distributed Non-Convex Optimization [ICML 2019]
- Communication-efficient on-device machine learning: Federated distillation and augmentation under non-iid private data
- Convergence of Distributed Stochastic Variance Reduced Methods without Sampling Extra Data [NIPS 2019 Workshop]
- Towards Federated Learning at Scale: System Design [Must Read]
- Demonstration of Federated Learning in a Resource-Constrained Networked Environment
- Federated Learning Systems: Vision, Hype and Reality for Data Privacy and Protection
- Applied Federated Learning: Improving Google Keyboard Query Suggestions
- Federated Learning and Differential Privacy: Software tools analysis, the Sherpa.ai FL framework and methodological guidelines for preserving data privacy (Startup)
- [Communication-Efficient Learning of Deep Networks from Decentralized Data](https://github.com/roxanneluo/Federated-Learning) [Google] [Must Read]
- Two-Stream Federated Learning: Reduce the Communication Costs [2018 IEEE VCIP]
- Client-Edge-Cloud Hierarchical Federated Learning
- PowerSGD: Practical Low-Rank Gradient Compression for Distributed Optimization [NIPS 2019], Thijs Vogels, Sai Praneeth Karimireddy, and Martin Jaggi.
- Deep Gradient Compression: Reducing the Communication Bandwidth for Distributed Training [ICLR 2018] Yujun Lin, Song Han, Huizi Mao, Yu Wang, and William J Dally
- The Error-Feedback Framework: Better Rates for SGD with Delayed Gradients and Compressed Communication Sebastian U Stich and Sai Praneeth Karimireddy, 2019.
- A Communication Efficient Collaborative Learning Framework for Distributed Features [NIPS 2019 Workshop]
- Active Federated Learning [NIPS 2019 Workshop]
- Communication-Efficient Distributed Optimization in Networks with Gradient Tracking and Variance Reduction [NIPS 2019 Workshop]
- Gradient Descent with Compressed Iterates [NIPS 2019 Workshop]
- Robust and Communication-Efficient Federated Learning from Non-IID Data, 2019
- Expanding the Reach of Federated Learning by Reducing Client Resource Requirements Sebastian Caldas, Jakub Konecny, H Brendan McMahan, and Ameet Talwalkar, 2018
- Federated Learning: Strategies for Improving Communication Efficiency [NIPS2016 Workshop] [Google]
- Natural Compression for Distributed Deep Learning Samuel Horvath, Chen-Yu Ho, Ludovit Horvath, Atal Narayan Sahu, Marco Canini, and Peter Richtarik, 2019.
- FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization, 2019
- ATOMO: Communication-efficient Learning via Atomic Sparsification [NIPS 2018], H. Wang, S. Sievert, S. Liu, Z. Charles, D. Papailiopoulos, and S. Wright.
- vqSGD: Vector Quantized Stochastic Gradient Descent Venkata Gandikota, Raj Kumar Maity, and Arya Mazumdar, 2019.
- QSGD: Communication-efficient SGD via gradient quantization and encoding [NIPS 2017], Dan Alistarh, Demjan Grubic, Jerry Li, Ryota Tomioka, and Milan Vojnovic.
- cpSGD: Communication-efficient and differentially-private distributed SGD
- Federated Optimization: Distributed Machine Learning for On-Device Intelligence [Google]
- Distributed Mean Estimation with Limited Communication [ICML 2017], Ananda Theertha Suresh, Felix X. Yu, Sanjiv Kumar, and H Brendan McMahan.
- Randomized Distributed Mean Estimation: Accuracy vs Communication Frontiers in Applied Mathematics and Statistics, Jakub Konecny and Peter Richtarik, 2016
- Error Feedback Fixes SignSGD and other Gradient Compression Schemes [ICML 2019], Sai Praneeth Karimireddy, Quentin Rebjock, Sebastian Stich, and Martin Jaggi.
- ZipML: Training Linear Models with End-to-End Low Precision, and a Little Bit of Deep Learning [ICML 2017], H. Zhang, J. Li, K. Kara, D. Alistarh, J. Liu, and C. Zhang.
- eSGD: Communication Efficient Distributed Deep Learning on the Edge [USENIX 2018 Workshop (HotEdge 18)]
- CMFL: Mitigating Communication Overhead for Federated Learning
- Communication Compression for Decentralized Training [NIPS 2018], H. Tang, S. Gan, C. Zhang, T. Zhang, and J. Liu.
- DeepSqueeze: Decentralization Meets Error-Compensated Compression Hanlin Tang, Xiangru Lian, Shuang Qiu, Lei Yuan, Ce Zhang, Tong Zhang, and Ji Liu, 2019
- Client Selection for Federated Learning with Heterogeneous Resources in Mobile Edge (FedCS)
- Hybrid-FL for Wireless Networks: Cooperative Learning Mechanism Using Non-IID Data
  - Asks clients to upload some data to the server
- Efficient Training Management for Mobile Crowd-Machine Learning: A Deep Reinforcement Learning Approach
  - Reward function: accumulated data, energy consumption, training accuracy
- Fair Resource Allocation in Federated Learning
- Low-latency Broadband Analog Aggregation For Federated Edge Learning
- Federated Learning over Wireless Fading Channels
- Federated Learning via Over-the-Air Computation
- Asynchronous Federated Learning for Geospatial Applications [ECML PKDD Workshop 2018]
- Asynchronous Federated Optimization
- Adaptive Federated Learning in Resource Constrained Edge Computing Systems [IEEE Journal on Selected Areas in Communications, 2019]
- Incentive Mechanism for Reliable Federated Learning: A Joint Optimization Approach to Combining Reputation and Contract Theory
- Motivating Workers in Federated Learning: A Stackelberg Game Perspective
- Incentive Design for Efficient Federated Learning in Mobile Networks: A Contract Theory Approach [2019]
- Fair Resource Allocation in Federated Learning
- A Quasi-Newton Method Based Vertical Federated Learning Framework for Logistic Regression [NIPS 2019 Workshop]
- Can You Really Backdoor Federated Learning?
- Model Poisoning Attacks in Federated Learning [NIPS workshop 2018]
- Gradient-Leaks: Understanding and Controlling Deanonymization in Federated Learning [NIPS 2019 Workshop]
- Quantification of the Leakage in Federated Learning
- A Brief Introduction to Differential Privacy
- Deep Learning with Differential Privacy
  - Martin Abadi, Andy Chu, Ian Goodfellow, H. Brendan McMahan, Ilya Mironov, Kunal Talwar, and Li Zhang.
- Learning Differentially Private Recurrent Language Models
- Federated Learning with Bayesian Differential Privacy [NIPS 2019 Workshop]
- Private Federated Learning with Domain Adaptation [NIPS 2019 Workshop]
- cpSGD: Communication-efficient and differentially-private distributed SGD
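The core mechanism shared by the DP papers above is the DP-SGD step of "Deep Learning with Differential Privacy" (Abadi et al.): clip each per-example gradient to a fixed L2 norm, then add Gaussian noise calibrated to that clipping bound. A sketch under illustrative assumptions (toy linear model, arbitrary parameter values; not the paper's setup):

```python
# DP-SGD sketch: per-example gradient clipping plus Gaussian noise.
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip=1.0, noise_mult=1.0, rng=None):
    rng = rng or np.random.default_rng()
    per_example_grads = []
    for xi, yi in zip(X, y):
        g = xi * (xi @ w - yi)           # per-example squared-loss gradient
        g = g / max(1.0, np.linalg.norm(g) / clip)  # clip to L2 norm <= clip
        per_example_grads.append(g)
    g_sum = np.sum(per_example_grads, axis=0)
    noise = rng.normal(0.0, noise_mult * clip, size=w.shape)
    return w - lr * (g_sum + noise) / len(X)  # noisy average gradient

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w
w = np.zeros(3)
for _ in range(500):
    w = dp_sgd_step(w, X, y, rng=rng)
print(np.round(w, 1))  # noisy, but near [ 1. -2.  0.5]
```

Clipping bounds any single example's influence on the update (sensitivity), which is what lets the added Gaussian noise yield a differential-privacy guarantee; the privacy accounting itself (moments accountant) is omitted here.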
- Simple Introduction to Shamir's Secret Sharing and Lagrange Interpolation
- Secret Sharing, Part 1: Shamir's Secret Sharing & Packed Variant
- Secret Sharing, Part 2: Improve efficiency
- Secret Sharing, Part 3
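The posts above can be made concrete with a toy Shamir (t, n) secret-sharing scheme over a small prime field: the secret is the constant term of a random degree-(t-1) polynomial, and any t shares recover it by Lagrange interpolation at x = 0. The prime and parameters below are illustrative only.

```python
# Toy Shamir secret sharing over Z_P (P prime).
import random

P = 2**31 - 1  # a Mersenne prime defining the field Z_P

def share(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0; pow(d, P-2, P) is the modular inverse."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(123456789, t=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares → 123456789
```

Fewer than t shares reveal nothing about the secret, which is why secret sharing underpins the dropout-tolerant secure aggregation protocols listed further below.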
- A Tutorial for Encrypted Deep Learning
  - Use Homomorphic Encryption (HE)
- Private Deep Learning with MPC
  - A Simple Tutorial from Scratch
  - Use Multiparty Computation (MPC)
- Private Image Analysis with MPC
  - Training CNNs on Sensitive Data
  - Use SPDZ as MPC protocol
- Helen: Maliciously Secure Coopetitive Learning for Linear Models (NIPS 2019 Workshop)
- Privacy-Preserving Deep Learning
- Privacy Partition: A Privacy-Preserving Framework for Deep Neural Networks in Edge Networks
- Practical Secure Aggregation for Privacy-Preserving Machine Learning (Google)
  - Secure aggregation: computing a multiparty sum where no party reveals its update in the clear, even to the aggregator
  - Goal: securely compute sums of vectors with a constant number of rounds, low communication overhead, robustness to failures, and only one server with limited trust
  - Assumes basic knowledge of cryptographic primitives such as secret sharing and key agreement
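The central trick behind these secure aggregation protocols is pairwise masking: each pair of clients (i, j) agrees on a shared random mask that client i adds and client j subtracts, so all masks cancel in the sum and the server learns only the aggregate. A minimal sketch; real protocols derive masks from key agreement and use secret sharing to survive dropouts, both omitted here.

```python
# Pairwise-masking sketch for secure aggregation.
import numpy as np

def masked_updates(updates, rng):
    n = len(updates)
    masked = [u.astype(float).copy() for u in updates]
    for i in range(n):
        for j in range(i + 1, n):
            # In the real protocol this mask comes from a shared key
            # (e.g. Diffie-Hellman), not from a trusted sampler.
            mask = rng.normal(size=updates[0].shape)
            masked[i] += mask
            masked[j] -= mask
    return masked

rng = np.random.default_rng(0)
updates = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
masked = masked_updates(updates, rng)
# Each masked update alone looks random, but the masks cancel pairwise,
# so the sum equals the sum of the plaintext updates.
print(np.round(sum(masked), 6))  # → [ 9. 12.]
```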
- Practical Secure Aggregation for Federated Learning on User-Held Data (Google)
  - Highly related to Practical Secure Aggregation for Privacy-Preserving Machine Learning
  - Proposes four protocols, one by one with gradual improvements, to meet the requirements of a secure aggregation protocol
- SecureML: A System for Scalable Privacy-Preserving Machine Learning
- Practical Secure Aggregation for Privacy-Preserving Machine Learning
- DeepSecure: Scalable Provably-Secure Deep Learning
- Chameleon: A Hybrid Secure Computation Framework for Machine Learning Applications
- Federated Reinforcement Learning
- Secure Federated Transfer Learning
- Federated Learning with Matched Averaging
- Bayesian Nonparametric Federated Learning of Neural Networks
- Multi-Center Federated Learning
- FedGAN: Federated Generative Adversarial Networks for Distributed Data
- Generative models for effective ML on private decentralized datasets
- FEDFMC: Sequential efficient Federated Learning on Non-IID data
- NIPS 2019 Workshop on Federated Learning for Data Privacy and Confidentiality 1
- NIPS 2019 Workshop on Federated Learning for Data Privacy and Confidentiality 2
- NIPS 2019 Workshop on Federated Learning for Data Privacy and Confidentiality 3
- Federated Learning Approach for Mobile Packet Classification
- Federated Learning for Ranking Browser History Suggestions [NIPS 2019 Workshop]
- HHHFL: Hierarchical Heterogeneous Horizontal Federated Learning for Electroencephalography [NIPS 2019 Workshop]
- Learn Electronic Health Records by Fully Decentralized Federated Learning [NIPS 2019 Workshop]
- Patient Clustering Improves Efficiency of Federated Machine Learning to predict mortality and hospital stay time using distributed Electronic Medical Records [News]
- MIT CSAI, Harvard Medical School, Tsinghua University
- Federated learning of predictive models from federated Electronic Health Records.
- Boston University, Massachusetts General Hospital
- FedHealth: A Federated Transfer Learning Framework for Wearable Healthcare
- Microsoft Research Asia
- Multi-Institutional Deep Learning Modeling Without Sharing Patient Data: A Feasibility Study on Brain Tumor Segmentation
- Intel
- NVIDIA Clara Federated Learning to Deliver AI to Hospitals While Protecting Patient Data
- Nvidia
- What is Federated Learning
- Nvidia
- Split learning for health: Distributed deep learning without sharing raw patient data
- Two-stage Federated Phenotyping and Patient Representation Learning [ACL 2019]
- Federated Tensor Factorization for Computational Phenotyping SIGKDD 2017
- FedHealth: A Federated Transfer Learning Framework for Wearable Healthcare [IJCAI19 workshop]
- Multi-Institutional Deep Learning Modeling Without Sharing Patient Data: A Feasibility Study on Brain Tumor Segmentation [MICCAI'18 Workshop]
- Federated Patient Hashing [AAAI'20]
- Federated Learning for Mobile Keyboard Prediction
- Applied Federated Learning: Improving Google Keyboard Query Suggestions
- Federated Learning Of Out-Of-Vocabulary Words
- Federated Learning for Emoji Prediction in a Mobile Keyboard
- Snips
- [Federated Learning for Wake Keyword Spotting](https://medium.com/snips-ai/federated-learning-for-wake-word-detection-c8b8c5cdd2c5) [[dataset]](https://github.com/snipsco/keyword-spotting-research-datasets)
- [Performance Optimization for Federated Person Re-identification via Benchmark Analysis](https://github.com/cap-ntu/FedReID) [ACMMM20]
- Real-World Image Datasets for Federated Learning
- Webank & Extreme Vision
- FedVision: An Online Visual Object Detection Platform Powered by Federated Learning [IAAI20]
- Federated Learning for Vision-and-Language Grounding Problems [AAAI20]
- Federated Collaborative Filtering for Privacy-Preserving Personalized Recommendation System
- Huawei
- Federated Meta-Learning with Fast Convergence and Efficient Communication
- Huawei
- Turbofan POC: Predictive Maintenance of Turbofan Engines using Federated Learning
- Turbofan Tycoon Simulation by Cloudera/FastForwardLabs
- Firefox Search Bar
- Detailed explanation of their implementation of Federated Learning in production.