- Federated Learning Comic
- Federated Learning: Collaborative Machine Learning without Centralized Training Data
- GDPR, Data Shortage and AI (AAAI-19)
- Federated Learning: Machine Learning on Decentralized Data (Google I/O'19)
- Federated Learning White Paper V1.0
- Federated learning: distributed machine learning with data locality and privacy
- Federated Learning: Challenges, Methods, and Future Directions
- Federated Learning Systems: Vision, Hype and Reality for Data Privacy and Protection
- Federated Learning in Mobile Edge Networks: A Comprehensive Survey
- Federated Learning for Wireless Communications: Motivation, Opportunities and Challenges
- Convergence of Edge Computing and Deep Learning: A Comprehensive Survey
- Advances and Open Problems in Federated Learning
- Federated Machine Learning: Concept and Applications
- Threats to Federated Learning: A Survey
- Survey of Personalization Techniques for Federated Learning
- SECure: A Social and Environmental Certificate for AI Systems
- From Federated Learning to Fog Learning: Towards Large-Scale Distributed Machine Learning in Heterogeneous Wireless Networks
- Federated Learning for 6G Communications: Challenges, Methods, and Future Directions
- A Review of Privacy Preserving Federated Learning for Private IoT Analytics
- Towards Utilizing Unlabeled Data in Federated Learning: A Survey and Prospective
- Federated Learning for Resource-Constrained IoT Devices: Panoramas and State-of-the-art
- Privacy-Preserving Blockchain Based Federated Learning with Differential Data Sharing
- An Introduction to Communication Efficient Edge Machine Learning
- Federated Learning for Healthcare Informatics
- Federated Learning for Coalition Operations
- No Peek: A Survey of private distributed deep learning
- Communication-Efficient Edge AI: Algorithms and Systems
- LEAF: A Benchmark for Federated Settings (https://github.com/TalwalkarLab/leaf) [Recommended]
- A Performance Evaluation of Federated Learning Algorithms
- Edge AIBench: Towards Comprehensive End-to-end Edge Computing Benchmarking
- One-Shot Federated Learning
- Federated Learning with Unbiased Gradient Aggregation and Controllable Meta Updating (NIPS 2019 Workshop)
- Bayesian Nonparametric Federated Learning of Neural Networks (ICML 2019)
- FedBE: Making Bayesian Model Ensemble Applicable to Federated Learning (ICLR 2021)
- Agnostic Federated Learning (ICML 2019)
- Federated Learning with Matched Averaging (ICLR 2020)
- Astraea: Self-balancing federated learning for improving classification accuracy of mobile deep learning applications
- A Linear Speedup Analysis of Distributed Deep Learning with Sparse and Quantized Communication (NIPS 2018)
- Achieving Linear Speedup with Partial Worker Participation in Non-IID Federated Learning (ICLR 2021)
- FetchSGD: Communication-Efficient Federated Learning with Sketching
- FL-NTK: A Neural Tangent Kernel-based Framework for Federated Learning Convergence Analysis (ICML 2021)
- Federated Multi-armed Bandits with Personalization (AISTATS 2021)
- Federated Learning with Compression: Unified Analysis and Sharp Guarantees (AISTATS 2021)
- Convergence and Accuracy Trade-Offs in Federated Learning and Meta-Learning (AISTATS 2021)
- Towards Flexible Device Participation in Federated Learning (AISTATS 2021)
- Fed2: Feature-Aligned Federated Learning (KDD 2021)
- Federated Optimization for Heterogeneous Networks
- On the Convergence of FedAvg on Non-IID Data [OpenReview]
- Communication Efficient Decentralized Training with Multiple Local Updates
- Local SGD Converges Fast and Communicates Little
- SlowMo: Improving Communication-Efficient Distributed SGD with Slow Momentum
- Parallel Restarted SGD with Faster Convergence and Less Communication: Demystifying Why Model Averaging Works for Deep Learning (AAAI 2018)
- On the Linear Speedup Analysis of Communication Efficient Momentum SGD for Distributed Non-Convex Optimization (ICML 2019)
- Communication-efficient on-device machine learning: Federated distillation and augmentation under non-iid private data
- Convergence of Distributed Stochastic Variance Reduced Methods without Sampling Extra Data (NIPS 2019 Workshop)
- FedPD: A Federated Learning Framework with Optimal Rates and Adaptivity to Non-IID Data
- FedBN: Federated Learning on Non-IID Features via Local Batch Normalization (ICLR 2021)
- FedMix: Approximation of Mixup under Mean Augmented Federated Learning (ICLR 2021)
- HeteroFL: Computation and Communication Efficient Federated Learning for Heterogeneous Clients (ICLR 2021)
- FedRS: Federated Learning with Restricted Softmax for Label Distribution Non-IID Data (KDD 2021)
- FedMatch: Federated Learning Over Heterogeneous Question Answering Data (CIKM 2021)
- Decentralized Learning of Generative Adversarial Networks from Non-iid Data
- Towards Class Imbalance in Federated Learning
- Communication-Efficient On-Device Machine Learning: Federated Distillation and Augmentation under Non-IID Private Data
- Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization
- Federated Adversarial Domain Adaptation
- Federated Learning with Only Positive Labels
- Federated Learning with Non-IID Data
- The Non-IID Data Quagmire of Decentralized Machine Learning
- Robust and Communication-Efficient Federated Learning from Non-IID Data (IEEE transactions on neural networks and learning systems)
- FedMD: Heterogenous Federated Learning via Model Distillation (NIPS 2019 Workshop)
- First Analysis of Local GD on Heterogeneous Data
- SCAFFOLD: Stochastic Controlled Averaging for On-Device Federated Learning
- Improving Federated Learning Personalization via Model Agnostic Meta Learning (NIPS 2019 Workshop)
- Personalized Federated Learning with First Order Model Optimization (ICLR 2021)
- LoAdaBoost: Loss-Based AdaBoost Federated Machine Learning on Medical Data
- On Federated Learning of Deep Networks from Non-IID Data: Parameter Divergence and the Effects of Hyperparametric Methods
- Overcoming Forgetting in Federated Learning on Non-IID Data (NIPS 2019 Workshop)
- FedMAX: Activation Entropy Maximization Targeting Effective Non-IID Federated Learning (NIPS 2019 Workshop)
- Adaptive Federated Optimization (ICLR 2021)
- Stochastic, Distributed and Federated Optimization for Machine Learning (FL PhD thesis by Jakub Konečný)
- Collaborative Deep Learning in Fixed Topology Networks
- FedCD: Improving Performance in non-IID Federated Learning
- Lifelong Learning: FedFMC: Sequential Efficient Federated Learning on Non-IID Data
- Robust Federated Learning: The Case of Affine Distribution Shifts
- Exploiting Shared Representations for Personalized Federated Learning (ICML 2021)
- Personalized Federated Learning using Hypernetworks (ICML 2021)
- Ditto: Fair and Robust Federated Learning Through Personalization (ICML 2021)
- Data-Free Knowledge Distillation for Heterogeneous Federated Learning (ICML 2021)
- Bias-Variance Reduced Local SGD for Less Heterogeneous Federated Learning (ICML 2021)
- Heterogeneity for the Win: One-Shot Federated Clustering (ICML 2021)
- Clustered Sampling: Low-Variance and Improved Representativity for Clients Selection in Federated Learning (ICML 2021)
- Federated Deep AUC Maximization for Heterogeneous Data with a Constant Communication Complexity (ICML 2021)
- Federated Learning of User Verification Models Without Sharing Embeddings (ICML 2021)
- One for One, or All for All: Equilibria and Optimality of Collaboration in Federated Learning (ICML 2021)
- Ensemble Distillation for Robust Model Fusion in Federated Learning
- XOR Mixup: Privacy-Preserving Data Augmentation for One-Shot Federated Learning
- An Efficient Framework for Clustered Federated Learning
- Continual Local Training for Better Initialization of Federated Models
- FedPD: A Federated Learning Framework with Optimal Rates and Adaptivity to Non-IID Data
- Global Multiclass Classification from Heterogeneous Local Models
- Multi-Center Federated Learning
- Federated Semi-Supervised Learning with Inter-Client Consistency & Disjoint Learning (ICLR 2021)
- (*) FedMAX: Mitigating Activation Divergence for Accurate and Communication-Efficient Federated Learning (CMU ECE)
- (*) Adaptive Personalized Federated Learning
- Semi-Federated Learning
- Device Heterogeneity in Federated Learning: A Superquantile Approach
- Personalized Federated Learning for Intelligent IoT Applications: A Cloud-Edge based Framework
- Three Approaches for Personalization with Applications to Federated Learning
- Personalized Federated Learning: A Meta-Learning Approach
- Towards Federated Learning: Robustness Analytics to Data Heterogeneity
- Salvaging Federated Learning by Local Adaptation
- FOCUS: Dealing with Label Quality Disparity in Federated Learning
- Overcoming Noisy and Irrelevant Data in Federated Learning (ICPR 2020)
- Real-Time Edge Intelligence in the Making: A Collaborative Learning Framework via Federated Meta-Learning
- (*) Think Locally, Act Globally: Federated Learning with Local and Global Representations (NeurIPS 2019 Workshop on Federated Learning, distinguished student paper award)
- Federated Learning with Personalization Layers
- Federated Evaluation of On-device Personalization
- Measure Contribution of Participants in Federated Learning
- (*) Measuring the Effects of Non-Identical Data Distribution for Federated Visual Classification
- Multi-hop Federated Private Data Augmentation with Sample Compression
- Distributed Training with Heterogeneous Data: Bridging Median- and Mean-Based Algorithms
- High Dimensional Restrictive Federated Model Selection with multi-objective Bayesian Optimization over shifted distributions
- Robust Federated Learning Through Representation Matching and Adaptive Hyper-parameters
- Towards Efficient Scheduling of Federated Mobile Devices under Computational and Statistical Heterogeneity
- Client Adaptation improves Federated Learning with Simulated Non-IID Clients
- Asynchronous Federated Learning for Geospatial Applications (ECML PKDD Workshop 2018)
- Asynchronous Federated Optimization
- Adaptive Federated Learning in Resource Constrained Edge Computing Systems (IEEE Journal on Selected Areas in Communications, 2019)
- The Distributed Discrete Gaussian Mechanism for Federated Learning with Secure Aggregation (ICML 2021)
- Can You Really Backdoor Federated Learning? (NeurIPS 2019)
- Model Poisoning Attacks in Federated Learning (NIPS workshop 2018)
- An Overview of Federated Deep Learning Privacy Attacks and Defensive Strategies
- How To Backdoor Federated Learning (AISTATS 2020)
- Deep Models Under the GAN: Information Leakage from Collaborative Deep Learning (ACM CCS 2017)
- Byzantine-Robust Distributed Learning: Towards Optimal Statistical Rates
- Deep Leakage from Gradients (NIPS 2019)
- Comprehensive Privacy Analysis of Deep Learning: Passive and Active White-box Inference Attacks against Centralized and Federated Learning
- Beyond Inferring Class Representatives: User-Level Privacy Leakage From Federated Learning (INFOCOM 2019)
- Analyzing Federated Learning through an Adversarial Lens (ICML 2019)
- Mitigating Sybils in Federated Learning Poisoning (RAID 2020)
- RSA: Byzantine-Robust Stochastic Aggregation Methods for Distributed Learning from Heterogeneous Datasets (AAAI 2019)
- A Framework for Evaluating Gradient Leakage Attacks in Federated Learning
- Local Model Poisoning Attacks to Byzantine-Robust Federated Learning
- Backdoor Attacks on Federated Meta-Learning
- Towards Realistic Byzantine-Robust Federated Learning
- Data Poisoning Attacks on Federated Machine Learning
- Exploiting Defenses against GAN-Based Feature Inference Attacks in Federated Learning
- Byzantine-Resilient High-Dimensional SGD with Local Iterations on Heterogeneous Data
- FedMGDA+: Federated Learning meets Multi-objective Optimization
- Free-rider Attacks on Model Aggregation in Federated Learning (AISTATS 2021)
- FDA3: Federated Defense Against Adversarial Attacks for Cloud-Based IIoT Applications
- Privacy-preserving Weighted Federated Learning within Oracle-Aided MPC Framework
- BASGD: Buffered Asynchronous SGD for Byzantine Learning
- Stochastic-Sign SGD for Federated Learning with Theoretical Guarantees
- Learning to Detect Malicious Clients for Robust Federated Learning
- Robust Aggregation for Federated Learning
- Towards Deep Federated Defenses Against Malware in Cloud Ecosystems
- Attack-Resistant Federated Learning with Residual-based Reweighting
- Free-riders in Federated Learning: Attacks and Defenses
- Robust Federated Learning with Noisy Communication
- Abnormal Client Behavior Detection in Federated Learning
- Eavesdrop the Composition Proportion of Training Labels in Federated Learning
- Byzantine-Robust Federated Machine Learning through Adaptive Model Averaging
- An End-to-End Encrypted Neural Network for Gradient Updates Transmission in Federated Learning
- Secure Distributed On-Device Learning Networks With Byzantine Adversaries
- Robust Federated Training via Collaborative Machine Teaching using Trusted Instances
- Dancing in the Dark: Private Multi-Party Machine Learning in an Untrusted Setting
- Inverting Gradients - How easy is it to break privacy in federated learning?
- Gradient-Leaks: Understanding and Controlling Deanonymization in Federated Learning (NIPS 2019 Workshop)
- Quantification of the Leakage in Federated Learning
- Communication-Efficient Learning of Deep Networks from Decentralized Data (https://github.com/roxanneluo/Federated-Learning) [Google] [Must Read]
- Two-Stream Federated Learning: Reduce the Communication Costs (2018 IEEE VCIP)
- Federated Learning Based on Dynamic Regularization (ICLR 2021)
- Federated Learning via Posterior Averaging: A New Perspective and Practical Algorithms (ICLR 2021)
- Adaptive Federated Optimization (ICLR 2021)
- PowerSGD: Practical Low-Rank Gradient Compression for Distributed Optimization (NIPS 2019)
- Deep Gradient Compression: Reducing the Communication Bandwidth for Distributed Training (ICLR 2018)
- The Error-Feedback Framework: Better Rates for SGD with Delayed Gradients and Compressed Communication
- A Communication Efficient Collaborative Learning Framework for Distributed Features (NIPS 2019 Workshop)
- Active Federated Learning (NIPS 2019 Workshop)
- Communication-Efficient Distributed Optimization in Networks with Gradient Tracking and Variance Reduction (NIPS 2019 Workshop)
- Gradient Descent with Compressed Iterates (NIPS 2019 Workshop)
- LAG: Lazily Aggregated Gradient for Communication-Efficient Distributed Learning
- Exact Support Recovery in Federated Regression with One-shot Communication
- DEED: A General Quantization Scheme for Communication Efficiency in Bits
- Personalized Federated Learning with Moreau Envelopes
- Towards Flexible Device Participation in Federated Learning for Non-IID Data.
- A Primal-Dual SGD Algorithm for Distributed Nonconvex Optimization
- FedSplit: An algorithmic framework for fast federated optimization
- Distributed Stochastic Non-Convex Optimization: Momentum-Based Variance Reduction
- On the Outsized Importance of Learning Rates in Local Update Methods
- Federated Learning with Compression: Unified Analysis and Sharp Guarantees
- From Local SGD to Local Fixed-Point Methods for Federated Learning
- Federated Residual Learning
- Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization (ICML 2020)
- Client Selection for Federated Learning with Heterogeneous Resources in Mobile Edge (FedCS)
- Hybrid-FL for Wireless Networks: Cooperative Learning Mechanism Using Non-IID Data
- LASG: Lazily Aggregated Stochastic Gradients for Communication-Efficient Distributed Learning
- Uncertainty Principle for Communication Compression in Distributed and Federated Learning and the Search for an Optimal Compressor
- Dynamic Federated Learning
- Distributed Optimization over Block-Cyclic Data
- Federated Composite Optimization (ICML 2021)
- Distributed Non-Convex Optimization with Sublinear Speedup under Intermittent Client Availability
- Federated Learning of a Mixture of Global and Local Models
- Faster On-Device Training Using New Federated Momentum Algorithm
- FedDANE: A Federated Newton-Type Method
- Distributed Fixed Point Methods with Compressed Iterates
- Primal-dual methods for large-scale and distributed convex optimization and data analytics
- Parallel Restarted SPIDER - Communication Efficient Distributed Nonconvex Optimization with Optimal Computation Complexity
- Representation of Federated Learning via Worst-Case Robust Optimization Theory
- On the Convergence of Local Descent Methods in Federated Learning
- SCAFFOLD: Stochastic Controlled Averaging for Federated Learning
- Accelerating Federated Learning via Momentum Gradient Descent
- Robust Federated Learning in a Heterogeneous Environment
- Scalable and Differentially Private Distributed Aggregation in the Shuffled Model
- Differentially Private Learning with Adaptive Clipping
- Semi-Cyclic Stochastic Gradient Descent
- Federated Optimization in Heterogeneous Networks
- Partitioned Variational Inference: A unified framework encompassing federated and continual learning
- Learning Rate Adaptation for Federated and Differentially Private Learning
- Communication-Efficient Robust Federated Learning Over Heterogeneous Datasets
- Don’t Use Large Mini-Batches, Use Local SGD
- Overlap Local-SGD: An Algorithmic Approach to Hide Communication Delays in Distributed SGD
- Local SGD With a Communication Overhead Depending Only on the Number of Workers
- Federated Accelerated Stochastic Gradient Descent
- Tighter Theory for Local SGD on Identical and Heterogeneous Data
- STL-SGD: Speeding Up Local SGD with Stagewise Communication Period
- Cooperative SGD: A unified Framework for the Design and Analysis of Communication-Efficient SGD Algorithms
- Understanding Unintended Memorization in Federated Learning
- eSGD: Communication Efficient Distributed Deep Learning on the Edge (USENIX 2018 Workshop)
- CMFL: Mitigating Communication Overhead for Federated Learning
- Expanding the Reach of Federated Learning by Reducing Client Resource Requirements
- Federated Learning: Strategies for Improving Communication Efficiency (NIPS2016 Workshop) [Google]
- Natural Compression for Distributed Deep Learning
- FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
- ATOMO: Communication-efficient Learning via Atomic Sparsification(NIPS 2018)
- vqSGD: Vector Quantized Stochastic Gradient Descent
- QSGD: Communication-efficient SGD via gradient quantization and encoding (NIPS 2017)
- Federated Optimization: Distributed Machine Learning for On-Device Intelligence [Google]
- Distributed Mean Estimation with Limited Communication (ICML 2017)
- Randomized Distributed Mean Estimation: Accuracy vs Communication
- Error Feedback Fixes SignSGD and other Gradient Compression Schemes (ICML 2019)
- ZipML: Training Linear Models with End-to-End Low Precision, and a Little Bit of Deep Learning (ICML 2017)
- Federated Meta-Learning with Fast Convergence and Efficient Communication
- Federated Meta-Learning for Recommendation
- Adaptive Gradient-Based Meta-Learning Methods
- MOCHA: Federated Multi-Task Learning (NIPS 2017)
- Variational Federated Multi-Task Learning
- Federated Kernelized Multi-Task Learning
- Clustered Federated Learning: Model-Agnostic Distributed Multi-Task Optimization under Privacy Constraints (NIPS 2019 Workshop)
- Local Stochastic Approximation: A Unified View of Federated Learning and Distributed Multi-Task Reinforcement Learning Algorithms
- Client-Edge-Cloud Hierarchical Federated Learning
- Knowledge Federation: A Unified and Hierarchical Privacy-Preserving AI Framework (FL startup: Tongdun, Hangzhou, China)
- HFEL: Joint Edge Association and Resource Allocation for Cost-Efficient Hierarchical Federated Edge Learning
- Hierarchical Federated Learning Across Heterogeneous Cellular Networks
- Enhancing Privacy via Hierarchical Federated Learning
- Federated learning with hierarchical clustering of local updates to improve training on non-IID data.
- Federated Hierarchical Hybrid Networks for Clickbait Detection
- Secure Federated Transfer Learning. IEEE Intelligent Systems 2018.
- Secure and Efficient Federated Transfer Learning
- Wireless Federated Distillation for Distributed Edge Learning with Heterogeneous Data
- Proxy Experience Replay: Federated Distillation for Distributed Reinforcement Learning.
- Cooperative Learning via Federated Distillation over Fading Channels
- (*) Cronus: Robust and Heterogeneous Collaborative Learning with Black-Box Knowledge Transfer
- Federated Reinforcement Distillation with Proxy Experience Memory
- Federated Continual Learning with Weighted Inter-client Transfer (ICML 2021)
- Communication Compression for Decentralized Training (NIPS 2018)
- DeepSqueeze: Decentralization Meets Error-Compensated Compression
- Central Server Free Federated Learning over Single-sided Trust Social Networks
- Can Decentralized Algorithms Outperform Centralized Algorithms? A Case Study for Decentralized Parallel Stochastic Gradient Descent
- Multi-consensus Decentralized Accelerated Gradient Descent
- Decentralized Bayesian Learning over Graphs.
- BrainTorrent: A Peer-to-Peer Environment for Decentralized Federated Learning
- Biscotti: A Ledger for Private and Secure Peer-to-Peer Machine Learning
- Matcha: Speeding Up Decentralized SGD via Matching Decomposition Sampling
- Incentive Mechanism for Reliable Federated Learning: A Joint Optimization Approach to Combining Reputation and Contract Theory
- Towards Fair Federated Learning (KDD 2021)
- Federated Adversarial Debiasing for Fair and Transferable Representations (KDD 2021)
- Motivating Workers in Federated Learning: A Stackelberg Game Perspective
- Incentive Design for Efficient Federated Learning in Mobile Networks: A Contract Theory Approach
- Fair Resource Allocation in Federated Learning
- FMore: An Incentive Scheme of Multi-dimensional Auction for Federated Learning in MEC (ICDCS 2020)
- Toward an Automated Auction Framework for Wireless Federated Learning Services Market
- Federated Learning for Edge Networks: Resource Optimization and Incentive Mechanism
- A Learning-based Incentive Mechanism for Federated Learning
- A Crowdsourcing Framework for On-Device Federated Learning
- Rewarding High-Quality Data via Influence Functions
- Joint Service Pricing and Cooperative Relay Communication for Federated Learning
- Measure Contribution of Participants in Federated Learning
- DeepChain: Auditable and Privacy-Preserving Deep Learning with Blockchain-based Incentive
- A Quasi-Newton Method Based Vertical Federated Learning Framework for Logistic Regression (NIPS 2019 Workshop)
- SecureBoost: A Lossless Federated Learning Framework
- Parallel Distributed Logistic Regression for Vertical Federated Learning without Third-Party Coordinator
- AsySQN: Faster Vertical Federated Learning Algorithms with Better Computation Resource Utilization (KDD 2021)
- Large-scale Secure XGB for Vertical Federated Learning (CIKM 2021)
- Private federated learning on vertically partitioned data via entity resolution and additively homomorphic encryption
- Entity Resolution and Federated Learning get a Federated Resolution.
- Multi-Participant Multi-Class Vertical Federated Learning
- A Communication-Efficient Collaborative Learning Framework for Distributed Features
- Asymmetrical Vertical Federated Learning
- VAFL: a Method of Vertical Asynchronous Federated Learning (ICML workshop on FL, 2020)
- SplitFed: When Federated Learning Meets Split Learning
- Privacy Enhanced Multimodal Neural Representations for Emotion Recognition
- PrivyNet: A Flexible Framework for Privacy-Preserving Deep Neural Network Training
- One Pixel Image and RF Signal Based Split Learning for mmWave Received Power Prediction
- Stochastic Distributed Optimization for Machine Learning from Decentralized Features
- Mix2FLD: Downlink Federated Learning After Uplink Federated Distillation With Two-Way Mixup
- Wireless Communications for Collaborative Federated Learning in the Internet of Things
- Democratizing the Edge: A Pervasive Edge Computing Framework
- UVeQFed: Universal Vector Quantization for Federated Learning
- Federated Deep Learning Framework For Hybrid Beamforming in mm-Wave Massive MIMO
- Efficient Federated Learning over Multiple Access Channel with Differential Privacy Constraints
- A Secure Federated Learning Framework for 5G Networks
- Federated Learning and Wireless Communications
- Lightwave Power Transfer for Federated Learning-based Wireless Networks
- Towards Ubiquitous AI in 6G with Federated Learning
- Optimizing Over-the-Air Computation in IRS-Aided C-RAN Systems
- Network-Aware Optimization of Distributed Learning for Fog Computing
- On the Design of Communication Efficient Federated Learning over Wireless Networks
- Federated Machine Learning for Intelligent IoT via Reconfigurable Intelligent Surface
- Client Selection and Bandwidth Allocation in Wireless Federated Learning Networks: A Long-Term Perspective
- Resource Management for Blockchain-enabled Federated Learning: A Deep Reinforcement Learning Approach
- A Blockchain-based Decentralized Federated Learning Framework with Committee Consensus
- Scheduling for Cellular Federated Edge Learning with Importance and Channel
- Differentially Private Federated Learning for Resource-Constrained Internet of Things
- Federated Learning for Task and Resource Allocation in Wireless High Altitude Balloon Networks
- Gradient Estimation for Federated Learning over Massive MIMO Communication Systems
- Adaptive Federated Learning With Gradient Compression in Uplink NOMA
- Performance Analysis and Optimization in Privacy-Preserving Federated Learning
- Energy-Efficient Federated Edge Learning with Joint Communication and Computation Design
- Federated Over-the-Air Subspace Learning and Tracking from Incomplete Data
- Decentralized Federated Learning via SGD over Wireless D2D Networks
- Federated Learning in the Sky: Joint Power Allocation and Scheduling with UAV Swarms
- Wireless Federated Learning with Local Differential Privacy
- Federated Learning under Channel Uncertainty: Joint Client Scheduling and Resource Allocation
- Learning from Peers at the Wireless Edge
- Convergence of Update Aware Device Scheduling for Federated Learning at the Wireless Edge
- Communication Efficient Federated Learning over Multiple Access Channels
- Convergence Time Optimization for Federated Learning over Wireless Networks
- One-Bit Over-the-Air Aggregation for Communication-Efficient Federated Edge Learning: Design and Convergence Analysis
- Federated Learning with Cooperating Devices: A Consensus Approach for Massive IoT Networks (IEEE Internet of Things Journal, 2020)
- Asynchronous Federated Learning with Differential Privacy for Edge Intelligence
- Federated learning with multichannel ALOHA
- Federated Learning with Autotuned Communication-Efficient Secure Aggregation
- Bandwidth Slicing to Boost Federated Learning in Edge Computing
- Energy Efficient Federated Learning Over Wireless Communication Networks
- Device Scheduling with Fast Convergence for Wireless Federated Learning
- Energy-Aware Analog Aggregation for Federated Learning with Redundant Data
- Age-Based Scheduling Policy for Federated Learning in Mobile Edge Networks
- Federated Learning over Wireless Networks: Convergence Analysis and Resource Allocation
- Federated Learning over Wireless Networks: Optimization Model Design and Analysis
- Resource Allocation in Mobility-Aware Federated Learning Networks: A Deep Reinforcement Learning Approach
- Reliable Federated Learning for Mobile Networks
- Cell-Free Massive MIMO for Wireless Federated Learning
- A Joint Learning and Communications Framework for Federated Learning over Wireless Networks
- On Safeguarding Privacy and Security in the Framework of Federated Learning
- Scheduling Policies for Federated Learning in Wireless Networks
- Federated Learning with Additional Mechanisms on Clients to Reduce Communication Costs
- Energy-Efficient Radio Resource Allocation for Federated Edge Learning
- Mobile Edge Computing, Blockchain and Reputation-based Crowdsourcing IoT Federated Learning: A Secure, Decentralized and Privacy-preserving System
- Active Learning Solution on Distributed Edge Computing
- Fast Uplink Grant for NOMA: a Federated Learning based Approach
- Machine Learning at the Wireless Edge: Distributed Stochastic Gradient Descent Over-the-Air
- Broadband Analog Aggregation for Low-Latency Federated Edge Learning
- Federated Echo State Learning for Minimizing Breaks in Presence in Wireless Virtual Reality Networks
- Joint Service Pricing and Cooperative Relay Communication for Federated Learning
- In-Edge AI: Intelligentizing Mobile Edge Computing, Caching and Communication by Federated Learning
- Asynchronous Task Allocation for Federated and Parallelized Mobile Edge Learning
- Efficient Training Management for Mobile Crowd-Machine Learning: A Deep Reinforcement Learning Approach (clients are asked to upload some data to the server)
- Low-latency Broadband Analog Aggregation For Federated Edge Learning
- Federated Learning over Wireless Fading Channels
- Federated Learning via Over-the-Air Computation
- FedNAS: Federated Deep Learning via Neural Architecture Search (CVPR 2020)
- Real-time Federated Evolutionary Neural Architecture Search
- Federated Neural Architecture Search
- Differentially-private Federated Neural Architecture Search
- SGNN: A Graph Neural Network Based Federated Learning Approach by Hiding Structure (Big Data)
- GraphFederator: Federated Visual Analysis for Multi-party Graphs.
- FedE: Embedding Knowledge Graphs in Federated Setting
- ASFGNN: Automated Separated-Federated Graph Neural Network
- GraphFL: A Federated Learning Framework for Semi-Supervised Node Classification on Graphs
- Peer-to-peer Federated Learning on Graphs
- Towards Federated Graph Learning for Collaborative Financial Crimes Detection
- Secure Deep Graph Generation with Link Differential Privacy (IJCAI 2021)
- Locally Private Graph Neural Networks (CCS 2021)
- When Differential Privacy Meets Graph Neural Networks
- Releasing Graph Neural Networks with Differential Privacy
- Vertically Federated Graph Neural Network for Privacy-Preserving Node Classification
- FedGNN: Federated Graph Neural Network for Privacy-Preserving Recommendation (ICML 2021)
- Decentralized Federated Graph Neural Networks (IJCAI 2021)
- Federated Graph Classification over Non-IID Graphs (NeurIPS 2021)
- SpreadGNN: Serverless Multi-task Federated Learning for Graph Neural Networks (ICML 2021)
- FedGraphNN: A Federated Learning System and Benchmark for Graph Neural Networks (ICLR 2021)
- Cross-Node Federated Graph Neural Network for Spatio-Temporal Data Modeling (KDD 2021)
- Towards Federated Learning at Scale: System Design [Must Read]
- Scaling Distributed Machine Learning with System and Algorithm Co-design
- Demonstration of Federated Learning in a Resource-Constrained Networked Environment
- Applied Federated Learning: Improving Google Keyboard Query Suggestions
- Federated Learning and Differential Privacy: Software tools analysis, the Sherpa.ai FL framework and methodological guidelines for preserving data privacy
- FedML: A Research Library and Benchmark for Federated Machine Learning
- FLeet: Online Federated Learning via Staleness Awareness and Performance Prediction.
- Heterogeneity-Aware Federated Learning
- Decentralised Learning from Independent Multi-Domain Labels for Person Re-Identification
- [startup] Industrial Federated Learning -- Requirements and System Design
- (*) TiFL: A Tier-based Federated Learning System (HPDC 2020)
- Adaptive Gradient Sparsification for Efficient Federated Learning: An Online Learning Approach (ICDCS 2020)
- Quantifying the Performance of Federated Transfer Learning
- ELFISH: Resource-Aware Federated Learning on Heterogeneous Edge Devices
- Privacy is What We Care About: Experimental Investigation of Federated Learning on Edge Devices
- Substra: a framework for privacy-preserving, traceable and collaborative Machine Learning
- BAFFLE : Blockchain Based Aggregator Free Federated Learning
- Functional Federated Learning in Erlang (ffl-erl)
- HierTrain: Fast Hierarchical Edge AI Learning With Hybrid Parallelism in Mobile-Edge-Cloud Computing
- Orpheus: Efficient Distributed Machine Learning via System and Algorithm Co-design
- Scalable Distributed DNN Training using TensorFlow and CUDA-Aware MPI: Characterization, Designs, and Performance Evaluation
- Optimized Broadcast for Deep Learning Workloads on Dense-GPU InfiniBand Clusters: MPI or NCCL?
- Optimizing Network Performance for Distributed DNN Training on GPU Clusters: ImageNet/AlexNet Training in 1.5 Minutes
- A Tutorial for Encrypted Deep Learning
  - Use Homomorphic Encryption (HE)
- Private Deep Learning with MPC
  - A Simple Tutorial from Scratch
  - Use Multiparty Computation (MPC)
- Private Image Analysis with MPC
  - Training CNNs on Sensitive Data
  - Use SPDZ as MPC protocol
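The MPC tutorials above all build on additive secret sharing. As a minimal illustrative sketch (not any specific tutorial's code), each value is split into random summands modulo `Q`, and addition of two secrets happens share-wise without revealing either input; the modulus and party count here are demo choices.

```python
# Additive secret sharing over Z_Q, the primitive underlying SPDZ-style MPC.
import random

Q = 2**31 - 1  # demo modulus; real protocols use much larger rings/fields

def share(x, n_parties=3):
    """Split x into n_parties random shares that sum to x mod Q."""
    shares = [random.randrange(Q) for _ in range(n_parties - 1)]
    shares.append((x - sum(shares)) % Q)
    return shares

def reveal(shares):
    """Recombine shares by summing them mod Q."""
    return sum(shares) % Q

def add(a_shares, b_shares):
    """Secure addition: each party adds its own shares locally."""
    return [(a + b) % Q for a, b in zip(a_shares, b_shares)]

a, b = share(20), share(22)
assert reveal(add(a, b)) == 42
```

No party holding a single share learns anything about the secret; only the sum of all shares is meaningful.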
- Simple Introduction to Shamir's Secret Sharing and Lagrange Interpolation
- Secret Sharing, Part 1: Shamir's Secret Sharing & Packed Variant
- Secret Sharing, Part 2: Improve efficiency
- Secret Sharing, Part 3
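The scheme covered by the secret-sharing posts above can be condensed into a short sketch: Shamir's (t, n) sharing hides the secret as the constant term of a random degree-(t-1) polynomial over a prime field, and Lagrange interpolation at x = 0 recovers it from any t shares. The prime below is a demo choice.

```python
# Shamir (t, n) secret sharing with Lagrange interpolation over GF(P).
import random

P = 2**61 - 1  # a Mersenne prime, large enough for this demo

def share(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse, since P is prime
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789
assert reconstruct(shares[1:4]) == 123456789
```

Fewer than t shares reveal nothing: every subset of size t-1 is consistent with every possible secret.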
- Learning Differentially Private Recurrent Language Models
- Federated Learning with Bayesian Differential Privacy (NIPS 2019 Workshop)
- Private Federated Learning with Domain Adaptation (NIPS 2019 Workshop)
- cpSGD: Communication-efficient and differentially-private distributed SGD
- Practical Secure Aggregation for Federated Learning on User-Held Data (NIPS 2016 Workshop)
- Differentially Private Federated Learning: A Client Level Perspective (NIPS 2017 Workshop)
- Exploiting Unintended Feature Leakage in Collaborative Learning (S&P 2019)
- A Hybrid Approach to Privacy-Preserving Federated Learning. (AISec 2019)
- A generic framework for privacy preserving deep learning. (PPML 2018)
- Federated Generative Privacy (IJCAI 2019 FL Workshop)
- Enhancing the Privacy of Federated Learning with Sketching.
- https://aisec.cc/
- Federated f-Differential Privacy (AISTATS 2021)
- Shuffled Model of Differential Privacy in Federated Learning (AISTATS 2021)
- Differentially Private Federated Knowledge Graphs Embedding (CIKM 2021)
- Anonymizing Data for Privacy-Preserving Federated Learning.
- Practical and Bilateral Privacy-preserving Federated Learning.
- Decentralized Policy-Based Private Analytics.
- FedSel: Federated SGD under Local Differential Privacy with Top-k Dimension Selection. (DASFAA 2020)
- Learn to Forget: User-Level Memorization Elimination in Federated Learning.
- LDP-Fed: Federated Learning with Local Differential Privacy (EdgeSys 2020)
- PrivFL: Practical Privacy-preserving Federated Regressions on High-dimensional Data over Mobile Networks.
- Local Differential Privacy based Federated Learning for Internet of Things.
- Differentially Private AirComp Federated Learning with Power Adaptation Harnessing Receiver Noise.
- Decentralized Differentially Private Segmentation with PATE (MICCAI 2020, under review)
- Privacy Preserving Distributed Machine Learning with Federated Learning.
- Exploring Private Federated Learning with Laplacian Smoothing.
- Information-Theoretic Bounds on the Generalization Error and Privacy Leakage in Federated Learning.
- Efficient Privacy Preserving Edge Computing Framework for Image Classification.
- A Distributed Trust Framework for Privacy-Preserving Machine Learning.
- Secure Byzantine-Robust Machine Learning.
- ARIANN: Low-Interaction Privacy-Preserving Deep Learning via Function Secret Sharing.
- Privacy For Free: Wireless Federated Learning Via Uncoded Transmission With Adaptive Power Control.
- (*) Distributed Differentially Private Averaging with Improved Utility and Robustness to Malicious Parties.
- GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators.
- Federated Learning with Differential Privacy:Algorithms and Performance Analysis
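The differentially private FL papers listed above share a common recipe (e.g. the client-level perspective): clip each client's update to a fixed L2 norm, average, and add Gaussian noise scaled to that sensitivity. The sketch below is illustrative; `clip_norm` and `noise_multiplier` are made-up demo values, not settings from any paper.

```python
# Client-level DP aggregation: clip each update, average, add Gaussian noise.
import numpy as np

def dp_aggregate(client_updates, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    rng = rng or np.random.default_rng(0)
    clipped = []
    for u in client_updates:
        norm = np.linalg.norm(u)
        # scale down any update whose L2 norm exceeds clip_norm
        clipped.append(u * min(1.0, clip_norm / max(norm, 1e-12)))
    mean = np.mean(clipped, axis=0)
    # noise std scales with the per-client sensitivity clip_norm / n
    sigma = noise_multiplier * clip_norm / len(client_updates)
    return mean + rng.normal(0.0, sigma, size=mean.shape)

updates = [np.ones(4) * 5.0, np.ones(4) * -3.0]
noisy_mean = dp_aggregate(updates)
```

Clipping bounds any single client's influence on the aggregate, which is what makes the Gaussian-mechanism privacy accounting possible.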
- Federated Learning Approach for Mobile Packet Classification
- Federated Learning for Ranking Browser History Suggestions (NIPS 2019 Workshop)
- HHHFL: Hierarchical Heterogeneous Horizontal Federated Learning for Electroencephalography (NIPS 2019 Workshop)
- Learn Electronic Health Records by Fully Decentralized Federated Learning (NIPS 2019 Workshop)
- FLOP: Federated Learning on Medical Datasets using Partial Networks (KDD 2021)
- Patient Clustering Improves Efficiency of Federated Machine Learning to predict mortality and hospital stay time using distributed Electronic Medical Records [News]
- Federated learning of predictive models from federated Electronic Health Records.
- FedHealth: A Federated Transfer Learning Framework for Wearable Healthcare
- Multi-Institutional Deep Learning Modeling Without Sharing Patient Data: A Feasibility Study on Brain Tumor Segmentation
- NVIDIA Clara Federated Learning to Deliver AI to Hospitals While Protecting Patient Data
- What is Federated Learning
- Split learning for health: Distributed deep learning without sharing raw patient data
- Two-stage Federated Phenotyping and Patient Representation Learning (ACL 2019)
- Federated Tensor Factorization for Computational Phenotyping (SIGKDD 2017)
- FedHealth: A Federated Transfer Learning Framework for Wearable Healthcare (IJCAI 2019 Workshop)
- Multi-Institutional Deep Learning Modeling Without Sharing Patient Data: A Feasibility Study on Brain Tumor Segmentation (MICCAI 2018 Workshop)
- Federated Patient Hashing (AAAI 2020)
- Federated Learning for Mobile Keyboard Prediction
- Applied Federated Learning: Improving Google Keyboard Query Suggestions
- Federated Learning Of Out-Of-Vocabulary Words
- Federated Learning for Emoji Prediction in a Mobile Keyboard
Snips
- Performance Optimization for Federated Person Re-identification via Benchmark Analysis (ACMMM 2020) [Github]
- Real-World Image Datasets for Federated Learning
- FedVision: An Online Visual Object Detection Platform Powered by Federated Learning (IAAI 2020)
- Federated Learning for Vision-and-Language Grounding Problems (AAAI20)
- Federated Collaborative Filtering for Privacy-Preserving Personalized Recommendation System
- Federated Meta-Learning with Fast Convergence and Efficient Communication
- Secure Federated Matrix Factorization
- DiFacto: Distributed Factorization Machines
- Turbofan POC: Predictive Maintenance of Turbofan Engines using Federated Learning
- Turbofan Tycoon Simulation by Cloudera/FastForwardLabs
- Firefox Search Bar
WeBank open-sourced the FATE framework.
Qiang Yang, Tianjian Chen, Yang Liu, Yongxin Tong.
- 《Federated machine learning: Concept and applications》
- 《Secureboost: A lossless federated learning framework》
ByteDance open-sourced the FedLearner framework.
Jiankai Sun, Weihao Gao, Hongyi Zhang, Junyuan Xie.《Label Leakage and Protection in Two-party Split learning》
Yi Li, Wei Xu.《PrivPy: General and Scalable Privacy-Preserving Data Mining》
Hongyu Li, Dan Meng, Hong Wang, Xiaolin Li.
- 《Knowledge Federation: A Unified and Hierarchical Privacy-Preserving AI Framework》
- 《FedMONN: Meta Operation Neural Network for Secure Federated Aggregation》
Baidu MesaTEE secure computing platform
Tongxin Li, Yu Ding, Yulong Zhang, Tao Wei.《gbdt-rs: Fast and Trustworthy Gradient Boosting Decision Tree》
JUZIX (矩阵元) open-source Rosetta privacy-preserving framework
Baidu PaddlePaddle open-source federated learning framework
Ant Blockchain Technology's AntChain Morse (摩斯) secure computation platform
Alibaba Cloud DataTrust privacy-enhancing computation platform
《FedVision: An Online Visual Object Detection Platform Powered by Federated Learning》
《BatchCrypt: Efficient Homomorphic Encryption for Cross-Silo Federated Learning》
《Abnormal Client Behavior Detection in Federated Learning》
《Federated machine learning: Concept and applications》
《Failure Prediction in Production Line Based on Federated Learning: An Empirical Study》
Google proposed Federated Learning. H. Brendan McMahan, Daniel Ramage, Jakub Konečný, Kallista A. Bonawitz, Hubert Eichner.
《Communication-efficient learning of deep networks from decentralized data》
《Federated Learning: Strategies for Improving Communication Efficiency》
《Advances and Open Problems in Federated Learning》
《Towards Federated Learning at Scale: System Design》
《Differentially Private Learning with Adaptive Clipping》
...... (for more federated learning papers, search Google Scholar)
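The aggregation rule at the heart of "Communication-efficient learning of deep networks from decentralized data" (FedAvg) is compact enough to sketch: the server averages client model weights, weighted by the number of local training examples each client holds.

```python
# FedAvg server-side aggregation: example-count-weighted average of weights.
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Weighted average of client weight arrays by local dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Client with 3x the data pulls the average 3x as hard:
w = fed_avg([np.array([1.0, 2.0]), np.array([3.0, 4.0])], [1, 3])
# 0.25 * [1, 2] + 0.75 * [3, 4] = [2.5, 3.5]
```

In the full algorithm this averaging step alternates with several epochs of local SGD on each selected client.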
Antonio Marcedone.
《Practical Secure Aggregation for Federated Learning on User-Held Data》
《Practical Secure Aggregation for Privacy-Preserving Machine Learning》
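The pairwise-masking idea behind the two secure aggregation papers above can be sketched in a few lines: each pair of clients agrees on a random mask that one adds and the other subtracts, so all masks cancel in the sum and the server learns only the aggregate. This toy version uses scalar updates and omits the real protocol's key agreement and dropout recovery.

```python
# Pairwise additive masking: masks cancel in the sum across clients.
import random

Q = 2**32  # working modulus for the masked arithmetic

def masked_updates(updates):
    """Return per-client masked values whose sum equals sum(updates) mod Q."""
    n = len(updates)
    masked = [u % Q for u in updates]
    for i in range(n):
        for j in range(i + 1, n):
            m = random.randrange(Q)  # stands in for a shared pairwise seed
            masked[i] = (masked[i] + m) % Q
            masked[j] = (masked[j] - m) % Q
    return masked

updates = [10, 20, 30]
assert sum(masked_updates(updates)) % Q == sum(updates) % Q
```

Each individual masked value is uniformly random to the server; only the modular sum over all clients is meaningful.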
Eugene Bagdasaryan, Andreas Veit, Yiqing Hua, Deborah Estrin, Vitaly Shmatikov.
《How To Backdoor Federated Learning》
《Differential privacy has disparate impact on model accuracy》
Ziteng Sun.
《Can you really backdoor federated learning?》
A unified approach to federated learning, analytics, and evaluation. Federate any workload, any ML framework, and any programming language.