Check out the game at: FL-Interactive-Game
An interactive web-based demonstration that makes learning about Federated Learning fun! Train models across multiple clients, experiment with different parameters, and watch your global model improve - all without sharing raw data.
- Interactive Client Management: Control 5 independent clients training on MNIST data
- Real-time Visualization: Watch training progress with live accuracy plots
- Configurable Parameters:
  - Per-client learning rates and epoch counts
  - Communication dropout probability
  - Data distribution settings (IID vs Non-IID)
- Model Aggregation: Combine client models into a stronger global model
- Responsive Design: Works seamlessly on both desktop and mobile devices
This game helps you understand:
- How Federated Learning works in practice
- Impact of different learning rates and training durations
- Effects of communication dropouts in distributed learning
- Differences between IID and Non-IID data distributions
- Model aggregation and its effects on global performance
- Choose your data distribution (IID or Non-IID) - see the sketch below
  - IID: Balanced data, every client sees all ten digits
  - Non-IID: Chaos mode, each client gets a skewed slice of the digits
  - Remember to refresh when switching - those sneaky old models like to hang around! 🔄
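The difference between the two modes is easiest to see in code. Below is a minimal sketch of how MNIST indices could be split across the 5 clients; the `splitIndices` helper and the label-sorting trick for Non-IID are illustrative assumptions, not the game's actual implementation.

```js
import * as tf from '@tensorflow/tfjs';

// Sketch only: one way to shard MNIST across 5 clients.
// `labels` is assumed to be a plain array of digit labels (0-9), one per image.
function splitIndices(labels, numClients = 5, iid = true) {
  const indices = labels.map((_, i) => i);
  if (iid) {
    // IID: shuffle, so every client's shard contains a balanced mix of digits.
    tf.util.shuffle(indices);
  } else {
    // Non-IID: sort by label, so each client's shard covers only a few digits.
    indices.sort((a, b) => labels[a] - labels[b]);
  }
  // Deal out contiguous, equally sized shards, one per client.
  const shardSize = Math.floor(indices.length / numClients);
  return Array.from({ length: numClients }, (_, c) =>
    indices.slice(c * shardSize, (c + 1) * shardSize)
  );
}
```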
- Configure each client (local training sketch below):
  - Adjust learning rates (0.00001 to 0.001)
  - Set training epochs (5 to 15)
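Here is how those per-client knobs might map to a local update in TensorFlow.js; the config shape, helper names, and the choice of the Adam optimizer are assumptions for illustration rather than the game's exact code.

```js
import * as tf from '@tensorflow/tfjs';

// Hypothetical per-client settings, mirroring the UI sliders.
const clientConfigs = [
  { id: 0, learningRate: 0.0005, epochs: 10 },
  { id: 1, learningRate: 0.0001, epochs: 5 },
  // clients 2-4 follow the same shape
];

// One client's local update: compile with its own learning rate, then fit
// for its own number of epochs on its local data shard.
async function trainClient(model, xs, ys, { learningRate, epochs }) {
  model.compile({
    optimizer: tf.train.adam(learningRate), // optimizer choice is an assumption
    loss: 'categoricalCrossentropy',
    metrics: ['accuracy'],
  });
  // xs: [numExamples, 784] floats, ys: [numExamples, 10] one-hot labels.
  return model.fit(xs, ys, { epochs, batchSize: 64, shuffle: true });
}
```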
- Set global parameters:
  - Adjust the communication dropout probability
  - This simulates real-world network conditions where some client updates never arrive (see the sketch below)
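One simple way to picture the dropout setting, sketched below: each round, every client is skipped with the chosen probability, so its update never reaches the aggregation step. The helper name is hypothetical.

```js
// Sketch: simulating communication dropout. With probability `dropoutProb`,
// a client's update is "lost" and excluded from this round's aggregation.
function selectSurvivingClients(clientIds, dropoutProb) {
  return clientIds.filter(() => Math.random() >= dropoutProb);
}

// With dropoutProb = 0.3, each of the 5 clients is skipped roughly 30% of the time.
const surviving = selectSurvivingClients([0, 1, 2, 3, 4], 0.3);
```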
- Start training:
  - Train clients individually or all at once
  - Watch the accuracy plots evolve
  - Aggregate models to improve global performance (a FedAvg-style sketch follows below)
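Aggregation here is federated averaging in spirit: combine the clients' weights into one global model. The sketch below is an approximation in TensorFlow.js, not the game's exact code; it takes an unweighted mean of each weight tensor across the client models that survived the round.

```js
import * as tf from '@tensorflow/tfjs';

// FedAvg-style sketch: average each weight tensor across the client models,
// then load the result into the global model.
function aggregate(globalModel, clientModels) {
  const clientWeights = clientModels.map(m => m.getWeights());
  const averaged = clientWeights[0].map((_, layerIdx) =>
    tf.tidy(() => {
      const perClient = clientWeights.map(w => w[layerIdx]);
      // Unweighted mean; weighting by client dataset size is the other common choice.
      return tf.addN(perClient).div(clientModels.length);
    })
  );
  globalModel.setWeights(averaged);
}
```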
- Built with TensorFlow.js
- Uses the MNIST dataset (preprocessed for the FL setting)
- Neural Network Architecture (sketch below):
  - Input Layer: 784 neurons (28x28 images)
  - Hidden Layer: 64 neurons, ReLU activation
  - Output Layer: 10 neurons, Softmax activation
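In TensorFlow.js terms, that 784 → 64 → 10 architecture corresponds to something like the following (the function name is illustrative):

```js
import * as tf from '@tensorflow/tfjs';

// The 784 -> 64 -> 10 network described above, expressed in TensorFlow.js.
function buildModel() {
  const model = tf.sequential();
  // Hidden layer: 64 ReLU units over the flattened 28x28 input.
  model.add(tf.layers.dense({ inputShape: [784], units: 64, activation: 'relu' }));
  // Output layer: 10-way softmax over the digit classes 0-9.
  model.add(tf.layers.dense({ units: 10, activation: 'softmax' }));
  return model;
}
```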
- Beat 90% accuracy on the global model
- Achieve consistent performance across all clients
- Maintain good performance with high dropout rates
- Master both IID and Non-IID scenarios
- Model states persist between data distribution switches (refresh required)
- Some mobile devices may experience performance lag with multiple clients
- High learning rates can cause training instability