xiyiyia's Stars
tailscale/tailscale
The easiest, most secure way to use WireGuard and 2FA.
dair-ai/ml-visuals
🎨 ML Visuals contains figures and templates which you can reuse and customize to improve your scientific writing.
cmhungsteve/Awesome-Transformer-Attention
A comprehensive paper list on Vision Transformer/Attention, including papers, code, and related websites
godweiyang/NN-CUDA-Example
Several simple examples of calling custom CUDA operators from popular neural network toolkits.
holyshell/Books
Some special ebooks
TalwalkarLab/leaf
Leaf: A Benchmark for Federated Settings
lena-voita/the-story-of-heads
This is a repository with the code for the ACL 2019 paper "Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned" and the ACL 2021 paper "Analyzing Source and Target Contributions to NMT Predictions".
pmichel31415/are-16-heads-really-better-than-1
Code for the paper "Are Sixteen Heads Really Better than One?"
jeremy313/non-iid-dataset-for-personalized-federated-learning
Official implementation of "FL-WBC: Enhancing Robustness against Model Poisoning Attacks in Federated Learning from a Client Perspective".
asherliu/researchHOWTO
Slides about how to do research
ExplosiveYan/MultiHop-Reasoning
ChenfhCS/MoE
ExplosiveYan/BayesKGR
Daniei1/hundsun-study