Pinned Repositories
benjamin-mlr.github.io
camembert_finetune
CKA-Centered-Kernel-Alignment
Reproduce CKA: Similarity of Neural Network Representations Revisited
CompressedSensing
CS224_stanford_1
DataViz-ENSAE
DataViz course @ ENSAE ParisTech
fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
first-align-then-predict
first-align-then-predict-w-RANDOM-INIT
mbert-unseen-languages
benjamin-mlr's Repositories
benjamin-mlr/mbert-unseen-languages
benjamin-mlr/camembert_finetune
benjamin-mlr/first-align-then-predict
benjamin-mlr/benjamin-mlr.github.io
benjamin-mlr/fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
benjamin-mlr/first-align-then-predict-w-RANDOM-INIT
benjamin-mlr/CKA-Centered-Kernel-Alignment
Reproduce CKA: Similarity of Neural Network Representations Revisited
benjamin-mlr/CompressedSensing
benjamin-mlr/CS224_stanford_1
benjamin-mlr/DataViz-ENSAE
DataViz course @ ENSAE ParisTech
benjamin-mlr/DeepLearning
benjamin-mlr/generative_model_experimentation
benjamin-mlr/hugo-academic
📝 The website builder for Hugo. Build and deploy a beautiful website in minutes!
benjamin-mlr/Latent-Dirichlet-Allocation-Description-Implementation-Test
benjamin-mlr/lightning-language-modeling
Language Modeling Example with Transformers and PyTorch Lightning
benjamin-mlr/models
Models built with TensorFlow
benjamin-mlr/mrl-2023
benjamin-mlr/mt_norm_parse
benjamin-mlr/NER
benjamin-mlr/NeuroTaggerLex
benjamin-mlr/Optimization_for_DS
benjamin-mlr/overview
benjamin-mlr/paris-jo-data
benjamin-mlr/starter-hugo-academic
🎓 Create an academic website. Easily create a beautiful academic résumé or educational website using Hugo, GitHub, and Netlify.
benjamin-mlr/tensorflow
Computation using data flow graphs for scalable machine learning
benjamin-mlr/tools
Various utilities for data processing.
benjamin-mlr/transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
benjamin-mlr/website
benjamin-mlr/XLM
PyTorch original implementation of Cross-lingual Language Model Pretraining.