giulio98's Stars
huggingface/accelerate
🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, with automatic mixed precision (including fp8) and easy-to-configure FSDP and DeepSpeed support
rtaiello/crypten_lr_clear_model
salesforce/jaxformer
Minimal library to train LLMs on TPU in JAX with pjit().
huggingface/trl
Train transformer language models with reinforcement learning.
deepklarity/jupyter-text2code
A proof-of-concept Jupyter extension that converts English queries into relevant Python code
NielsRogge/Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
reddy-lab-code-research/XLCoST
Code and data for XLCoST: A Benchmark Dataset for Cross-lingual Code Intelligence
wlav/cppyy
Automatic, run-time Python-C++ bindings generator
karpathy/minGPT
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
rtaiello/distributed_common_modulus
Implementation of Sections 3.1, 3.2 and 3.3 of "An implementation of the Paillier crypto system with threshold decryption without a trusted dealer"
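Not the threshold protocol from the paper itself, but a minimal pure-Python sketch of the single-key Paillier cryptosystem it builds on, using the common g = n + 1 parameter choice; the primes here are toy-sized for illustration only.

```python
import math
import random

def keygen(p, q):
    """Generate a Paillier keypair from two distinct primes p, q."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)      # lambda = lcm(p-1, q-1)
    mu = pow(lam, -1, n)              # with g = n + 1, mu = lambda^-1 mod n
    return (n, n + 1), (lam, mu)      # public (n, g), private (lambda, mu)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)        # random blinding factor coprime to n
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    n2 = n * n
    L = (pow(c, lam, n2) - 1) // n    # L(u) = (u - 1) / n
    return (L * mu) % n

pub, priv = keygen(1009, 1013)
c1, c2 = encrypt(pub, 42), encrypt(pub, 58)
assert decrypt(pub, priv, c1) == 42
# additive homomorphism: the product of ciphertexts decrypts to the sum
assert decrypt(pub, priv, (c1 * c2) % (pub[0] ** 2)) == 100
```

The additive homomorphism shown in the last line is what makes Paillier useful for privacy-preserving aggregation; in the threshold variant, decryption is instead split across parties so no single holder has the full private key.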
rtaiello/fds_2020
Final project for the Master's course Fundamentals of Data Science, a.y. 2019/20
wandb/wandb
The AI developer platform. Use Weights & Biases to train and fine-tune models, and manage models from experimentation to production.
hendrycks/apps
APPS: Automated Programming Progress Standard (NeurIPS 2021)
microsoft/CodeBERT
CodeBERT
open-mpi/ompi
Open MPI main development repository
pytorch/pytorch
Tensors and Dynamic neural networks in Python with strong GPU acceleration
microsoft/DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
salesforce/CodeT5
Home of CodeT5: Open Code LLMs for Code Understanding and Generation
microsoft/CodeXGLUE
CodeXGLUE
salesforce/CodeRL
This is the official code for the paper CodeRL: Mastering Code Generation through Pretrained Models and Deep Reinforcement Learning (NeurIPS 2022).
CodedotAl/gpt-code-clippy
Full description can be found here: https://discuss.huggingface.co/t/pretrain-gpt-neo-for-open-source-github-copilot-model/7678?u=ncoop57
huggingface/course
The Hugging Face course on Transformers
huggingface/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
salesforce/CodeGen
CodeGen is a family of open-source models for program synthesis. Trained on TPU-v4. Competitive with OpenAI Codex.
janishar/mit-deep-learning-book-pdf
MIT Deep Learning Book in PDF format (complete and parts) by Ian Goodfellow, Yoshua Bengio and Aaron Courville
rtaiello/pp_image_registration
Privacy Preserving Image Registration
rtaiello/asi_2022
This repository contains the labs and the final homework for the university course Advanced Statistical Inference @ Eurecom
rtaiello/big-data-taxi-fare-amount
Big Data: building a model capable of predicting taxi fare amounts in New York.
catboost/catboost
A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for ranking, classification, regression and other machine learning tasks for Python, R, Java, C++. Supports computation on CPU and GPU.
optuna/optuna-examples
Examples for https://github.com/optuna/optuna