GerrySant
Now: PhD Candidate at @UZH_en - Before: AI Researcher at @Stanford, @la_UPC & @BSC_CNS - Master's & Bachelor's in Telecommunications Engineering at @la_UPC
Barcelona
Pinned Repositories
TTS
🐸💬 - a deep learning toolkit for Text-to-Speech, battle-tested in research and production
ChaLearn-AUTSL-Challenge
fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
how2sign.github.io
Project page for the How2Sign dataset
multimodalhugs
MultimodalHugs extends Hugging Face with a generalized framework for training, evaluating, and using multimodal AI models with minimal code changes, while remaining fully compatible with Hugging Face pipelines (see the pipeline sketch after this list).
NeMo
NeMo: a toolkit for conversational AI
OVNET_VM_instalation_on_mac_mchips
Trainer
🐸 - A general purpose model trainer, as flexible as it gets
VITS_finetuned
VITS model pretrained on VCTK and fine-tuned on a male English voice dataset with an Indian accent from the Arctic dataset (see the loading sketch after this list).
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
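For context on the multimodalhugs entry above: its description leans on compatibility with Hugging Face pipelines, which refers to the standard `pipeline` API of the stock `transformers` library. A minimal sketch of that API follows; it uses only plain `transformers` (not multimodalhugs itself), and "t5-small" is an illustrative public checkpoint, not part of this project.

```python
# Minimal sketch of the standard Hugging Face pipeline API that
# MultimodalHugs advertises compatibility with. Uses the stock
# `transformers` library only; "t5-small" is an illustrative
# public checkpoint, not a MultimodalHugs model.
from transformers import pipeline

translator = pipeline("translation_en_to_de", model="t5-small")
result = translator("Sign language translation is a multimodal task.")
print(result[0]["translation_text"])
```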
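The VITS_finetuned entry builds on the pinned 🐸 TTS toolkit. As a rough sketch of its starting point, the VCTK-pretrained multi-speaker VITS checkpoint it mentions can be loaded and run through Coqui's Python API, as below. The model name and speaker ID are stock Coqui/VCTK identifiers; the fine-tuning step on the Indian-accent Arctic data is done separately (e.g. with the pinned 🐸 Trainer) and is not shown here.

```python
# Sketch: load the multi-speaker VITS model pretrained on VCTK
# (the starting checkpoint described in VITS_finetuned) via the
# pinned 🐸 TTS toolkit and synthesize a sample. "p225" is one of
# the stock VCTK speaker IDs; fine-tuning itself is not shown.
from TTS.api import TTS

tts = TTS(model_name="tts_models/en/vctk/vits")
tts.tts_to_file(
    text="A quick synthesis check before fine-tuning.",
    speaker="p225",
    file_path="vctk_vits_sample.wav",
)
```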