Lguyogiro
Computational Linguist. Interested in language technology and Mesoamerican languages.
Indiana University, Bloomington
Lguyogiro's Stars
ai-forever/sage
SAGE: Spelling correction, corruption and evaluation for multiple languages
ctaguchi/killkan
Kichwa ASR dataset and code
hfst/hfst-optimized-lookup
HFST optimized-lookup standalone library and command line tool
cisnlp/simalign
Obtain Word Alignments using Pretrained Language Models (e.g., mBERT)
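The core idea behind similarity-based aligners like SimAlign can be sketched in a few lines: score every source/target token pair by cosine similarity of their embeddings, then keep mutual best matches (SimAlign's "argmax" heuristic). This toy uses made-up static vectors in place of mBERT embeddings, and the function name is illustrative, not SimAlign's API:

```python
import numpy as np

def mutual_argmax_align(src_emb, trg_emb):
    """Align token pairs that are each other's nearest neighbor
    under cosine similarity (a mutual-argmax heuristic)."""
    # Normalize rows so dot products become cosine similarities.
    s = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    t = trg_emb / np.linalg.norm(trg_emb, axis=1, keepdims=True)
    sim = s @ t.T                         # (len_src, len_trg) similarity matrix
    fwd = sim.argmax(axis=1)              # best target for each source token
    bwd = sim.argmax(axis=0)              # best source for each target token
    # Keep only mutual best matches.
    return [(i, int(fwd[i])) for i in range(len(fwd)) if bwd[fwd[i]] == i]

# Toy "embeddings" for a 3-token source and 3-token target sentence.
src = np.array([[1.0, 0.0, 0.1], [0.0, 1.0, 0.0], [0.1, 0.1, 1.0]])
trg = np.array([[0.0, 1.0, 0.1], [0.9, 0.1, 0.0], [0.0, 0.2, 1.0]])
print(mutual_argmax_align(src, trg))  # [(0, 1), (1, 0), (2, 2)]
```

The real tool replaces the toy vectors with contextual subword embeddings from a pretrained multilingual model and adds further matching strategies (itermax, optimal matching).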
bigscience-workshop/data-preparation
Code used for sourcing and cleaning the BigScience ROOTS corpus
yzhangcs/parser
🚀 State-of-the-art parsers for natural language.
kentonl/e2e-coref
End-to-end Neural Coreference Resolution
UniversalDependencies/UD_English-EWT
English Web Treebank (EWT) data in Universal Dependencies format
machamp-nlp/machamp
Repository with code for MaChAmp: https://aclanthology.org/2021.eacl-demos.22/
apertium/apertium-skr
Apertium linguistic data for Saraiki
neural-polysynthetic-language-modelling/iiksiin
Deterministically constructs a sequence of morpheme tensors from a word using Tensor Product Representation
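A Tensor Product Representation binds each filler (e.g., a character) to a positional role vector via an outer product and sums the results; with orthonormal roles, each filler can be recovered exactly by unbinding. A minimal sketch of the idea (names are illustrative, not iiksiin's API):

```python
import numpy as np

def bind(fillers, roles):
    """Tensor Product Representation: sum of outer products filler_i (x) role_i."""
    return sum(np.outer(f, r) for f, r in zip(fillers, roles))

def unbind(tpr, role):
    """With orthonormal roles, tpr @ role recovers the filler bound to that role."""
    return tpr @ role

# Orthonormal role vectors for positions 0 and 1.
roles = np.eye(2)
# Filler vectors standing in for two characters of a morpheme.
fillers = [np.array([1.0, 0.0, 0.5]), np.array([0.0, 1.0, 0.25])]

tpr = bind(fillers, roles)                 # shape (3, 2): one tensor per morpheme
recovered = unbind(tpr, roles[1])          # filler bound to position 1
print(np.allclose(recovered, fillers[1]))  # True
```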
Lightning-AI/pytorch-lightning
Pretrain, finetune, and deploy AI models on multiple GPUs and TPUs with zero code changes.
jessevig/bertviz
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
mravanelli/SincNet
SincNet is a neural architecture for efficiently processing raw audio samples.
bentrevett/pytorch-seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
dspencer12/motif-extraction
A motif extraction tool based on rmotifx and motif-x
amir-zeldes/xrenner
eXternally configurable REference and Non Named Entity Recognizer
ethch18/parsing-mbert
Code for "Parsing with Multilingual BERT, a Small Corpus, and a Small Treebank" by Ethan C. Chau, Lucy H. Lin, and Noah A. Smith
wasiahmad/cross_lingual_parsing
Official code for our CoNLL 2019 paper on Cross-lingual Dependency Parsing with Unlabeled Auxiliary Languages
Unipisa/diaparser
Direct Attentive Dependency Parser
mstrise/dep2label-bert
Dependency Parsing as Sequence Labeling with BERT
Edresson/YourTTS
YourTTS: Towards Zero-Shot Multi-Speaker TTS and Zero-Shot Voice Conversion for everyone
Sangramsingkayte/Speech-Synthesis-System
Language is the structural medium through which humans share thoughts and emotions, and this research is motivated by improving human-computer interaction. The overall aim of my PhD research program is to design concatenation-based and Hidden Markov Model (HMM)-based speech synthesis for the Marathi language, extending the technology to assistive devices for Marathi speakers. An attractive feature of the HMM approach is that voice alteration can be performed without large databases. To study synthesis techniques in detail, I have also implemented a Unit Selection system. A Marathi talking calculator built with the concatenation technique is published on the Play Store: it performs basic arithmetic operations and speaks each numeral in Marathi as a key is pressed, and the result box synthesizes the result in Marathi with the correct place value of each digit. The weakness of Unit Selection Synthesis (USS) is that it requires a large database and quality degrades at the joins. To overcome these issues, this work develops a phoneme-based system for Marathi using concatenation and HMM.
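The join-quality problem in concatenative synthesis mentioned above is commonly mitigated by crossfading adjacent units instead of butting them together. A minimal sketch (the "units" here are synthetic sine tones standing in for recorded speech units, and the function is a generic illustration, not this project's code):

```python
import numpy as np

def concatenate_units(units, overlap=100):
    """Join waveform units with a linear crossfade of `overlap` samples
    to smooth the discontinuity at each join."""
    fade_in = np.linspace(0.0, 1.0, overlap)
    fade_out = 1.0 - fade_in
    out = units[0].astype(float)
    for unit in units[1:]:
        unit = unit.astype(float)
        # Mix the tail of the signal so far with the head of the next unit.
        mixed = out[-overlap:] * fade_out + unit[:overlap] * fade_in
        out = np.concatenate([out[:-overlap], mixed, unit[overlap:]])
    return out

sr = 16000
t = np.arange(sr // 10) / sr                  # 100 ms per unit (1600 samples)
unit_a = np.sin(2 * np.pi * 220 * t)          # stand-in for one stored unit
unit_b = np.sin(2 * np.pi * 330 * t)          # stand-in for the next unit
audio = concatenate_units([unit_a, unit_b], overlap=160)
print(len(audio))  # 1600 + 1600 - 160 = 3040 samples
```

Each join shortens the output by `overlap` samples; a real system would also match pitch and energy across the join, which is where the HMM-based approach gains its flexibility.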
hellohaptik/multi-task-NLP
multi_task_NLP is a utility toolkit that lets NLP developers easily train, and run inference with, a single model for multiple tasks.
keitakurita/practical-torchtext
A set of tutorials for torchtext
chrisbangun/pytorch-seq2seq_with_attention
Paper Implementation about Attention Mechanism in Neural Network
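The attention mechanism such seq2seq implementations revolve around reduces to a few lines: score a decoder query against each encoder key, softmax the scores, and take the weighted sum of the values. A generic scaled dot-product sketch, not this repository's code:

```python
import numpy as np

def attention(query, keys, values):
    """Scaled dot-product attention: softmax(K q / sqrt(d)) weighted sum of V."""
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)    # one score per key
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()              # softmax over keys
    return weights @ values               # convex combination of value vectors

# One decoder state attending over three encoder states.
q = np.array([1.0, 0.0])
K = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
V = np.array([[10.0, 0.0], [0.0, 10.0], [5.0, 5.0]])
ctx = attention(q, K, V)
print(ctx)  # context vector weighted mostly toward the first value
```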
speechbrain/speechbrain
A PyTorch-based Speech Toolkit
alisafaya/Arabic-BERT
Arabic edition of BERT pretrained language models
VMijangos/Redes_Neuronales
Repository for the Neural Networks course at the Facultad de Ciencias, UNAM.
gucorpling/midas-loop