LightTag/sequence-labeling-with-transformers
Examples of aligning, padding, and batching sequence-labeling data (NER) for use with pre-trained transformer models
Jupyter Notebook · MIT License
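The core task the repository's notebooks address — mapping character-level NER annotations onto subword token boundaries and padding the result for batching — can be sketched in a few lines. This is an illustrative sketch, not the repository's actual code: `align_labels`, its arguments, and the `"PAD"` sentinel are assumed names; the `offsets` input mimics the `offset_mapping` that fast tokenizers return.

```python
def align_labels(offsets, annotations, pad_to, outside="O"):
    """Assign one BIO tag per subword token, then pad to a fixed length.

    offsets     -- list of (start, end) character spans, one per token,
                   in the shape of a fast tokenizer's `offset_mapping`
    annotations -- list of dicts with "start", "end", and "tag" keys,
                   giving character-level entity spans
    pad_to      -- target sequence length for batching
    """
    labels = []
    for start, end in offsets:
        tag = outside
        for ann in annotations:
            # A token belongs to an entity if its span falls inside it;
            # the first token of the entity gets B-, the rest get I-.
            if start >= ann["start"] and end <= ann["end"]:
                prefix = "B-" if start == ann["start"] else "I-"
                tag = prefix + ann["tag"]
                break
        labels.append(tag)
    # Pad with a sentinel label so the loss can ignore padded positions.
    labels += ["PAD"] * (pad_to - len(labels))
    return labels


# "Paris is nice" tokenized as ["Par", "is", "is", "nice"],
# with "Paris" (chars 0-5) annotated as LOC:
tags = align_labels(
    offsets=[(0, 3), (3, 5), (6, 8), (9, 13)],
    annotations=[{"start": 0, "end": 5, "tag": "LOC"}],
    pad_to=6,
)
# → ["B-LOC", "I-LOC", "O", "O", "PAD", "PAD"]
```

In practice the padded string labels would be mapped to integer ids (with the pad positions set to an ignore index such as -100) before being fed to a token-classification head.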
Stargazers
- alekseiancheruk (Paris)
- baleato (Amsterdam, NL)
- cedar33 (China)
- daviddwlee84 (@microsoft)
- dimidd
- dlmacedo (CIn)
- egilron (Norway)
- eridgd (Pittsburgh, PA)
- ErwinXu (Microsoft)
- fly51fly (PRIS)
- GillesJ (Belgium)
- hardikudeshi
- hml-ubt
- ineWsTut
- jeffmax (Philadelphia, PA)
- johann-petrak (@sheffieldnlp, @GateNLP, @OFAI)
- jsr-p (SODAS)
- JSv4 (@gunderson-dettmer)
- kajyuuen (LINE Corporation)
- ken-dwyer (Ontada)
- kzinmr (Tokyo, Japan)
- leslyarun
- marrrcin
- melisa-writer (Qordoba)
- nitinthewiz (AutoMedia)
- oliverguhr (Impact Labs GmbH)
- parakalan (Sprinklr)
- ruiantunes (Aveiro, Portugal)
- sakares
- sanket9918 (Vellore Institute of Technology, Vellore)
- SauceCat (China -> Singapore -> Tokyo)
- SofianChay
- thak123 (University of Zagreb)
- usuyama (Microsoft Research, Health Futures)
- vibhas-singh (Episource LLC)
- Wikidepia (Indonesia)