line/LINE-DistilBERT-Japanese
DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE.
License: Apache-2.0
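
A minimal usage sketch with Hugging Face `transformers`, assuming the model is published on the Hub under the ID `line-corporation/line-distilbert-base-japanese` (the hub ID is an assumption; check this repository for the exact name). The Japanese tokenizer typically depends on extra packages such as `fugashi`, `unidic-lite`, and `sentencepiece`, and may require `trust_remote_code=True`.

```python
# Sketch: load the distilled Japanese encoder and extract hidden states.
# The hub ID below is an assumption, not confirmed by this README.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "line-corporation/line-distilbert-base-japanese"  # assumed hub ID

# trust_remote_code=True allows the model's custom tokenizer code to run,
# if the published checkpoint ships one.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_ID)

# Tokenize a Japanese sentence and run a forward pass without gradients.
inputs = tokenizer("LINEの日本語DistilBERTモデルです。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Last hidden state has shape (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```

The resulting token embeddings can be pooled (e.g., mean pooling or the first token) for downstream tasks such as classification or sentence similarity.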