dbmdz/bert-base-german-cased
maikshift opened this issue · 3 comments
Hello,
I have 2 short questions:
- is it correct that the model 'distilbert-base-german-cased' (https://huggingface.co/distilbert-base-german-cased) was distilled from the model 'dbmdz/bert-base-german-cased' (https://huggingface.co/dbmdz/bert-base-german-cased)?
- is there a paper on 'dbmdz/bert-base-german-cased' and / or 'distilbert-base-german-cased' (which could also be used for citation purposes)?
Thanks in advance!
Hi @maikshift,
to answer your questions:
1.) Yes, the distilbert-base-german-cased model was distilled from the dbmdz/bert-base-german-cased model!
2.) unfortunately, there are no papers out there for these models. But there's a paper for (better and larger) German language models (GBERT and GELECTRA): https://aclanthology.org/2020.coling-main.598/
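For anyone landing here who just wants to try the models mentioned above, here is a minimal sketch (not part of the original thread) of loading the German DistilBERT with the Hugging Face transformers library; the model identifier is the one discussed in this issue, and the example sentence is an arbitrary illustration:

```python
# Minimal sketch: load distilbert-base-german-cased (distilled from
# dbmdz/bert-base-german-cased) via the transformers fill-mask pipeline.
from transformers import pipeline

# Downloads the model weights from the Hugging Face Hub on first use.
fill_mask = pipeline("fill-mask", model="distilbert-base-german-cased")

# [MASK] is the mask token of this tokenizer; the sentence is just an example.
predictions = fill_mask("Die Hauptstadt von Deutschland ist [MASK].")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

By default the pipeline returns the top 5 candidate tokens with their scores.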
Thank you very much! :)
Oh, and you can find a bit more information about the German DistilBERT in this PR: