distilling-the-knowledge

There are 6 repositories under the distilling-the-knowledge topic.

  • Hanlard/Electra_CRF_NER

    We tackle a company-name recognition task with small-scale, low-quality training data, then apply techniques to improve training speed and prediction performance with minimal manual effort. The methods combine lightweight pre-trained models such as Albert-small and Electra-small trained on a financial corpus, knowledge distillation, and multi-stage learning (see the distillation sketch after this list). As a result, recall on the company-name recognition task improves from 0.73 to 0.92, and the model runs 4 times faster than a BERT-BiLSTM-CRF baseline.

    Language: Python
  • pedroprates/mobile-face-net

    A face recognition system with low computational cost

    Language: Jupyter Notebook
  • Dryjelly/Face_Ear_Landmark_Detection

    tf-keras code for a face and ear landmark detection system (with multi-task learning).

    Language: Jupyter Notebook
  • daspartho/DistillClassifier

    Easily generate synthetic data for classification tasks using LLMs

    Language: Python
  • huyz1117/Distilling-Neural-Network

    Implementation of "Distilling the Knowledge in a Neural Network" in TensorFlow

    Language: Python
  • kientiet/panda-kaggle

    This repo contains all the code I used to participate in the Prostate Cancer Grade Assessment competition on Kaggle.

    Language: Jupyter Notebook
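
All of these repositories relate to knowledge distillation in the sense of Hinton et al.'s "Distilling the Knowledge in a Neural Network": a small student model is trained to match the temperature-softened output distribution of a larger teacher. As a reference point, below is a minimal sketch of that soft-target loss; the PyTorch framework choice, the temperature and alpha values, and the toy tensor shapes are illustrative assumptions, not code from any repository above.

```python
# Minimal sketch of soft-target knowledge distillation (Hinton et al.).
# Framework (PyTorch), hyperparameters, and shapes are illustrative
# assumptions, not taken from the repositories listed above.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend the teacher's softened distribution with hard-label loss."""
    # Soften both output distributions with the same temperature.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # KL divergence between them, scaled by T^2 so its gradient magnitude
    # stays comparable to the hard-label term.
    soft_loss = F.kl_div(soft_student, soft_teacher,
                         reduction="batchmean") * temperature ** 2
    # Ordinary cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Toy usage: a batch of 8 examples over 5 classes.
student_logits = torch.randn(8, 5, requires_grad=True)
teacher_logits = torch.randn(8, 5)
labels = torch.randint(0, 5, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()  # gradients flow only into the student's logits
print(loss.item())
```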