distilling-the-knowledge
There are 6 repositories under the distilling-the-knowledge topic.
Hanlard/Electra_CRF_NER
We tackle a company-name recognition task that starts with small-scale, low-quality training data, then apply techniques to improve training speed and prediction performance with minimal manual effort. The methods include lightweight pre-trained models such as ALBERT-small and ELECTRA-small adapted to a financial corpus, knowledge distillation, and multi-stage learning. As a result, recall on the company-name recognition task improves from 0.73 to 0.92, and the model runs 4 times faster than a BERT-BiLSTM-CRF baseline.
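For reference, a minimal sketch of how knowledge distillation might be applied at the token level for a tagger like this one (the classification form appears under huyz1117/Distilling-Neural-Network below). The temperature, shapes, and function name are illustrative assumptions, not this repo's actual code:

```python
import tensorflow as tf

def token_distillation_loss(teacher_logits, student_logits, mask, T=2.0):
    """Per-token soft-target loss for distilling a sequence tagger.

    The student matches the teacher's temperature-softened tag
    distribution at every non-padding position. Shapes are
    [batch, seq_len, num_tags]; `mask` is [batch, seq_len] with 1 for
    real tokens. T=2.0 is an illustrative choice, not the repo's.
    """
    soft_teacher = tf.nn.softmax(teacher_logits / T)
    log_student = tf.nn.log_softmax(student_logits / T)
    # Cross-entropy against the teacher's softened distribution.
    per_token = -tf.reduce_sum(soft_teacher * log_student, axis=-1)
    mask = tf.cast(mask, per_token.dtype)
    # Average over real tokens only; T**2 keeps the gradient scale
    # comparable across temperatures, as in Hinton et al. (2015).
    return tf.reduce_sum(per_token * mask) / tf.reduce_sum(mask) * T**2
```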
pedroprates/mobile-face-net
A face recognition system with low computational cost
Dryjelly/Face_Ear_Landmark_Detection
tf-keras code for a Face/Ear Landmark Detection system (with multi-task learning).
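As a rough illustration of the multi-task setup, here is a minimal Keras sketch with one shared backbone and separate landmark heads. The layer sizes and landmark counts (68 face, 55 ear) are assumptions, not the repo's architecture:

```python
import tensorflow as tf
from tensorflow import keras

# Shared backbone: both tasks see the same convolutional features.
inputs = keras.Input(shape=(128, 128, 3))
x = keras.layers.Conv2D(32, 3, strides=2, activation="relu")(inputs)
x = keras.layers.Conv2D(64, 3, strides=2, activation="relu")(x)
x = keras.layers.GlobalAveragePooling2D()(x)

# Task-specific heads regress (x, y) coordinates for each landmark.
face = keras.layers.Dense(68 * 2, name="face_landmarks")(x)
ear = keras.layers.Dense(55 * 2, name="ear_landmarks")(x)

model = keras.Model(inputs, [face, ear])
model.compile(
    optimizer="adam",
    loss={"face_landmarks": "mse", "ear_landmarks": "mse"},
    # Equal weighting is an illustrative default; tuning these weights
    # is the usual knob for balancing the two tasks.
    loss_weights={"face_landmarks": 1.0, "ear_landmarks": 1.0},
)
```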
daspartho/DistillClassifier
Easily generate synthetic data for classification tasks using LLMs
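A hedged sketch of the general idea, generating labelled examples from a chat-completion API. The client, model name, prompt, and label set are all illustrative assumptions rather than DistillClassifier's actual interface:

```python
from openai import OpenAI  # any chat-completion client works; this is one option

client = OpenAI()  # reads OPENAI_API_KEY from the environment

LABELS = ["positive", "negative"]  # hypothetical label set

def generate_examples(label: str, n: int = 5) -> list[str]:
    """Ask the model for n short labelled examples, one per line."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption; substitute any chat model
        messages=[{
            "role": "user",
            "content": f"Write {n} short movie reviews with {label} "
                       f"sentiment, one per line, no numbering.",
        }],
    )
    return resp.choices[0].message.content.strip().split("\n")

# Pair each generated text with the label it was prompted for.
dataset = [(text, label) for label in LABELS for text in generate_examples(label)]
```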
huyz1117/Distilling-Neural-Network
Implementation of "Distilling the Knowledge in a Neural Network" with TensorFlow.
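The core of that paper (Hinton et al., 2015) is a temperature-softened distillation loss mixed with the usual hard-label cross-entropy. A minimal TensorFlow sketch follows; T and alpha are illustrative defaults, not necessarily the repo's settings:

```python
import tensorflow as tf

def distillation_loss(teacher_logits, student_logits, labels, T=4.0, alpha=0.9):
    """Knowledge distillation loss from Hinton et al. (2015).

    Combines a soft-target term (teacher vs. student logits, both
    softened by temperature T) with hard-label cross-entropy.
    """
    soft_teacher = tf.nn.softmax(teacher_logits / T)
    log_student = tf.nn.log_softmax(student_logits / T)
    # Cross-entropy against the teacher's softened distribution; the
    # T**2 factor keeps gradients comparable across temperatures.
    kd = -tf.reduce_mean(tf.reduce_sum(soft_teacher * log_student, axis=-1)) * T**2
    # Standard supervised term on the true integer labels.
    ce = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=labels, logits=student_logits))
    return alpha * kd + (1.0 - alpha) * ce
```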
kientiet/panda-kaggle
This repo contains all the code I used to participate in the Prostate Cancer Grade Assessment (PANDA) competition on Kaggle.