pre-trained-model
There are 166 repositories under the pre-trained-model topic.
microsoft/unilm
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
thunlp/OpenPrompt
An Open-Source Framework for Prompt-Learning.
brightmart/albert_zh
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale pre-trained Chinese ALBERT models
ChineseGLUE/ChineseGLUE
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpora, and a leaderboard
keyu-tian/SparK
[ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pre-training on any convolutional network; PyTorch implementation of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
ymcui/Chinese-ELECTRA
Pre-trained Chinese ELECTRA models
d-li14/involution
[CVPR 2021] Involution: Inverting the Inherence of Convolution for Visual Recognition, a brand new neural operator
zjunlp/KnowLM
An open-source framework for knowledgeable large language models.
MrGiovanni/ModelsGenesis
[MICCAI 2019] [MEDIA 2020] Models Genesis
NVlabs/FasterViT
[ICLR 2024] Official PyTorch implementation of FasterViT: Fast Vision Transformers with Hierarchical Attention
zjunlp/KnowledgeEditingPapers
Must-read Papers on Knowledge Editing for Large Language Models.
dptech-corp/Uni-Mol
Official repository for the Uni-Mol series of methods
salesforce/PCL
PyTorch code for "Prototypical Contrastive Learning of Unsupervised Representations"
NVlabs/GCVit
[ICML 2023] Official PyTorch implementation of Global Context Vision Transformers
mims-harvard/TFC-pretraining
Self-supervised contrastive learning for time series via time-frequency consistency
edenai/edenai-apis
Eden AI simplifies the use and deployment of AI technologies by providing a single API that connects to the best available AI engines
ymcui/PERT
PERT: Pre-training BERT with Permuted Language Model
nogibjj/rust-mlops-template
A work-in-progress template for building out MLOps solutions in Rust
hjbahng/visual_prompting
Exploring Visual Prompts for Adapting Large-Scale Models
iflytek/MiniRBT
MiniRBT (a series of small Chinese pre-trained models)
ZhangYuanhan-AI/NOAH
Searching prompt modules for parameter-efficient transfer learning.
balavenkatesh3322/audio-pretrained-model
A collection of Audio and Speech pre-trained models.
rsommerfeld/trocr
Powerful handwritten text recognition. A simple-to-use, unofficial implementation of the paper "TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models".
MrGiovanni/SuPreM
[ICLR 2024] Supervised Pre-Trained 3D Models for Medical Image Analysis
YicongHong/Recurrent-VLN-BERT
Code of the CVPR 2021 Oral paper: A Recurrent Vision-and-Language BERT for Navigation
YuanchenBei/Awesome-Pretraining-for-Graph-Neural-Networks
A curated list of papers on pre-training for graph neural networks (Pre-train4GNN).
HKUDS/UrbanGPT
[KDD'2024] "UrbanGPT: Spatio-Temporal Large Language Models"
zjunlp/MolGen
[ICLR 2024] Domain-Agnostic Molecular Generation with Chemical Feedback
siat-nlp/GALAXY
Official repository of the AAAI'2022 paper "GALAXY: A Generative Pre-trained Model for Task-Oriented Dialog with Semi-Supervised Learning and Explicit Policy Injection"
HKUDS/HiGPT
[KDD'2024] "HiGPT: Heterogeneous Graph Language Models"
RL4M/MRM-pytorch
An official implementation of Advancing Radiograph Representation Learning with Masked Record Modeling (ICLR'23)
CGCL-codes/AdvEncoder
The implementation of our ICCV 2023 paper "Downstream-agnostic Adversarial Examples"
RUCAIBox/MVP
The official implementation of the paper "MVP: Multi-task Supervised Pre-training for Natural Language Generation".
RunxinXu/ChildTuning
Source code for the EMNLP'21 paper "Raise a Child in Large Language Model: Towards Effective and Generalizable Fine-tuning"
d-li14/lambda.pytorch
PyTorch implementation of Lambda Network and pretrained Lambda-ResNet
HKUDS/GPT-ST
[NeurIPS'2023] "GPT-ST: Generative Pre-Training of Spatio-Temporal Graph Neural Networks"