Pinned Repositories
.github
About Us
BioInstTune-LLM
This repository contains the code and data related to the experiments in the paper "Exploring the Effectiveness of Instruction Tuning in Biomedical Language Processing".
Compact-Biomedical-Transformers
This repository contains the code used for distillation and fine-tuning of the compact biomedical transformers introduced in the paper "On The Effectiveness of Compact Biomedical Transformers".
efficient-ml
Experiments towards efficient use of compact biomedical LMs
Lightweight-Clinical-Transformers
This project develops compact transformer models tailored for clinical text analysis, balancing efficiency and performance for healthcare NLP tasks.
MiniALBERT
This repository contains the code used for training/fine-tuning the models introduced in the paper "MiniALBERT: Model Distillation via Parameter-Efficient Recursive Transformers".
pandemic-pact-ppace
umls-to-snomed-icd10-mapper
UMLS to SNOMED CT and ICD-10 mapping tool: A Python script for generating JSON files containing mappings between UMLS CUIs and SNOMED CT or ICD-10 codes using the UMLS Metathesaurus. Supports exact match, broad relations, and parent-child hierarchy mapping methods.
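As a rough illustration of the exact-match method only (the repository's script also covers broad relations and parent-child hierarchy), a minimal sketch of a CUI-to-SNOMED CT lookup over the Metathesaurus MRCONSO.RRF file might look like the following. File paths, the function name, and the NOCODE filter are assumptions for illustration, not the repository's actual code:

```python
import json
from collections import defaultdict

# Assumed paths for illustration. MRCONSO.RRF ships with the UMLS Metathesaurus
# and is pipe-delimited: CUI is column 0, source vocabulary (SAB) is column 11,
# and the source code is column 13.
MRCONSO_PATH = "META/MRCONSO.RRF"
OUTPUT_PATH = "cui_to_snomed.json"

def build_exact_match_map(mrconso_path, target_sab="SNOMEDCT_US"):
    """Collect, for each UMLS CUI, the codes asserted by the target source vocabulary."""
    cui_to_codes = defaultdict(set)
    with open(mrconso_path, encoding="utf-8") as f:
        for line in f:
            fields = line.rstrip("\n").split("|")
            cui, sab, code = fields[0], fields[11], fields[13]
            # Keep only rows from the target vocabulary that carry a usable code.
            if sab == target_sab and code != "NOCODE":
                cui_to_codes[cui].add(code)
    return {cui: sorted(codes) for cui, codes in cui_to_codes.items()}

if __name__ == "__main__":
    mapping = build_exact_match_map(MRCONSO_PATH)
    with open(OUTPUT_PATH, "w", encoding="utf-8") as f:
        json.dump(mapping, f, indent=2)
```

Swapping `target_sab` to an ICD-10 source (e.g. "ICD10CM") would produce the ICD-10 variant of the JSON output under the same assumptions.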