ZhaohanM's Stars
DeepGraphLearning/AStarNet
Official implementation of A* Networks
zjukg/KG-MM-Survey
Knowledge Graphs Meet Multi-Modal Learning: A Comprehensive Survey
tatonetti-lab/sex_risks
Investigating drugs with sex-specific risks of adverse drug reactions
zjukg/KG-LLM-Papers
[Paper List] Papers integrating knowledge graphs (KGs) and large language models (LLMs)
mims-harvard/decagon
Graph convolutional neural network for multirelational link prediction
cambridgeltl/mop
Codes for paper: Mixture-of-Partitions: Infusing Large Biomedical Knowledge Graphs into BERT
ZhaohanM/FusionGDA
FusionGDA uses a pre-training phase with a fusion module to enrich the gene and disease semantic representations encoded by pre-trained language models.
ZhaohanM/FusionDTI
FusionDTI utilises a Token-level Fusion module to effectively learn fine-grained information for Drug-Target Interaction Prediction.
IBM/materials
Foundation Model for Materials - FM4M
jyfang6/trace
QizhiPei/Awesome-Biomolecule-Language-Cross-Modeling
Awesome-Biomolecule-Language-Cross-Modeling: a curated list of resources for the paper "Leveraging Biomolecule and Natural Language through Multi-Modal Learning: A Survey"
DeepGraphLearning/ProtST
[ICML-23 ORAL] ProtST: Multi-Modality Learning of Protein Sequences and Biomedical Texts
bowen-gao/DrugCLIP
[NeurIPS 2023] DrugCLIP: Contrastive Protein-Molecule Representation Learning for Virtual Screening
LirongWu/awesome-protein-representation-learning
Awesome Protein Representation Learning
snap-stanford/GEARS
GEARS is a geometric deep learning model that predicts outcomes of novel multi-gene perturbations
westlake-repl/SaProt
[ICLR'24 spotlight] SaProt: Protein Language Model with Structural Alphabet
mengzaiqiao/ProtTrans
ProtTrans provides state-of-the-art pretrained language models for proteins. ProtTrans was trained on thousands of GPUs from Summit and hundreds of Google TPUs using transformer models.
lucidrains/bidirectional-cross-attention
A simple cross attention that updates both the source and target in one step
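The idea described above — a single shared similarity matrix attending in both directions so source and target are updated in one step — can be sketched as follows. This is a minimal NumPy illustration of the assumed mechanics, not the repo's actual API (which operates on batched PyTorch tensors with learned projections).

```python
import numpy as np


def softmax(a, axis):
    """Numerically stable softmax along the given axis."""
    e = np.exp(a - a.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def bidirectional_cross_attention(x, y):
    """Update both sequences from one shared similarity matrix.

    x: (len_x, dim), y: (len_y, dim). Learned Q/K/V projections are
    omitted for brevity (an assumption of this sketch).
    """
    # One (len_x, len_y) logit matrix shared by both directions.
    sim = x @ y.T / np.sqrt(x.shape[-1])
    x_new = softmax(sim, axis=-1) @ y    # x attends over y's positions
    y_new = softmax(sim.T, axis=-1) @ x  # y attends over x's positions
    return x_new, y_new
```

Because the two softmaxes normalise the same logits over different axes, neither sequence is treated as the fixed "context": each is updated by the other in the same step.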
RElbers/info-nce-pytorch
PyTorch implementation of the InfoNCE loss for self-supervised learning.
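For reference, the InfoNCE objective that repo implements scores one positive pair against a set of negatives; in essence it is cross-entropy over temperature-scaled similarities. A dependency-free sketch (standard formulation, not the repo's exact API) for a single query:

```python
import math


def infonce(query, positive, negatives, temperature=0.1):
    """InfoNCE loss for one query:
    -log( exp(sim(q, k+)/t) / sum_i exp(sim(q, k_i)/t) )."""
    def cosine(a, b):
        dot = sum(p * q for p, q in zip(a, b))
        na = math.sqrt(sum(p * p for p in a))
        nb = math.sqrt(sum(q * q for q in b))
        return dot / (na * nb)

    # Positive logit first, then negative logits.
    logits = [cosine(query, positive) / temperature]
    logits += [cosine(query, n) / temperature for n in negatives]

    # Log-sum-exp with max subtraction for numerical stability.
    m = max(logits)
    log_z = m + math.log(sum(math.exp(l - m) for l in logits))
    return log_z - logits[0]  # negative log-probability of the positive
```

The loss approaches zero as the positive similarity dominates the negatives, which is why it is a natural fit for the contrastive pre-training used by several repos above (DrugCLIP, SapBERT, FusionGDA).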
cambridgeltl/sapbert
[NAACL'21 & ACL'21] SapBERT: Self-alignment pretraining for BERT & XL-BEL: Cross-Lingual Biomedical Entity Linking.