Pinned Repositories
DetGPT
GeoKG
geographic knowledge graph
graph-databases-use-cases
Example use cases from the O'Reilly Graph Databases book
GraphLite
A lightweight graph computation platform in C/C++
kg-beijing
Beijing Knowledge Graph Study Group
knowledge
Knowledge graph of the Go community
LAVIS
LAVIS - A One-stop Library for Language-Vision Intelligence
LLaVA
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
MiniGPT-4
MiniGPT-4: Enhancing Vision-language Understanding with Advanced Large Language Models
neo4j-nlp
NLP Capabilities in Neo4j
LinMu7177's Repositories
LinMu7177/MiniGPT-4
MiniGPT-4: Enhancing Vision-language Understanding with Advanced Large Language Models
LinMu7177/DetGPT
LinMu7177/GeoKG
geographic knowledge graph
LinMu7177/graph-databases-use-cases
Example use cases from the O'Reilly Graph Databases book
LinMu7177/GraphLite
A lightweight graph computation platform in C/C++
LinMu7177/kg-beijing
Beijing Knowledge Graph Study Group
LinMu7177/knowledge
Knowledge graph of the Go community
LinMu7177/LAVIS
LAVIS - A One-stop Library for Language-Vision Intelligence
LinMu7177/LLaVA
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
LinMu7177/neo4j-nlp
NLP Capabilities in Neo4j
LinMu7177/NLP
LinMu7177/NTU_ML2017_Hung-yi-Lee_HW
Homework for Prof. Hung-yi Lee's NTU ML2017 (Spring and Fall) machine learning course
LinMu7177/Osprey
The code for "Osprey: Pixel Understanding with Visual Instruction Tuning"
LinMu7177/realworldnlp
Example code for "Real-World Natural Language Processing"
LinMu7177/ReLA
[CVPR2023 Highlight] GRES: Generalized Referring Expression Segmentation
LinMu7177/rich-text-to-image
Rich-Text-to-Image Generation
LinMu7177/Segment-Everything-Everywhere-All-At-Once
[NeurIPS 2023] Official implementation of the paper "Segment Everything Everywhere All at Once"
LinMu7177/Sem-K-BERT
Sem-K-BERT enhances BERT with knowledge graph and semantic role labeling information, using positional encoding to keep the external information sources from interfering with one another. Experiments show the model achieves better performance on a wide range of Chinese NLP tasks, suggesting that using a knowledge graph and semantic role labels together brings greater improvement to BERT.
LinMu7177/SoM
Set-of-Mark Prompting for LMMs
LinMu7177/unilm
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities