multi-modal-search
There are 6 repositories under the multi-modal-search topic.
haofanwang/natural-language-joint-query-search
Search photos on Unsplash based on OpenAI's CLIP model, supporting search with joint image+text queries and attention visualization.
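The joint image+text querying described above can be sketched as follows. This is an illustrative toy, not the repository's actual code: real CLIP embeddings are replaced by small placeholder vectors, and the query is formed by averaging the normalized text and image embeddings (one common blending strategy) before ranking gallery items by cosine similarity. The function names (`joint_query`, `search`) and the `alpha` blend weight are assumptions for the sketch.

```python
import numpy as np

def normalize(v):
    # Scale each row to unit length so dot products equal cosine similarity.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def joint_query(text_emb, image_emb, alpha=0.5):
    # Blend a text embedding and a query-image embedding into one query vector.
    q = alpha * normalize(text_emb) + (1 - alpha) * normalize(image_emb)
    return normalize(q)

def search(query, gallery):
    # Rank gallery embeddings by cosine similarity to the query, best first.
    scores = normalize(gallery) @ query
    return np.argsort(-scores)

# Toy 4-dimensional "embeddings" standing in for real CLIP outputs.
text = np.array([1.0, 0.0, 0.0, 0.0])
image = np.array([0.0, 1.0, 0.0, 0.0])
gallery = np.array([
    [1.0, 1.0, 0.0, 0.0],  # close to both modalities
    [0.0, 0.0, 1.0, 0.0],  # unrelated
    [1.0, 0.0, 0.0, 0.0],  # matches the text only
])
q = joint_query(text, image)
print(search(q, gallery))  # the mixed-modality item ranks first
```

In a real system, `text` and `image` would come from CLIP's text and image encoders, and the gallery would hold precomputed embeddings of the photo collection.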
Zabuzard/Cobweb
Cobweb is a multi-modal journey planner offering a server-based REST API and a lightweight frontend.
emmyoh/zebra
A vector database for querying meaningfully similar data.
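The core operation of a vector database like the one described above, querying for meaningfully similar data, can be sketched with a minimal in-memory store. This is not zebra's actual API (zebra is written in Rust); the `VectorStore` class, its `insert`/`query` methods, and brute-force cosine ranking are assumptions chosen to illustrate the idea.

```python
import numpy as np

class VectorStore:
    # Minimal in-memory vector store: insert embeddings with payloads,
    # then query by cosine similarity. Illustrative only.
    def __init__(self):
        self.vectors = []
        self.payloads = []

    def insert(self, vector, payload):
        v = np.asarray(vector, dtype=float)
        # Store unit-length vectors so a dot product is cosine similarity.
        self.vectors.append(v / np.linalg.norm(v))
        self.payloads.append(payload)

    def query(self, vector, k=3):
        q = np.asarray(vector, dtype=float)
        q = q / np.linalg.norm(q)
        scores = np.stack(self.vectors) @ q
        top = np.argsort(-scores)[:k]
        return [(self.payloads[i], float(scores[i])) for i in top]

store = VectorStore()
store.insert([1, 0, 0], "red")
store.insert([0, 1, 0], "green")
store.insert([0.9, 0.1, 0], "dark red")
print(store.query([1, 0, 0], k=2))  # "red" first, then "dark red"
```

Production vector databases replace the brute-force scan with approximate nearest-neighbor indexes (e.g. HNSW) to stay fast at scale, but the query semantics are the same.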
cherrry-ai/cherrry-js
Cherrry Javascript SDK
sagarverma/MultiModalSearch
CSE508 (Information Retrieval) course project on multi-modal search using deep learning.
ThuyHaLE/FrameFinderLE
FrameFinderLE is an advanced image and video frame retrieval system that enhances CLIP's image-text pairing with hashtag refinement and user feedback, offering an intuitive search experience.