multi-modal-search

There are 6 repositories under the multi-modal-search topic.

  • haofanwang/natural-language-joint-query-search

Search photos on Unsplash with OpenAI's CLIP model; supports joint image+text queries and attention visualization (a minimal sketch of the joint-query idea appears after this list).

Language: Jupyter Notebook
  • Zabuzard/Cobweb

Cobweb is a multi-modal journey planner offering a server-based REST API and a lightweight frontend.

Language: Java
  • emmyoh/zebra

A vector database for querying meaningfully similar data (see the similarity-search sketch after this list).

Language: Rust
  • cherrry-ai/cherrry-js

Cherrry JavaScript SDK

Language: JavaScript
  • sagarverma/MultiModalSearch

CSE508 (Information Retrieval) course project on multi-modal search using deep learning.

Language: Python
  • ThuyHaLE/FrameFinderLE

FrameFinderLE is an image and video frame retrieval system that augments CLIP's image-text pairing with hashtag refinement and user feedback, offering an intuitive search experience.

Language: Python
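
Several entries above (natural-language-joint-query-search, FrameFinderLE) build on CLIP's shared image-text embedding space. Below is a minimal sketch of the joint image+text query idea: embed both the text query and an example image, average the two embeddings, and rank a gallery by cosine similarity. It assumes the Hugging Face transformers CLIP API; the model name, the averaging strategy, and the toy file paths are illustrative assumptions, not code from any repository listed here.

```python
# Sketch of joint image+text retrieval with CLIP (illustrative, not from the repos above).
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
model.eval()

@torch.no_grad()
def embed_images(images):
    inputs = processor(images=images, return_tensors="pt")
    feats = model.get_image_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)  # unit-normalize for cosine similarity

@torch.no_grad()
def embed_text(texts):
    inputs = processor(text=texts, return_tensors="pt", padding=True)
    feats = model.get_text_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)

# Index a toy gallery of photos (stand-ins for, e.g., an Unsplash collection).
gallery_paths = ["photo1.jpg", "photo2.jpg", "photo3.jpg"]  # hypothetical files
gallery = embed_images([Image.open(p) for p in gallery_paths])

# Joint query: average the text and example-image embeddings, then re-normalize.
query_text = embed_text(["a dog playing on the beach"])
query_image = embed_images([Image.open("example_dog.jpg")])  # hypothetical file
joint = (query_text + query_image) / 2
joint = joint / joint.norm(dim=-1, keepdim=True)

# Rank gallery photos by cosine similarity to the joint query.
scores = (joint @ gallery.T).squeeze(0)
for idx in scores.argsort(descending=True):
    print(gallery_paths[idx], float(scores[idx]))
```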
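
For the vector-database entries (zebra, cherrry-js), the core query they answer is nearest-neighbour search over embeddings. The brute-force sketch below shows that underlying idea with NumPy and random stand-in vectors; it does not reflect either project's actual API, and a real system would use model-produced embeddings and an approximate index at scale.

```python
# Brute-force cosine-similarity search: the query a vector database answers.
import numpy as np

rng = np.random.default_rng(0)

# Toy "database": 1,000 stored vectors of dimension 512, unit-normalized.
db = rng.normal(size=(1000, 512)).astype(np.float32)
db /= np.linalg.norm(db, axis=1, keepdims=True)

def query(vec: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k stored vectors most similar to `vec`."""
    vec = vec / np.linalg.norm(vec)
    sims = db @ vec                      # cosine similarity via dot product
    return np.argsort(-sims)[:k]

print(query(rng.normal(size=512).astype(np.float32)))
```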