Touch100k

🚀🚀🚀 The implementation of Touch100k: A Large-Scale Touch-Language-Vision Dataset for Touch-Centric Multimodal Representation.

❤️ Acknowledgement

  • LanguageBind: An open-source, language-based multimodal pre-training framework. Thanks for their wonderful work.
  • OpenCLIP: An amazing open-source backbone.

🔒 License

Code License: This project is under the MIT license.

Data License: The dataset is released under CC BY-NC 4.0 (non-commercial use only), and models trained on the dataset should not be used outside of research purposes.