jeromerony/dml_cross_entropy
Code for the paper "A unifying mutual information view of metric learning: cross-entropy vs. pairwise losses" (ECCV 2020 - Spotlight)
Python · BSD-3-Clause license