Issues
The score computed by `multiclass_f1_score` for binary classification is wrong. It is not f1 score but accuracy.
#191 opened by LittletreeZou - 0
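A pure-Python sketch of the arithmetic behind this report (an assumed explanation, not torcheval's actual source): micro-averaged F1 pools TP/FP/FN over all classes, so for single-label data every false positive for one class is a false negative for another, and the pooled score collapses to plain accuracy, while the positive-class ("binary") F1 that sklearn reports by default can differ.

```python
# Sketch (assumed behavior, not torcheval source): why a micro-averaged
# multiclass F1 on binary data reduces to accuracy.
def micro_f1(preds, target, num_classes):
    tp = fp = fn = 0
    for c in range(num_classes):  # pool counts over every class
        tp += sum(p == c and t == c for p, t in zip(preds, target))
        fp += sum(p == c and t != c for p, t in zip(preds, target))
        fn += sum(p != c and t == c for p, t in zip(preds, target))
    precision, recall = tp / (tp + fp), tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def binary_f1(preds, target):
    # F1 for the positive class only, as sklearn's f1_score computes by default
    tp = sum(p == 1 and t == 1 for p, t in zip(preds, target))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, target))
    fn = sum(p == 0 and t == 1 for p, t in zip(preds, target))
    return 2 * tp / (2 * tp + fp + fn)

preds, target = [1, 1, 0, 0], [1, 0, 0, 0]
accuracy = sum(p == t for p, t in zip(preds, target)) / len(preds)
print(micro_f1(preds, target, 2), binary_f1(preds, target), accuracy)
# micro F1 equals accuracy (0.75); positive-class F1 is 2/3
```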
Disagreement for auroc1 with sklearn
#190 opened by crazyn2 - 2
RetrievalRecall, RetrievalPrecision require different, 1D input than MulticlassRecall, MulticlassPrecision which accept batch input
#188 opened by jaanli - 3
`get_module_summary` assertion failure
#144 opened by quancs - 2
Add Wasserstein Distance
#137 opened by bobakfb - 1
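For the 1-D case the requested metric has a closed form worth noting: the order-1 Wasserstein (earth mover's) distance between two equal-size empirical samples is the mean absolute difference of their sorted values. A minimal sketch (hypothetical helper name, not a proposed API):

```python
# 1-D order-1 Wasserstein distance between equal-size empirical samples:
# sort both samples and average the pointwise absolute differences.
def wasserstein_1d(x, y):
    assert len(x) == len(y), "equal-size empirical samples assumed"
    return sum(abs(a - b) for a, b in zip(sorted(x), sorted(y))) / len(x)

print(wasserstein_1d([0.0, 1.0, 2.0], [1.0, 2.0, 3.0]))  # 1.0
```

Identical samples in any order give distance 0, since sorting aligns them.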
Updating `Mean` with 0 leads to 'No calls to update() have been made...' warning
#185 opened by dburian - 5
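A plausible cause for this warning (an assumption about the bug, not torcheval's actual code) is an emptiness check that tests the accumulated sum for truthiness, which cannot distinguish a legitimate `update(0)` from no updates at all. A toy `Mean` showing the safe check:

```python
import warnings

class Mean:
    """Toy weighted running mean (illustrative only, not torcheval source)."""
    def __init__(self):
        self.weighted_sum = 0.0
        self.weight_total = 0.0

    def update(self, value, weight=1.0):
        self.weighted_sum += value * weight
        self.weight_total += weight
        return self

    def compute(self):
        # A buggy check like `if not self.weighted_sum:` is also true after a
        # legitimate update(0.0). Testing the accumulated weight instead keeps
        # a zero-valued update from looking like "never updated".
        if self.weight_total == 0.0:
            warnings.warn("No calls to update() have been made...")
            return 0.0
        return self.weighted_sum / self.weight_total

m = Mean()
m.update(0.0)
print(m.compute())  # 0.0, with no spurious warning
```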
Disagreement for macro f1 with torchmetrics and sklearn
#176 opened by gounley - 4
Multiple metrics sharing the same state
#170 opened by ZhiyuanChen - 2
Error in masking in the function multiclass_recall
#150 opened by xiangn95 - 1
RuntimeError: "bitwise_and_cpu" not implemented for 'Float' when using `binary_precision` & `binary_recall`
#146 opened by kidpaul94 - 2
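Bitwise AND is only defined for integer and boolean values, so the likely trigger (an assumption, not a confirmed diagnosis) is float scores reaching the TP-counting step without first being thresholded to hard 0/1 labels. A pure-Python sketch of the thresholding step, with a hypothetical helper name:

```python
# Hedged sketch: threshold float scores to {0, 1} before any bitwise/logical
# TP counting, which is the step whose absence plausibly produces the
# "bitwise_and_cpu not implemented for 'Float'" error.
def binary_precision(preds, target, threshold=0.5):
    hard = [1 if p >= threshold else 0 for p in preds]  # float -> hard labels
    tp = sum(h & t for h, t in zip(hard, target))       # int & int is valid
    predicted_pos = sum(hard)
    return tp / predicted_pos if predicted_pos else 0.0

print(binary_precision([0.9, 0.2, 0.7], [1, 0, 0]))  # 0.5
```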
FLOPs and ModuleSummary Documentation
#152 opened by BenjaminHelyer - 3
Add COCO mAP
#139 opened by songyuc - 2
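The core primitive underneath COCO mAP is box IoU; a minimal sketch for axis-aligned `[x1, y1, x2, y2]` boxes (illustrative only, not a proposed torcheval API) shows the matching criterion the metric would build on:

```python
# Intersection-over-union for axis-aligned [x1, y1, x2, y2] boxes, the
# detection-matching primitive that COCO mAP thresholds (e.g. IoU >= 0.5).
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

print(iou([0, 0, 2, 2], [1, 1, 3, 3]))  # 1/7 ≈ 0.1429
```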
BinaryAccuracy over 1 when labels are boolean.
#135 opened by tcapelle - 1
Can't import under torch 1.11
#119 opened by gwenzek - 1
[Proposal] Verify that Input Tensor is a probability in classification metrics
#115 opened by maronuu - 3
Multi Process Error
#92 opened by go-with-me000 - 3
Cannot find package
#89 opened by go-with-me000 - 6
Comparison to torchmetrics
#82 opened by rsokl - 5
Add CTR metric
#27 opened by dbish
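Assuming the usual definition (clicks divided by impressions, accumulated across updates), the requested metric can be sketched in a few lines; the class name and `update`/`compute` shape below are illustrative, not a proposed API:

```python
# Sketch of a click-through-rate metric: CTR = total clicks / total
# impressions, accumulated over successive update() calls.
class ClickThroughRate:
    def __init__(self):
        self.clicks = 0
        self.impressions = 0

    def update(self, clicks, impressions):
        self.clicks += clicks
        self.impressions += impressions
        return self

    def compute(self):
        return self.clicks / self.impressions if self.impressions else 0.0

print(ClickThroughRate().update(3, 10).update(1, 10).compute())  # 0.2
```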