Precision or Accuracy?
kaustubholpadkar opened this issue · 4 comments
When I run the experiments, it displays as follows:
| Task: 3/5 | training loss: 0.407 | training precision: 0.992 |: 100%|███████████████████████████████████| 2000/2000 [08:37<00:00, 3.87it/s]
The on-the-fly visdom plots also label the y-axis as "precision", but the title of the plot says "average accuracy".
Can you please clarify that? I have seen the same confusion in the code of your other repo: https://github.com/GMvandeVen/continual-learning
Hi, thanks for your question. In this code (and also in the code of https://github.com/GMvandeVen/continual-learning) I have been using the terms "accuracy" and "precision" interchangeably; with both I mean the same thing. Sorry for the confusion! When I have some time I will try to make the usage of these terms in the code more consistent.
Thanks for the quick response. I understand that you have used the terms "accuracy" and "precision" interchangeably, but the two have different theoretical meanings:
- Accuracy = (TP+TN)/(TP+FP+FN+TN)
- Precision = TP/(TP+FP)
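To make the distinction concrete, here is a minimal sketch on toy binary-classification data (the arrays and variable names are purely illustrative, not from either repo); on the same predictions the two measures give different values:

```python
# Toy binary-classification example showing that accuracy and precision
# are different quantities (data here is made up for illustration).
y_true = [1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 0, 1, 0, 0, 0, 0]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # true positives  -> 2
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))  # true negatives  -> 4
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives -> 1
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives -> 1

accuracy = (tp + tn) / (tp + tn + fp + fn)  # (2+4)/8 = 0.75
precision = tp / (tp + fp)                  # 2/3 ≈ 0.667
```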
I just want to confirm which of the two, accuracy or precision, is reported in these papers.
Thanks for pointing that out, that is very helpful. Sorry, I realise I should have been more careful with these terms.
The performance measure that is reported in this paper (and also in my other ones) is the proportion of samples classified correctly.
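In code, that measure is just the fraction of matching predictions; a minimal sketch (the variable names are illustrative, not taken from the repo):

```python
# "Accuracy" as the proportion of samples classified correctly
# (multi-class case; the data below is made up for illustration).
predictions = [2, 0, 1, 1, 3, 2]
labels      = [2, 0, 2, 1, 3, 0]

accuracy = sum(p == l for p, l in zip(predictions, labels)) / len(labels)
# 4 of 6 samples match -> accuracy = 4/6 ≈ 0.667
```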
I just updated the code and I now only refer to this measure as accuracy. (I'll update the other repositories as well.)
Thank you so much!