neulab/explainaboard_web

Confusion matrix interface suggestions


The new "confusion matrix" feature is a great start, but I felt the interface is a bit hard to understand:

[Screenshot: confusion matrix visualization in the web interface]

  1. "F1 by confusion matrix" is not correct. It should just be "confusion matrix".
  2. Also, a "confusion matrix" usually reports, for each true tag true_tag and predicted tag predicted_tag, cooccurrence(true_tag, predicted_tag) / count(true_tag), but right now it's cooccurrence(true_tag, predicted_tag) / total_count. So we should either change the naming or the calculation (see the sketch below).
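
For reference, here is a minimal Python sketch of the two normalizations being contrasted. This is not the actual ExplainaBoard code; the function and variable names are illustrative only.

```python
from collections import Counter, defaultdict

def confusion_matrix(true_tags, predicted_tags, normalize="true"):
    """Build a confusion matrix from parallel lists of true and predicted tags.

    normalize="true" -> cooccurrence(t, p) / count(t)    (per-true-tag, the usual convention)
    normalize="all"  -> cooccurrence(t, p) / total_count (the current behavior described above)
    """
    cooccurrence = Counter(zip(true_tags, predicted_tags))
    true_counts = Counter(true_tags)
    total = len(true_tags)

    matrix = defaultdict(dict)
    for (t, p), count in cooccurrence.items():
        denom = true_counts[t] if normalize == "true" else total
        matrix[t][p] = count / denom
    return dict(matrix)

# With normalize="true", each row (true tag) sums to 1 over the predicted tags.
y_true = ["ORG", "ORG", "PER", "PER", "PER", "LOC"]
y_pred = ["ORG", "PER", "PER", "PER", "LOC", "LOC"]
print(confusion_matrix(y_true, y_pred, normalize="true"))
print(confusion_matrix(y_true, y_pred, normalize="all"))
```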

cc @PaulCCCCCCH

Thanks for creating the issue.

  1. Currently we display <metric> by <feature description> as the title. I guess we can simply remove the metric.
  2. As for the calculation, I can fix it in another PR.

By the way, the axes seem weird. Are these the real feature values for the task, or is there a bug?

Thanks!

These are the real feature values for the task. It's a very small example dataset though, which is why the confusion matrix is sparse.