cheind/tf-matplotlib

Use with TF-Slim and interpret results

margaux-schorn opened this issue · 3 comments

Hello, I'm a beginner in neural networks and I found this library, which seems quite useful for showing the evolution of per-label accuracy with a confusion matrix. My problem is that I'm using the code from the TF-Slim research project, so the confusion matrix example provided here doesn't really fit my needs.
However, I managed to make it work, but I'm not sure about the results I'm seeing. Is it normal that the display in TensorBoard only shows one or two columns, always with the same rows empty (for example, row 0 is never colored)? Sometimes it also happens that "1.00" is displayed in every colored cell, so I'm wondering whether I did everything correctly. (You can see it below.)
(screenshot: confusion matrix as displayed in TensorBoard)
(Sorry if my question is unclear; this is the first time I've created an issue on GitHub. PS: my labels are in French.)

Having no entries across a row means that this class has not yet seen a sample. This can happen if you are very early in the training process, or, more likely, if you are only visualizing the confusion matrix of a single batch. The "1.00" values are probably the same effect: when only a handful of samples per class have been seen, they may all land in a single predicted column, which after row normalization shows up as 1.00. MNIST is so small that I can process all test samples in a single batch. So for larger datasets you might do something along the following lines:

for each iteration i in training:
  if i % nth == 0:
    initialize the confusion matrix to zero
    for each batch in the validation set:
      update the confusion matrix with the predictions for that batch
    visualize the confusion matrix
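
In plain TF1 terms (not TF-Slim specific), a minimal sketch of that loop could look like the following. Note that `images`, `predictions`, `sess`, `writer`, `step` and `validation_batches()` are placeholders for your own model tensors, session, summary writer, global step and batch loader, and the drawing uses a plain matplotlib `matshow` rather than the helper from the MNIST sample:

import numpy as np
import tensorflow as tf
import tfmpl

NUM_CLASSES = 10  # adjust to the number of labels in your dataset

# Placeholder receiving the confusion matrix accumulated over all validation batches.
cm_input = tf.placeholder(tf.int64, shape=[NUM_CLASSES, NUM_CLASSES])

@tfmpl.figure_tensor
def draw_confusion_matrix(cm):
    '''Render the accumulated confusion matrix as a matplotlib figure.'''
    fig = tfmpl.create_figure(figsize=(7, 7))
    ax = fig.add_subplot(111)
    ax.matshow(cm, cmap='Blues')
    ax.set_xlabel('Predicted')
    ax.set_ylabel('True')
    return fig

cm_summary = tf.summary.image('confusion_matrix', draw_confusion_matrix(cm_input))

# ... inside the training loop, every nth iteration ...
cm = np.zeros((NUM_CLASSES, NUM_CLASSES), dtype=np.int64)
for batch_images, batch_labels in validation_batches():    # hypothetical batch generator
    preds = sess.run(predictions, {images: batch_images})  # predicted class ids from your model
    np.add.at(cm, (batch_labels, preds), 1)                # accumulate counts per (true, predicted) pair
writer.add_summary(sess.run(cm_summary, {cm_input: cm}), global_step=step)

Because the counts are accumulated over the whole validation set before drawing, every class that occurs in the validation data will contribute to its row, so no row stays empty once each class has been seen at least once.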

I realize that the MNIST test set fitting in a single batch is a hidden detail of the example that might confuse other people as well. I will add a comment to highlight this.

Thanks for your answer. It's indeed a small training run, because I'm just running tests for the moment. I understand what you mean about larger training sets, but I don't really know how to manage it with TF-Slim, since it isn't written the same way as plain TensorFlow code; I will look for a solution.
Anyway, thanks for the library; I think it will help a lot.