
HMGN

Hierarchical Multi-Granularity Classification based on Residual Gated GCN

Paper

In this paper, we investigate hierarchical multi-granularity classification with training samples labeled at different levels. Accordingly, we propose the combinatorial loss and a hierarchical residual network (HRN) for hierarchical feature interaction. This repo implements an alternative hierarchical feature interaction network (HMGN) based on the same combinatorial loss.

Network Architecture

*(Figure: HMGN network architecture)*

  1. The trunk net (ResNet-50) produces the feature map;
  2. Pre-trained GloVe model generates word vectors for each class in the hierarchy;
  3. Word vectors interact with the feature map using a low-rank bilinear pooling method to generate semantic guided attention coefficients;
  4. We perform weighted average pooling over all locations in the feature map to obtain the initial feature vector for each node in the graph;
  5. In the graph, each node represents a class in the hierarchy, and children nodes connect to their parents with undirected edges;
  6. We adopt the residual gated GCN to perform feature interaction between nodes;
  7. The node classifier's weight matrix is multiplied element-wise with the matrix formed by the node feature vectors; average pooling over each class's row of the result yields the final vector used for classification.
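Steps 3–4 above (low-rank bilinear attention between word vectors and the feature map, followed by weighted average pooling) can be sketched as a small PyTorch module. This is a rough illustration, not the authors' code: the dimensions, projection layers, and the `tanh`/softmax placement are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SemanticGuidedAttention(nn.Module):
    """Low-rank bilinear pooling between class word vectors and the CNN
    feature map, producing one attention-pooled feature per graph node.
    Hypothetical sketch; dimensions and layer choices are assumptions."""

    def __init__(self, feat_dim=2048, word_dim=300, joint_dim=512):
        super().__init__()
        self.proj_feat = nn.Conv2d(feat_dim, joint_dim, kernel_size=1)
        self.proj_word = nn.Linear(word_dim, joint_dim)
        self.score = nn.Conv2d(joint_dim, 1, kernel_size=1)

    def forward(self, feat_map, word_vecs):
        # feat_map: (B, C, H, W); word_vecs: (N, word_dim), one per class node
        B, C, H, W = feat_map.shape
        N = word_vecs.size(0)
        f = torch.tanh(self.proj_feat(feat_map))           # (B, J, H, W)
        w = torch.tanh(self.proj_word(word_vecs))          # (N, J)
        # low-rank bilinear interaction: element-wise product in joint space
        joint = f.unsqueeze(1) * w.view(1, N, -1, 1, 1)    # (B, N, J, H, W)
        attn = self.score(joint.flatten(0, 1)).view(B, N, H, W)
        attn = F.softmax(attn.flatten(2), dim=-1).view(B, N, H, W)
        # weighted average pooling over all spatial locations (step 4)
        node_feat = torch.einsum('bnhw,bchw->bnc', attn, feat_map)
        return node_feat                                   # (B, N, C)
```

The softmax normalizes the attention coefficients over spatial locations, so the einsum is exactly a weighted average of the feature map per node.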
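Step 5's hierarchy graph can be built with networkx (a listed requirement). The label names below are illustrative stand-ins, not the CUB-200-2011 label set:

```python
import networkx as nx

# Toy order -> family -> species hierarchy; children connect to their
# parents with undirected edges, as in step 5. Names are illustrative.
edges = [
    ("Passeriformes", "Corvidae"), ("Corvidae", "Corvus corax"),
    ("Passeriformes", "Paridae"),  ("Paridae", "Parus major"),
]
G = nx.Graph()            # undirected graph
G.add_edges_from(edges)

nodes = sorted(G.nodes())                   # fixed node ordering
adj = nx.to_numpy_array(G, nodelist=nodes)  # (N, N) symmetric adjacency
```

The resulting adjacency matrix is what a GCN layer consumes to restrict message passing to parent–child pairs.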
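Steps 6–7 can be sketched as follows. The gated graph convolution is a simplified take on the residual gated GCN of reference [1] (edge-feature updates are omitted), and the classifier follows the element-wise-product-then-average description above; all layer names and the degree normalization are assumptions for illustration.

```python
import torch
import torch.nn as nn

class ResidualGatedGCNLayer(nn.Module):
    """One residual gated graph-convolution layer (simplified sketch)."""

    def __init__(self, dim):
        super().__init__()
        self.U = nn.Linear(dim, dim)
        self.V = nn.Linear(dim, dim)
        self.A = nn.Linear(dim, dim)
        self.B = nn.Linear(dim, dim)
        self.bn = nn.BatchNorm1d(dim)

    def forward(self, h, adj):
        # h: (N, dim) node features; adj: (N, N) symmetric 0/1 adjacency
        gate = torch.sigmoid(self.A(h).unsqueeze(1) + self.B(h).unsqueeze(0))
        gate = gate * adj.unsqueeze(-1)                    # zero out non-edges
        msg = (gate * self.V(h).unsqueeze(0)).sum(dim=1)   # gated neighbour sum
        msg = msg / adj.sum(dim=1, keepdim=True).clamp(min=1)
        return h + torch.relu(self.bn(self.U(h) + msg))    # residual connection

class NodeClassifier(nn.Module):
    """Element-wise product of a weight matrix with the node-feature matrix,
    averaged over the feature dimension to score each class (step 7)."""

    def __init__(self, num_nodes, dim):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_nodes, dim) * 0.01)

    def forward(self, node_feat):
        # node_feat: (N, dim) -> (N,), one score per class node
        return (self.weight * node_feat).mean(dim=-1)
```

Stacking several such layers lets parent and child nodes exchange features before classification, while the residual connection keeps each node's own representation intact.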

Requirements

  • Python 3.7
  • PyTorch 1.3.1
  • torchvision 0.4.2
  • networkx 2.3
  • CUDA 10.2

Experiments

We perform experiments on CUB-200-2011 and compare the two hierarchical networks under the same combinatorial loss. Supporting files can be found in the related repo: HRN.

OA (%) on CUB-200-2011 with a relabeling proportion of 0%, comparing the two hierarchical networks:

| Levels   | HRN   | HMGN  |
| -------- | ----- | ----- |
| Orders   | 98.67 | 98.74 |
| Families | 95.51 | 95.27 |
| Species  | 86.60 | 85.27 |

References

  1. Benchmarking graph neural networks