A demo can be found in the test file.
Introduction (paper: "Hierarchical Nearest Neighbor Descent, In-Tree, and Clustering", Pattern Recognition, 2023)
Recently, we proposed a physically inspired graph-theoretical method, called Nearest Descent (ND), which organizes a dataset into an in-tree graph structure. Owing to several elegant and effective features, the constructed in-tree proves well-suited for data clustering. Although some undesired edges (i.e., inter-cluster edges) exist in this in-tree, they are usually easy to distinguish, in sharp contrast to the situation in the well-known Minimum Spanning Tree (MST). Here, we propose another graph-theoretical method, called Hierarchical Nearest Neighbor Descent (HNND). Like ND, HNND organizes a dataset into an in-tree, but more efficiently. Consequently, HNND-based clustering (HNND-C) is also more efficient than ND-based clustering (ND-C). This is demonstrated by experimental results on five high-dimensional, large-scale mass cytometry datasets. The experiments also show that HNND-C achieves overall better performance than several state-of-the-art clustering methods.
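To illustrate the general idea of in-tree-based clustering, here is a minimal, hypothetical sketch (not the authors' ND or HNND implementation): each point links to its nearest neighbor of higher estimated density, yielding an in-tree; the few longest edges are then removed as the distinguishable inter-cluster edges. The density estimate and bandwidth below are illustrative assumptions.

```python
import numpy as np

def in_tree_clustering_sketch(X, n_clusters=2):
    """Toy in-tree clustering: link each point to its nearest
    higher-density neighbor, then cut the longest edges.
    This is an illustrative sketch, not the ND/HNND algorithm itself."""
    n = len(X)
    # Pairwise Euclidean distances
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    # Simple Gaussian-kernel density estimate (bandwidth is a rough guess)
    sigma = np.median(D)
    density = np.exp(-((D / sigma) ** 2)).sum(axis=1)
    parent = np.full(n, -1)       # -1 marks a root (no outgoing edge)
    edge_len = np.zeros(n)
    for i in range(n):
        higher = np.where(density > density[i])[0]
        if len(higher):
            j = higher[np.argmin(D[i, higher])]
            parent[i] = j         # directed edge i -> j forms the in-tree
            edge_len[i] = D[i, j]
    # Remove the (n_clusters - 1) longest edges to split the in-tree
    cut = np.argsort(edge_len)[::-1][: n_clusters - 1]
    parent[cut] = -1
    # Each remaining root defines one cluster
    def root(i):
        while parent[i] != -1:
            i = parent[i]
        return i
    labels, roots = np.full(n, -1), {}
    for i in range(n):
        labels[i] = roots.setdefault(root(i), len(roots))
    return labels
```

On two well-separated Gaussian blobs, the single inter-cluster edge is by far the longest, so cutting it recovers the two groups; this mirrors the claim above that inter-cluster edges in the in-tree are very distinguishable.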
See details at https://www.sciencedirect.com/science/article/abs/pii/S0031320323000018.
Pattern Recognition has generously provided 50 days of free access to the article (before March 03, 2023) at https://authors.elsevier.com/c/1gPhA77nKkWRI.