NIOSHTIC-2 Publications Search
k-Nearest neighbor based consistent entropy estimation for hyperspherical distributions.
Li-S; Mnatsakanov-RM; Andrew-ME
Entropy 2011 Mar; 13(3):650-667
A consistent entropy estimator for hyperspherical data is proposed based on the k-nearest neighbor (knn) approach. The asymptotic unbiasedness and consistency of the estimator are proved. Moreover, cross entropy and Kullback-Leibler (KL) divergence estimators are also discussed. Simulation studies are conducted to assess the performance of the estimators for models including uniform and von Mises-Fisher distributions. The proposed knn entropy estimator is compared with the moment-based counterpart via simulations. The results show that the two methods are comparable.
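For illustration of the general kNN approach the abstract refers to, the sketch below implements the classical Kozachenko-Leonenko kNN differential entropy estimator in Euclidean space. This is an assumption-laden stand-in, not the paper's estimator: the paper adapts the idea to hyperspherical (directional) data, which requires spherical geometry (e.g. cap volumes in place of the Euclidean ball volume used here). The function name `knn_entropy` and parameter choices are hypothetical.

```python
import numpy as np
from scipy.special import digamma, gammaln

def knn_entropy(x, k=5):
    """Kozachenko-Leonenko kNN differential entropy estimate (in nats).

    Generic Euclidean version, shown only to illustrate the kNN idea;
    the paper's hyperspherical estimator replaces Euclidean distances
    and ball volumes with their spherical counterparts.
    """
    n, d = x.shape
    # All pairwise Euclidean distances (O(n^2) memory; fine for a sketch).
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    # eps_i = distance from point i to its k-th nearest neighbor
    # (column 0 of the sorted distances is the zero self-distance).
    eps = np.sort(dist, axis=1)[:, k]
    # log volume of the unit d-dimensional Euclidean ball.
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # Kozachenko-Leonenko estimator.
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))
```

As a sanity check, the uniform distribution on the unit square has differential entropy 0, so the estimate on a large uniform sample should be close to zero.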
Mathematical-models; Molecular-biology; Quantitative-analysis; Standards; Statistical-analysis; Author Keywords: hyperspherical distribution; directional data; differential entropy; cross entropy; Kullback-Leibler divergence; k-nearest neighbor
Shengqiao Li, Health Effects Laboratory Division, National Institute for Occupational Safety and Health, Morgantown, WV 26505
Services: Public Safety