Nearest neighbor estimates of entropy.
Singh H; Misra N; Hnizdo V; Fedorowicz A; Demchuk E
Am J Math Manage Sci 2003 Jul; 23(3-4):301-321
Motivated by problems in molecular sciences, we introduce new nonparametric estimates of entropy based on the kth nearest neighbor distances between the n sample points, where k (≤ n − 1) is a fixed positive integer. These provide competing estimators to an estimator proposed by Kozachenko and Leonenko (1987), which is based on the first nearest neighbor distances of the sample points. These estimators are helpful in the evaluation of entropies of random vectors. We establish the asymptotic unbiasedness and consistency of the proposed estimators. For some standard distributions, we also investigate their performance for finite sample sizes using Monte Carlo simulations. The proposed estimators are applied to estimate the entropy of internal rotation in the methanol molecule, which can be characterized by a one-dimensional random vector, and of diethyl ether, which is described by a four-dimensional random vector.
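For illustration, the following sketch implements one common form of a kth-nearest-neighbor entropy estimator in the Kozachenko-Leonenko family, Ĥ = ψ(n) − ψ(k) + ln c_d + (d/n) Σ_i ln R_{i,k}, where R_{i,k} is the distance from sample i to its kth nearest neighbor and c_d is the volume of the d-dimensional unit ball. This is an assumed, asymptotically equivalent formulation, not necessarily the exact expression derived in the paper; the brute-force distance computation is for clarity, not efficiency.

```python
import math

def knn_entropy(samples, k=1):
    """Estimate differential entropy (in nats) of d-dimensional samples
    via kth-nearest-neighbor distances (Kozachenko-Leonenko family).

    samples: list of length-d tuples; k: fixed positive integer, k <= n - 1.
    Illustrative sketch only -- O(n^2) brute-force neighbor search.
    """
    n = len(samples)
    d = len(samples[0])

    # Digamma at a positive integer m: psi(m) = -gamma + sum_{j=1}^{m-1} 1/j
    def psi(m):
        return -0.5772156649015329 + sum(1.0 / j for j in range(1, m))

    # log volume of the d-dimensional unit ball: c_d = pi^(d/2) / Gamma(d/2 + 1)
    log_c_d = (d / 2.0) * math.log(math.pi) - math.lgamma(d / 2.0 + 1.0)

    log_dist_sum = 0.0
    for i, x in enumerate(samples):
        # Distance from x to its kth nearest neighbor among the other points
        dists = sorted(math.dist(x, y) for j, y in enumerate(samples) if j != i)
        log_dist_sum += math.log(dists[k - 1])

    return psi(n) - psi(k) + log_c_d + (d / n) * log_dist_sum
```

For a quick sanity check, samples drawn from Uniform(0, 1) in one dimension have true differential entropy 0 nats, and the estimate should be close to that for moderately large n.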
Sampling; Sampling methods; Simulation methods; Ethers; Statistical analysis; Mathematical models
Author Keywords: Bias; diethyl ether; enthalpy; entropy; free energy; internal rotation; kth nearest neighbor; methanol; molecular dynamics simulations; root mean squared error; torsional angles
Harshinder Singh, Department of Statistics, West Virginia University, Morgantown, WV 26506-6330, USA
American Journal of Mathematical and Management Sciences