Neural Computation (2015) 27 (10): 2097–2106.
Published: 01 October 2015
Abstract
We compare an entropy estimator $\hat{H}_z$ recently discussed by Zhang (2012) with two estimators, $\hat{H}_1$ and $\hat{H}_2$, introduced by Grassberger (2003) and Schürmann (2004). We prove the identity $\hat{H}_z \equiv \hat{H}_1$, which had not been taken into account by Zhang (2012). We then prove that the systematic error (bias) of $\hat{H}_1$ is less than or equal to the bias of the ordinary likelihood (or plug-in) estimator of entropy. Finally, by numerical simulation, we verify that in the most interesting regime of small samples and large event spaces, the estimator $\hat{H}_2$ has a significantly smaller statistical error than $\hat{H}_z$.
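The comparison described in the abstract (bias and statistical error in the small-sample, large-alphabet regime) can be illustrated with a simple Monte Carlo experiment. The sketch below is not the paper's code: the plug-in estimator is standard, but the digamma-corrected estimator shown is a simplified Grassberger-style form assumed here purely for demonstration, not the exact estimators $\hat{H}_1$ or $\hat{H}_2$ analyzed in the paper, and the uniform test distribution and parameter values are likewise illustrative assumptions.

```python
# Sketch: compare bias and statistical error of two entropy estimators
# on small samples drawn from a large event space (illustrative only).
import numpy as np
from scipy.special import digamma

def plugin_entropy(counts):
    """Ordinary likelihood (plug-in) estimator: -sum_i p_i log p_i with p_i = n_i/N."""
    n = counts[counts > 0]
    p = n / n.sum()
    return -np.sum(p * np.log(p))

def digamma_entropy(counts):
    """Simplified digamma-corrected estimator: log N - (1/N) * sum_i n_i * psi(n_i).
    A Grassberger-style correction assumed here for illustration only."""
    n = counts[counts > 0]
    N = n.sum()
    return np.log(N) - np.sum(n * digamma(n)) / N

rng = np.random.default_rng(0)
K, N, trials = 1000, 100, 2000        # large event space, small sample size
p_true = np.full(K, 1.0 / K)          # uniform distribution for simplicity
H_true = np.log(K)                    # true entropy of the uniform distribution

results = {"plug-in": [], "digamma": []}
for _ in range(trials):
    sample = rng.choice(K, size=N, p=p_true)
    counts = np.bincount(sample, minlength=K)
    results["plug-in"].append(plugin_entropy(counts))
    results["digamma"].append(digamma_entropy(counts))

for name, vals in results.items():
    vals = np.asarray(vals)
    print(f"{name:8s}  bias = {vals.mean() - H_true:+.4f}   std = {vals.std():.4f}")
```

Under these assumptions the experiment typically shows the strong negative bias of the plug-in estimator being reduced by the digamma-based correction, which is the kind of effect the paper's simulations quantify for the actual estimators.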