Normalized mutual information for common and private variability. For a given value of the log-normal weight parameter, 100 networks were created by drawing common noise weights from the corresponding log-normal distribution. The mutual information shown is the average across these 100 networks. For a specified network, the mutual information was calculated by averaging KSG estimates over 100 simulated data sets, each containing 10,000 samples. Finally, for each fixed value of one parameter, the mutual information is normalized to its maximum across values of the other. (a) Normalized mutual information as a function of the weight parameter and the private variability. (b) Normalized mutual information as a function of the weight parameter and the common variability.
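The estimation and normalization pipeline described in the caption can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' code: the toy network, the log-normal parameterization (mu_w, sigma_w), the reduced sample counts, and the use of scikit-learn's nearest-neighbor mutual information estimator in place of the paper's KSG implementation are all assumptions introduced here.

```python
# Minimal sketch of the averaging / normalization pipeline described in the
# caption. All names and numerical settings (mu_w, sigma_w, the toy network,
# the reduced sample counts) are assumptions for illustration; the paper
# averages over 100 networks and 100 data sets of 10,000 samples each.
import numpy as np
from sklearn.feature_selection import mutual_info_regression  # KSG-style kNN estimator

N_NETWORKS = 10    # paper: 100 networks per parameter value
N_DATASETS = 10    # paper: 100 simulated data sets per network
N_SAMPLES = 2000   # paper: 10,000 samples per data set
N_UNITS = 10       # assumed population size


def simulate_network(weights, sigma_p, sigma_c, rng):
    """Toy stand-in for the network: stimulus plus weighted common noise
    plus private noise, read out as a population sum."""
    stimulus = rng.normal(size=N_SAMPLES)
    common = sigma_c * rng.normal(size=N_SAMPLES)
    private = sigma_p * rng.normal(size=(N_SAMPLES, N_UNITS))
    responses = stimulus[:, None] + common[:, None] * weights + private
    return stimulus, responses.sum(axis=1)


def mi_network(weights, sigma_p, sigma_c, rng):
    """Average a nearest-neighbor MI estimate over simulated data sets."""
    estimates = []
    for _ in range(N_DATASETS):
        s, r = simulate_network(weights, sigma_p, sigma_c, rng)
        estimates.append(mutual_info_regression(r[:, None], s, n_neighbors=3)[0])
    return np.mean(estimates)


def mi_parameter(mu_w, sigma_p, sigma_c, rng, sigma_w=0.5):
    """Average over networks whose common-noise weights are drawn from the
    log-normal distribution indexed by mu_w (hypothetical parameterization)."""
    mis = []
    for _ in range(N_NETWORKS):
        weights = rng.lognormal(mean=mu_w, sigma=sigma_w, size=N_UNITS)
        mis.append(mi_network(weights, sigma_p, sigma_c, rng))
    return np.mean(mis)


rng = np.random.default_rng(0)
mu_values = np.linspace(-1.0, 1.0, 5)   # assumed sweep over the weight parameter
mi = np.array([mi_parameter(mu, sigma_p=1.0, sigma_c=1.0, rng=rng) for mu in mu_values])
normalized_mi = mi / mi.max()           # normalize to the maximum across the sweep
print(dict(zip(np.round(mu_values, 2), np.round(normalized_mi, 3))))
```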