Figure 8: Normalized mutual information for common and private variability. For a given μ, 100 networks were created by drawing the common noise weights w from the corresponding log-normal distribution. The mutual information shown is the average across these 100 networks. For each network, mutual information was estimated by averaging KSG estimates over 100 simulated data sets, each containing 10,000 samples. Finally, for each choice of (σP, σC), the mutual information is normalized by its maximum across values of μ. (a) Normalized mutual information as a function of μ and private variability (σC = 0.5). (b) Normalized mutual information as a function of μ and common variability (σP = 0.5).
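The KSG (Kraskov–Stögbauer–Grassberger) estimator referenced in the caption can be sketched as a minimal NumPy/SciPy implementation of Kraskov et al.'s first algorithm. This is an illustrative sketch only; the estimator settings actually used for the figure (e.g., the neighbor count k) are not stated in the caption, so the default below is an assumption.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """KSG mutual information estimate in nats (Kraskov et al., algorithm 1).

    x, y : arrays of shape (N,) or (N, d) holding paired samples.
    k    : number of nearest neighbors (k=3 is an assumed default,
           not the value used for the figure).
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    # Chebyshev distance to the k-th nearest neighbor in the joint space
    # (query k+1 points because each point is its own nearest neighbor).
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    # Count neighbors strictly within eps in each marginal space. The
    # count includes the query point itself, so it equals n_x + 1,
    # which is exactly the argument the digamma terms require.
    nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf,
                                     return_length=True)
    ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf,
                                     return_length=True)
    return digamma(k) + digamma(n) - np.mean(digamma(nx) + digamma(ny))
```

As a sanity check, the estimate can be compared against the closed form for jointly Gaussian variables, I = −½ log(1 − ρ²); averaging such estimates over repeated simulated data sets, as described in the caption, reduces their variance.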
