Figure 1:
Single-layer network architecture with $k$ multicompartmental neurons for outputting the sum of the canonical correlation subspace projections (CCSPs) $z = (z_1, \ldots, z_k)$. See algorithm 2. Here $a = W_x x$ and $b = W_y y$ are projections of the views $x = (x_1, \ldots, x_m)$ and $y = (y_1, \ldots, y_n)$ onto a common $k$-dimensional subspace. The output, $z = M^{-1}(a + b)$, is the sum of the CCSPs and is computed using recurrent lateral connections. The components of $a$, $b$, and $z$ are represented in three separate compartments of the neurons. Filled circles denote non-Hebbian synapses, and empty circles denote anti-Hebbian synapses. Importantly, each synaptic update depends only on variables represented locally.
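The following is a minimal NumPy sketch of the computation described in the caption only: feedforward projections $a = W_x x$ and $b = W_y y$, followed by recurrent lateral dynamics whose fixed point is $z = M^{-1}(a + b)$. The weights $W_x$, $W_y$, and the lateral matrix $M$ are random placeholders here; in the paper they are learned with the non-Hebbian and anti-Hebbian plasticity rules of algorithm 2, which this sketch does not implement.

```python
import numpy as np

# Illustrative dimensions: m, n are the view dimensions, k the output dimension.
m, n, k = 5, 4, 2
rng = np.random.default_rng(0)

# Placeholder weights (in the model these are learned, not fixed).
Wx = rng.standard_normal((k, m))
Wy = rng.standard_normal((k, n))
M = np.eye(k) + 0.1 * rng.standard_normal((k, k))
M = 0.5 * (M + M.T) + k * np.eye(k)   # keep M symmetric positive definite

x = rng.standard_normal(m)
y = rng.standard_normal(n)

# Feedforward projections onto the common k-dimensional subspace.
a = Wx @ x
b = Wy @ y

# Recurrent lateral dynamics: dz/dt = a + b - M z,
# whose equilibrium is z = M^{-1}(a + b), the sum of the CCSPs.
z = np.zeros(k)
dt = 0.05
for _ in range(2000):
    z += dt * (a + b - M @ z)

assert np.allclose(z, np.linalg.solve(M, a + b), atol=1e-6)
print(z)
```

The iterative loop stands in for the network's recurrent lateral connections: rather than inverting $M$ explicitly, the neural dynamics relax to the same equilibrium $z = M^{-1}(a + b)$.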
