The successor representation is known to relate to temporal associations learned in the temporal context model (Gershman et al., 2012), and subsequent work suggests that the successor representation is relevant across spatial, visual, and abstract relational tasks. I demonstrate that the successor representation and purely associative learning have an even deeper relationship than initially indicated: Hebbian temporal associations are an unnormalized form of the successor representation, such that the two are identical whenever all states are visited equally often, and remain highly correlated in practice even when the state distribution is nonuniform.
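As a rough illustration of the claimed relationship (a sketch under stated assumptions, not the paper's own derivation), suppose the successor representation takes its standard discounted form, M = (I - γT)^{-1}, and Hebbian temporal association is modeled as exponentially discounted co-occurrence counts accumulated along a random walk. When the transition matrix is doubly stochastic, so that all states are visited equally often, the learned association matrix becomes proportional to M; the γ, T, and eligibility-trace details below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Lazy symmetric random walk on a ring of N states: doubly stochastic,
# so the long-run state distribution is uniform (and the chain is aperiodic).
N, gamma = 8, 0.9
T = np.zeros((N, N))
for i in range(N):
    T[i, i] = 0.2
    T[i, (i - 1) % N] = 0.4
    T[i, (i + 1) % N] = 0.4

# Successor representation: M = sum_k gamma^k T^k = (I - gamma T)^{-1}
M = np.linalg.inv(np.eye(N) - gamma * T)

# Hebbian temporal associations: exponentially discounted co-occurrence
# counts along a long random walk, accumulated with an eligibility trace.
H = np.zeros((N, N))
e = np.zeros(N)              # discounted trace of recently visited states
s = 0
for _ in range(100_000):
    e = gamma * e
    e[s] += 1.0
    H[:, s] += e             # associate each recent state with the current one
    s = rng.choice(N, p=T[s])

# With uniform state frequencies, H is (up to a scalar) the same matrix as M.
print(np.corrcoef(H.ravel(), M.ravel())[0, 1])   # close to 1.0
```

Repeating the simulation with a transition matrix whose stationary distribution is nonuniform would instead yield H approximately proportional to diag(p) M, where p holds the state visit frequencies, which is one way to read "unnormalized": the correlation typically stays high, but exact equality requires equal state frequencies.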
