## Abstract

For most multistate Hopfield neural networks, the stability conditions in asynchronous mode are known, whereas those in synchronous mode are not. If these networks converged in synchronous mode, recall could be accelerated by parallel processing. However, complex-valued Hopfield neural networks (CHNNs) with a projection rule do not converge in synchronous mode. In this work, we instead provide stability conditions in synchronous mode for hyperbolic Hopfield neural networks (HHNNs), which offer better noise tolerance than CHNNs. We apply the stability conditions to the projection rule and show that HHNNs with a projection rule converge in synchronous mode. Computer simulations show that the projection rule for HHNNs in synchronous mode maintains a high noise tolerance.

## 1 Introduction

Several multistate models of Hopfield neural networks have been studied in recent years. In particular, many researchers have studied complex-valued Hopfield neural networks (CHNNs) and have often applied them to the storage of image data (Aoki & Kosugi, 2000; Aoki, 2002; Hirose, 2003, 2012, 2013; Isokawa et al., 2018; Jankowski, Lozowski, & Zurada, 1996; Muezzinoglu, Guzelis, & Zurada, 2003; Nitta, 2009; Tanaka & Aihara, 2009; Zheng, 2014). CHNNs have been extended to the following hypercomplex-valued Hopfield neural networks:

- Hyperbolic-valued Hopfield neural networks (HHNNs; Kuroe, Tanigawa, & Iima, 2011; Kobayashi, 2013, 2016, 2018c, 2019, 2020).
- Quaternion-valued Hopfield neural networks (de Castro & Valle, 2017; Kobayashi, 2017a; Minemoto, Isokawa, Nishimura, & Matsui, 2016, 2017; Isokawa, Nishimura, Kamiura, & Matsui, 2006, 2007, 2008; Isokawa, Nishimura, & Matsui, 2012; Isokawa, Nishimura, Saitoh, Kamiura, & Matsui, 2008; Song & Chen, 2018; Valle, 2014; Valle & de Castro, 2016; Valle & de Castro, 2018).
- Commutative quaternion-valued Hopfield neural networks (Isokawa, Nishimura, & Matsui, 2010; Kobayashi, 2018b).

Some of these models have been implemented as alternatives to CHNNs. Among them, HHNNs provide the best noise tolerance (Kobayashi, 2018c). A rotor Hopfield neural network (RHNN) is another alternative to a CHNN (Kitahara & Kobayashi, 2014). An RHNN is defined using vector-valued neurons and matrix-valued weights. It has twice as many weight parameters as a CHNN or an HHNN and provides better noise tolerance than either. Under the usual conditions on the weights, a Hopfield neural network converges to a fixed point in asynchronous mode, whereas it converges or is trapped at a cycle of length 2 in synchronous mode (Kobayashi, 2017b). If it converges in synchronous mode, recall is expected to be much faster than in asynchronous mode. Unfortunately, a CHNN with a projection rule does not converge in synchronous mode; in computer simulations, it was trapped at a cycle of length 2 (Kobayashi, 2017b). Although an RHNN converges in synchronous mode, it requires twice as many weight parameters (Kobayashi, 2018a). Thus, it is desirable that an HHNN converge in synchronous mode, because a CHNN and an HHNN have the same number of weight parameters. In this letter, we prove that an HHNN with a projection rule converges in synchronous mode. First, we provide the stability conditions of HHNNs in synchronous mode. Next, we prove that the projection rule satisfies the stability conditions. Thus, an HHNN with a projection rule converges in synchronous mode. Finally, we evaluate the HHNNs in synchronous mode by computer simulations, which confirm that they maintain the high noise tolerance of the asynchronous mode.

## 2 Hyperbolic Hopfield Neural Networks

In asynchronous mode, the following stability conditions are provided (Kobayashi, 2018c):

$W^T = W$,

$\operatorname{diag} W = O$.

When $W$ satisfies the stability conditions, the HHNN converges to a fixed point in asynchronous mode.
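As a concrete illustration (not part of the original text), a hyperbolic number $a + bu$ with $u^2 = +1$ can be stored as a pair of real components, and a hyperbolic weight matrix as a pair of real matrices. A minimal sketch of this representation, together with a check of the asynchronous stability conditions, might look as follows; the pair representation and function names are our own:

```python
import numpy as np

# A hyperbolic number a + b*u (with u*u = +1) is stored as the pair (a, b),
# so a hyperbolic matrix is a pair of real matrices (A, B).

def hmul(x, y):
    """Multiply two hyperbolic matrices given as (A, B) pairs.
    (A1 + B1*u)(A2 + B2*u) = (A1 A2 + B1 B2) + (A1 B2 + B1 A2)*u."""
    (A1, B1), (A2, B2) = x, y
    return (A1 @ A2 + B1 @ B2, A1 @ B2 + B1 @ A2)

def is_stable_async(W, tol=1e-10):
    """Check the asynchronous stability conditions W^T = W and diag(W) = O."""
    A, B = W
    symmetric = np.allclose(A, A.T, atol=tol) and np.allclose(B, B.T, atol=tol)
    zero_diag = (np.allclose(np.diag(A), 0, atol=tol)
                 and np.allclose(np.diag(B), 0, atol=tol))
    return symmetric and zero_diag
```

Note that both conditions are checked componentwise, since a hyperbolic matrix is symmetric with zero diagonal exactly when both of its real components are.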

## 3 Stability Conditions

We provide the stability conditions for HHNNs in synchronous mode. Since the elements of $W$ are hyperbolic numbers, we have to modify the definition of a nonnegative definite matrix to define the stability conditions for HHNNs in synchronous mode.

Suppose that a hyperbolic matrix $M$ is symmetric. If $\mathrm{Re}(z^T M z) \geq 0$ for all $z$, then $M$ is said to be hyperbolic nonnegative definite.
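Since $u^2 = +1$, a short calculation with $M = A + Bu$ and $z = x + yu$ gives $\mathrm{Re}(z^T M z) = x^T A x + y^T A y + x^T B y + y^T B x$, which is the quadratic form of the real block matrix $\begin{pmatrix} A & B \\ B & A \end{pmatrix}$. The definition can therefore be checked numerically by a real eigenvalue test; this reduction is our own reformulation, not part of the original text:

```python
import numpy as np

def re_quadratic_form(M, z):
    """Re(z^T M z) for hyperbolic M = (A, B) and hyperbolic vector z = (x, y)."""
    (A, B), (x, y) = M, z
    return x @ A @ x + y @ A @ y + x @ B @ y + y @ B @ x

def is_hyperbolic_nonneg_def(M, tol=1e-10):
    """Re(z^T M z) >= 0 for all z iff the real block matrix [[A, B], [B, A]]
    is positive semidefinite, which we test via its smallest eigenvalue."""
    A, B = M
    block = np.block([[A, B], [B, A]])
    return np.min(np.linalg.eigvalsh((block + block.T) / 2)) >= -tol
```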

We provide the stability conditions in synchronous mode:

$W$ is symmetric.

$W$ is hyperbolic nonnegative definite.

In the same way as for CHNNs and RHNNs, we can prove that if $W$ is symmetric, the HHNN converges to a fixed point or is trapped at a cycle of length 2 in synchronous mode (Kobayashi, 2018a). We then prove that the HHNN converges to a fixed point if the stability conditions are satisfied.
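The synchronous behavior just described, convergence to a fixed point or trapping at a cycle of length 2, can be detected with a generic iteration loop. The sketch below is our own and takes the update map as an abstract parameter rather than committing to a particular HHNN activation function:

```python
def synchronous_recall(update_all, state, max_iters=100):
    """Iterate a synchronous update (all neurons at once) until a fixed point
    or a cycle of length 2 is detected. `update_all` maps a state tuple to
    the next state tuple; it is a hypothetical stand-in for the HHNN update."""
    prev = None
    for _ in range(max_iters):
        nxt = update_all(state)
        if nxt == state:
            return state, "fixed point"
        if nxt == prev:
            return nxt, "cycle of length 2"
        prev, state = state, nxt
    return state, "no convergence"
```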

If $W$ satisfies the stability conditions, then the HHNN converges to a fixed point in synchronous mode.

We apply this theorem to the projection rule for HHNNs; that is, we prove that an HHNN employing the projection rule converges to a fixed point in synchronous mode.

$W = Z(Z^T Z)^{-1} Z^T$ satisfies the stability conditions.
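A standard fact about hyperbolic numbers is that the algebra decomposes along the idempotents $e_\pm = (1 \pm u)/2$, in which basis multiplication becomes componentwise. Under this decomposition the projection rule splits into two real projection matrices, which makes both stability conditions visible. The following sketch is our own formulation (the pair representation and `projection_rule` are illustrative, and both idempotent components of $Z$ are assumed to have full column rank):

```python
import numpy as np

def projection_rule(A, B):
    """Projection rule W = Z (Z^T Z)^{-1} Z^T for Z = A + B*u (u*u = +1).
    In the idempotent basis e± = (1 ± u)/2, hyperbolic matrix products become
    componentwise real products, so W splits into two real projection matrices."""
    P, Q = A + B, A - B                      # idempotent coordinates of Z
    WP = P @ np.linalg.solve(P.T @ P, P.T)   # real projection onto col(P)
    WQ = Q @ np.linalg.solve(Q.T @ Q, Q.T)   # real projection onto col(Q)
    return (WP + WQ) / 2, (WP - WQ) / 2      # back to W = A_W + B_W * u
```

Because `WP` and `WQ` are real orthogonal projections, they are symmetric and positive semidefinite, so the reconstructed $W$ is symmetric and hyperbolic nonnegative definite, and the stored patterns are fixed points ($WZ = Z$).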

CHNNs and HHNNs are particular instances of RHNNs. It is known that an RHNN employing a projection rule converges in synchronous mode (Kobayashi, 2018a). By the theorems above, an HHNN employing the projection rule also converges in synchronous mode. However, a CHNN may be trapped at a cycle of length 2 (Kobayashi, 2017b).

## 4 Computer Simulations

We evaluate noise tolerance in synchronous mode by computer simulations and compare it with that in asynchronous mode. First, randomly generated training patterns and impulsive noise are employed. The parameters are $N=200$, $P=10,20,30$, and $K=32,64,128$. One hundred sets of training patterns are generated for each pair $(K,P)$, and 100 trials are conducted per training set; the total number of trials is 10,000. Impulsive noise replaces each neuron state with probability $r$, where $r$ varies from 0.0 to 0.8 in steps of 0.05; the new state is selected from $V$ uniformly at random. If the original training pattern is retrieved, the trial is regarded as successful. Figure 1 shows the simulation result. The HHNNs in synchronous mode slightly outperform those in asynchronous mode, and the difference increases as $P$ increases. The noise tolerance is almost independent of $K$ in both modes.
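The impulsive-noise procedure above might be sketched as follows, assuming states are encoded as integer indices $0,\dots,K-1$ into the state set $V$ (an encoding of our choosing, not specified in the text):

```python
import numpy as np

def add_impulsive_noise(state, K, r, rng=None):
    """With probability r, replace each neuron's state by a state drawn
    uniformly from the K-element state set V (states are indices 0..K-1;
    the uniform draw may coincide with the original state)."""
    rng = np.random.default_rng(rng)
    noisy = state.copy()
    mask = rng.random(state.shape) < r
    noisy[mask] = rng.integers(0, K, size=mask.sum())
    return noisy
```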

Next, gray-scale images and gaussian noise are employed as examples of real data. The color images of the CIFAR-10 data set are transformed to 256-level gray-scale images of $32 \times 32$ pixels for training data (Krizhevsky, 2009). Therefore, the resolution factor and the number of neurons are fixed to $K=256$ and $N=1024$, respectively. Figure 2 shows some samples of the gray-scale images. We attempt to retrieve the original images from those corrupted by gaussian noise; Figure 3 shows samples of gray-scale images with gaussian noise. For each pair $(P,\sigma)$, where $\sigma$ is the standard deviation, 1000 trials are conducted. The simulation result is shown in Figure 4. In both modes, the results are almost identical. Figure 5 shows the average loop counts until convergence for $P=80$. The HHNNs in synchronous mode need about twice as many loops as those in asynchronous mode. However, all the neurons are updated simultaneously in synchronous mode, whereas they are updated one at a time in asynchronous mode. Therefore, if the neurons of a synchronous HHNN are updated in parallel, one asynchronous loop takes $N$ times the processing time of one synchronous loop. From the simulation results, the asynchronous mode is thus about $N/2$ times slower than the synchronous mode.
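The gaussian-noise corruption of the 256-level gray-scale images could be sketched as below; clipping and rounding back to valid gray levels is our assumption about the preprocessing, not a detail stated in the text:

```python
import numpy as np

def add_gaussian_noise(image, sigma, K=256, rng=None):
    """Add gaussian noise with standard deviation sigma to a K-level
    gray-scale image, then round and clip back to the valid levels 0..K-1."""
    rng = np.random.default_rng(rng)
    noisy = image.astype(float) + rng.normal(0.0, sigma, image.shape)
    return np.clip(np.rint(noisy), 0, K - 1).astype(int)
```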

## 5 Conclusion

Many hypercomplex-valued Hopfield neural networks have been studied. Although stability in synchronous mode is necessary for parallel processing, stability conditions in synchronous mode had not been provided for any hypercomplex-valued Hopfield neural network. In this study, we provide stability conditions for HHNNs in synchronous mode. We also prove that the projection rule satisfies the stability conditions. Computer simulations show that HHNNs in synchronous mode maintain their high noise tolerance. In addition, we show that parallel processing accelerates recall. Although this theory has already been applied to RHNNs, it cannot be applied to CHNNs (Kobayashi, 2017b, 2018a). We plan to study the stability conditions for other hypercomplex-valued Hopfield neural networks.

## Appendix: Algebra of Hyperbolic Numbers

## References
