## Abstract

A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. Its weight parameters, however, require a large amount of memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks to reduce the number of weight parameters, but their architectures are much more complicated than that of a CHNN and should be simplified. In this work, the number of weight parameters of a CHNN is reduced by the bicomplex projection rule, which is given by the decomposition of a bicomplex-valued Hopfield neural network. Computer simulations support that the noise tolerance of a CHNN with the bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks.

## 1 Introduction

A complex-valued Hopfield neural network (CHNN) is one of the first multistate models of Hopfield neural networks and has been studied by many researchers (Aizenberg, Ivaskiv, Yu, Pospelov, & Hudiakov, 1971, 1973; Jankowski, Lozowski, & Zurada, 1996; Noest, 1988). The CHNN has been used as a neural associative memory and applied to the storage of multilevel image data (Aoki & Kosugi, 2000; Aoki, 2002; Lee, 2006; Muezzinoglu, Guzelis, & Zurada, 2003; Tanaka & Aihara, 2009; Zheng, 2014). It has also been extended using hypercomplex numbers (Hitzer, Nitta, & Kuroe, 2013). For example, several models of quaternion-valued Hopfield neural networks (QHNNs) have been proposed. Isokawa et al. (Isokawa, Nishimura, Kamiura, & Matsui, 2006, 2007, 2008) proposed a QHNN with a split activation function, which was later extended to a phasor-represented activation function (Isokawa, Nishimura, Saitoh, Kamiura, & Matsui, 2008; Isokawa, Nishimura, & Matsui, 2009; Minemoto, Isokawa, Nishimura, & Matsui, 2016). A continuous activation function for QHNNs has also been proposed (Valle & de Castro, 2018; de Castro & Valle, 2017; Valle, 2014). Bicomplex numbers, also referred to as commutative quaternions, are hypercomplex numbers of dimension 4, like quaternions. Isokawa, Nishimura, and Matsui (2010) proposed a bicomplex-valued Hopfield neural network (BHNN) with a phasor-represented activation function.
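The multistate activation function underlying these models quantizes a complex net input to one of $K$ points on the unit circle. The following sketch illustrates one common form; the function name and exact quantization convention are ours, not taken from the cited papers, whose definitions vary in detail:

```python
import numpy as np

def multistate_activation(z, K):
    """Quantize a complex value z to the nearest of K unit-circle states.

    Returns exp(2*pi*i*k/K), where k is chosen from the phase angle of z.
    (One common form of the multistate activation; conventions differ
    between papers, e.g. in how boundary angles are resolved.)
    """
    k = np.round(np.angle(z) * K / (2 * np.pi)) % K
    return np.exp(2j * np.pi * k / K)
```

For $K = 2$ this reduces to the sign function of a real Hopfield network applied to the real axis; larger $K$ gives the multilevel states used for gray-scale image storage.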

Several models have been proposed as alternatives to the CHNN. Models with more weight parameters can provide better noise tolerance; however, since the weight parameters consume a lot of memory resources, their number should be reduced. Twin-multistate activation functions were introduced to QHNNs and BHNNs to reduce the number of weight parameters (Kobayashi, 2017, 2018a). However, the architectures of a QHNN and a BHNN with twin-multistate activation functions are more complicated than that of a CHNN. Since a CHNN has a simple architecture and has been the most widely used multistate Hopfield model, reducing the weight parameters of a CHNN is desirable. In this work, the number of weight parameters of a CHNN is reduced by the bicomplex projection rule. A projection rule is a one-shot learning algorithm, so learning is fast, unlike iterative learning algorithms such as gradient descent learning and relaxation learning. Gradient descent learning was proposed by Lee (2001) and improved by Kobayashi et al. (Kobayashi, 2016; Kobayashi, Yamada, & Kitahara, 2011). Relaxation learning was proposed by Muezzinoglu et al. (2003) and reformulated by Kobayashi (2008). Although the Hebbian rule is also a one-shot learning algorithm, its storage capacity is too small (Jankowski et al., 1996). A CHNN with the bicomplex projection rule has only half the weight parameters of a conventional CHNN, like a QHNN and a BHNN with twin-multistate activation functions, but it has a much simpler architecture. The bicomplex projection rule is obtained by decomposing a BHNN trained with a projection rule. In addition, the proposed CHNN improves on the noise tolerance of the BHNN by removing the self-feedbacks, which are a major factor in the BHNN's reduced noise tolerance. To evaluate the bicomplex projection rule, we compare the proposed CHNN with the QHNN and the BHNN by computer simulations.
Both of these comparison models likewise have half the weight parameters of a conventional CHNN.
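For reference, the standard complex-valued projection rule referred to above can be sketched as follows. This is the conventional form $W = S(S^{H}S)^{-1}S^{H}$ for a pattern matrix $S$; the bicomplex variant proposed in this work additionally ties pairs of weights together through the decomposition of a BHNN, which is not shown here, and the function name is ours:

```python
import numpy as np

def projection_rule(patterns):
    """One-shot projection learning for a complex-valued Hopfield network.

    patterns: array of shape (P, N), each row a stored complex state vector.
    Returns the N x N weight matrix W = S (S^H S)^{-1} S^H, which satisfies
    W @ xi = xi for every stored pattern xi (assuming the P patterns are
    linearly independent, so the Gram matrix is invertible).
    """
    S = np.asarray(patterns).T          # pattern matrix, shape (N, P)
    G = S.conj().T @ S                  # P x P Gram matrix S^H S
    return S @ np.linalg.inv(G) @ S.conj().T
```

Because the weights are computed in one shot from a $P \times P$ linear solve, learning cost does not grow with the number of update iterations, unlike gradient descent or relaxation learning.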

## 2 Complex-Valued Hopfield Neural Networks

## 3 Bicomplex-Valued Hopfield Neural Networks

## 4 Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks

## 5 Computer Simulations

In both simulations, only the noise tolerance of the BHNN rapidly deteriorates as $P$ increases. Although the CHNN with the bicomplex projection rule is just a decomposition of the BHNN, it is more robust against an increase in $P$; in other words, the decomposition of the BHNN improves the noise tolerance. The chief difference between the two models is the presence of self-feedbacks. Similar results have been reported and discussed for several other models, such as hyperbolic and rotor Hopfield neural networks (Kitahara & Kobayashi, 2014; Kobayashi, 2018b, 2020).
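The role of self-feedbacks can be made concrete: a nonzero diagonal term $w_{ii}x_i$ pulls neuron $i$ back toward its current, possibly corrupted, state during recall. The sketch below shows one asynchronous update sweep with the diagonal optionally zeroed; it is our illustration of the mechanism, not the exact update schedule used in the cited papers:

```python
import numpy as np

def update_sweep(W, x, K, remove_self_feedback=True):
    """One asynchronous sweep of a K-state complex Hopfield network.

    W: complex weight matrix; x: state vector of K-th roots of unity.
    When remove_self_feedback is True the diagonal of W is ignored,
    mirroring the decomposition discussed in the text; when False,
    the self-feedback term w_ii * x_i biases each neuron toward its
    current state, which can trap noisy states.
    """
    W = W.copy()
    if remove_self_feedback:
        np.fill_diagonal(W, 0)
    x = x.copy()
    for i in range(len(x)):
        z = W[i] @ x                                   # net input to neuron i
        k = np.round(np.angle(z) * K / (2 * np.pi)) % K
        x[i] = np.exp(2j * np.pi * k / K)              # multistate activation
    return x
```

Stored patterns remain fixed points either way; the difference appears only when the state is noisy, where the self-feedback term works against error correction.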

## 6 Conclusion

To reduce the number of weight parameters, twin-multistate activation functions have been introduced to QHNNs and BHNNs. Since their architectures are complicated, alternatives with simple architectures are necessary. In this work, the bicomplex projection rule was introduced to CHNNs, where two connections share a weight parameter through the decomposition of a BHNN. This method removes the self-feedbacks and thereby improves the noise tolerance of the BHNN. Computer simulations support that the noise tolerance of the proposed CHNN is better than or equal to that of QHNNs. Table 1 summarizes the self-feedbacks and update counts per loop. The noise tolerance of BHNNs deteriorates because of the self-feedbacks. Since a QHNN and a BHNN can update two multistate components simultaneously, their update counts per loop are half that of a CHNN.

Table 1: Self-feedbacks and update counts per loop.

| Model | Self-Feedbacks | Update Count |
|---|---|---|
| QHNN | Do not exist | $N/2$ |
| BHNN | Exist | $N/2$ |
| Proposed | Do not exist | $N$ |
