Complex Valued Recurrent Neural Network: From Architecture to Training


Recurrent neural networks have a long history, and dozens of different architectures have been published. In this paper we generalize recurrent architectures to a state-space model and extend the numbers the network processes to the complex domain. We show how to train the recurrent network in the complex-valued case, and we present theorems and procedures that make the training stable. We also show that the complex-valued recurrent neural network is a generalization of its real-valued counterpart and has specific advantages over the latter. We conclude the paper with a discussion of possible applications and usage scenarios for these networks.
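The generalization the abstract describes can be illustrated with a minimal sketch of one step of a complex-valued recurrent state-space cell. The update rule s_t = tanh(A s_{t-1} + B x_t), y_t = C s_t and the matrix names A, B, C are illustrative assumptions, not the paper's exact formulation; numpy's tanh extends directly to complex arguments, and keeping the weights small keeps activations away from the poles of the complex tanh, echoing the stability concerns the paper addresses.

```python
import numpy as np

def crnn_step(s_prev, x, A, B, C):
    """One step of a (hypothetical) complex-valued state-space recurrence:
    s_t = tanh(A s_{t-1} + B x_t),  y_t = C s_t.
    All matrices and vectors may be complex; np.tanh is applied elementwise
    and accepts complex input."""
    s = np.tanh(A @ s_prev + B @ x)
    return s, C @ s

rng = np.random.default_rng(0)
n, m = 4, 2  # state and input dimensions (illustrative)
# Small complex weights keep activations in a well-behaved region of tanh.
A = 0.1 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
B = 0.1 * (rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m)))
C = rng.standard_normal((1, n)) + 1j * rng.standard_normal((1, n))

s = np.zeros(n, dtype=complex)
for t in range(10):
    x = rng.standard_normal(m) + 1j * rng.standard_normal(m)
    s, y = crnn_step(s, x, A, B, C)
print(y.shape, np.iscomplexobj(y))
```

If A, B, C and the inputs are all real, the same recursion produces real states and outputs, which is the sense in which the complex-valued network contains the real-valued one as a special case.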


A. Minin, A. Knoll and H. Zimmermann, "Complex Valued Recurrent Neural Network: From Architecture to Training," Journal of Signal and Information Processing, Vol. 3, No. 2, 2012, pp. 192-197. doi: 10.4236/jsip.2012.32026.

Conflicts of Interest

The authors declare no conflicts of interest.



Copyright © 2022 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.