Approximation by fully complex multilayer perceptrons

Cited by: 154
Authors
Kim, T
Adali, T
Affiliations
[1] Mitre Corp, Mclean, VA 22102 USA
[2] Univ Maryland Baltimore Cty, Dept Comp Sci & Elect Engn, Baltimore, MD 21250 USA
Keywords
DOI: 10.1162/089976603321891846
Chinese Library Classification: TP18 [Theory of Artificial Intelligence]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
We investigate the approximation ability of the multilayer perceptron (MLP) network when it is extended to the complex domain. The main obstacle to processing complex data with neural networks has been the lack of activation functions that are both bounded and analytic in the complex domain: by Liouville's theorem, an entire function that is bounded on the whole complex plane must be constant. To avoid this conflict between boundedness and analyticity, a number of ad hoc MLPs have traditionally been employed, including the use of two real-valued MLPs, one processing the real part and the other the imaginary part. However, since nonanalytic functions do not satisfy the Cauchy-Riemann equations, they lead to degenerate backpropagation algorithms that compromise the efficiency of nonlinear approximation and learning in the complex vector field. A number of elementary transcendental functions (ETFs), derivable from the entire exponential function e^z and analytic almost everywhere, are defined as fully complex activation functions and are shown to provide a parsimonious structure for processing data in the complex domain, addressing most of the shortcomings of the traditional approach. The introduction of ETFs, however, raises a new question about the approximation capability of the fully complex MLP. In this letter, three proofs of the approximation capability of the fully complex MLP are provided, based on the singularity characteristics of the ETFs. First, fully complex MLPs with continuous ETFs over a compact set in the complex vector field are shown to be universal approximators of any continuous complex mapping. The complex universal approximation theorem then extends to bounded measurable ETFs possessing a removable singularity. Finally, it is shown that the output of complex MLPs using ETFs with isolated or essential singularities converges uniformly to any nonlinear mapping in the deleted annulus of the singularity nearest the origin.
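The fully complex approach described in the abstract can be sketched as a forward pass in which weights, biases, and inputs are all complex-valued and a single ETF (here tanh, analytic except at isolated singularities) is applied directly to the complex pre-activations, rather than running two real-valued MLPs on the real and imaginary parts. This is a minimal illustrative sketch assuming NumPy; the layer sizes and the choice of tanh are the author's examples of the idea, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes (not from the paper).
n_in, n_hidden, n_out = 2, 8, 1

# Fully complex parameters: independent Gaussian real and imaginary parts.
W1 = rng.standard_normal((n_hidden, n_in)) + 1j * rng.standard_normal((n_hidden, n_in))
b1 = rng.standard_normal(n_hidden) + 1j * rng.standard_normal(n_hidden)
W2 = rng.standard_normal((n_out, n_hidden)) + 1j * rng.standard_normal((n_out, n_hidden))
b2 = rng.standard_normal(n_out) + 1j * rng.standard_normal(n_out)

def fully_complex_mlp(z):
    """Single-hidden-layer fully complex MLP forward pass.

    The ETF activation tanh is applied to the complex pre-activation
    as a whole, so the Cauchy-Riemann equations hold away from its
    isolated singularities (unlike split real/imaginary processing).
    """
    h = np.tanh(W1 @ z + b1)  # NumPy evaluates tanh of complex arguments directly
    return W2 @ h + b2

z = np.array([0.5 + 0.2j, -0.3 + 1.0j])  # a complex input vector
y = fully_complex_mlp(z)
print(y.shape, np.iscomplexobj(y))
```

The contrast with the "two real-valued MLPs" workaround is that a single set of complex weights couples the real and imaginary parts through complex multiplication, which is what makes the analytic-ETF network a parsimonious approximator of complex mappings.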
Pages: 1641-1666 (26 pages)