ETASR - Engineering, Technology & Applied Science Research Vol. 2, No. 1, 2012, 167-172

Recognition of Generalized Patterns by a Differential Polynomial Neural Network

Ladislav Zjavka
Faculty of Management Science and Informatics
University of Žilina
Žilina, Slovakia
lzjavka@gmail.com

Abstract — Many problems involve unknown data relations, whose identification can serve as a generalization of their qualities. In this case the relative values of variables are applied, rather than the absolute values, which makes better use of data properties over a wide range of validity. This resembles the functionality of the brain, which appears to generalize relations of variables as well, more than it resembles common pattern classification. The differential polynomial neural network is a new type of neural network, designed by the author, which constructs and approximates an unknown differential equation of dependent variables using a special type of root multi-parametric polynomials. It creates fractional partial differential terms describing the mutual derivative changes of some variables, as a differential equation does. Particular polynomials capture the relations of given combinations of input variables. This type of identification is based not on whole-pattern similarity but only on the learned hidden generalized relations of variables.

Keywords - polynomial neural network; dependence of variables identification; differential equation approximation; rational integral function

I. INTRODUCTION

The principal disadvantage of artificial neural network (ANN) identification in general is the inability to generalize input patterns. ANNs can learn to classify any input patterns, but they utilize only the absolute values of the variables. However, these values may differ significantly while the relations among them remain the same.
As a result, ANNs can correctly recognize only patterns similar to (or incomplete versions of) those in the training set. If the input is, for example, a shape that has been moved or resized within the input matrix, neural network identification will fail. An alternative approach is to regard the input vector of variables not as a "pattern" but as a dependent, bound point set in N-dimensional space. A neural network able to learn and identify any unknown data relations should contain multi-parametric polynomial functions that capture the partial dependences of given inputs. Its response would then be the same for all patterns (sets) whose variables exhibit the trained dependence, regardless of their actual values [9]. A biological neural cell seems to apply a similar principle. Its dendrites collect electrical signals coming from other neurons. Unlike in the artificial neuron, however, some of the signals already interact within single branches (dendrites) of the neural cell (see Figure 1), much as the multiplied variables of a multi-parametric polynomial do. The parameters of the polynomial terms can represent the synapses of the cell's dendrites. These weighted combinations are summed in the cell body and transformed into relative values using time-delayed dynamic periodic activation functions (the activated neural cell generates a series of time-delayed output pulses in response to its input signals). The axon passes electrical pulse signals on to the dendrites of other neural or effector cells [1]. The period of this function depends on some of the input variables and seems to represent the derivative part of a partial term of a differential equation composition. The differential polynomial neural network (D-PNN) constructs and tries to approximate an unknown differential equation describing relations of input variables that are not entirely patterns. It forms its output as a generalization of the input patterns, similar to the generalization employed by the human brain.
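The point that relations, rather than absolute values, should identify a pattern can be illustrated with a minimal sketch (the data and function names below are illustrative, not from the paper): scaling every input of a pattern changes all absolute values but leaves the mutual ratios of the variables intact.

```python
# Minimal sketch (not from the paper): absolute values change under
# scaling, but the ratios between neighbouring variables -- the kind of
# "relations" the D-PNN is meant to learn -- stay the same.
pattern = [2.0, 4.0, 8.0]             # hypothetical 3-variable input
scaled = [5.0 * x for x in pattern]   # same relations, different values

def ratios(p):
    """Relative description: ratio of each variable to its successor."""
    return [p[i] / p[i + 1] for i in range(len(p) - 1)]

print(ratios(pattern))  # [0.5, 0.5]
print(ratios(scaled))   # [0.5, 0.5] -- identical despite the scaling
```

A classifier trained on the absolute values would see two different inputs here; a model of the relations sees one and the same dependence.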
It creates a structural model describing any unknown relationships of the input variables. D-PNN is based on the GMDH (Group Method of Data Handling) polynomial neural network, created by the Ukrainian scientist Aleksey Ivakhnenko in 1968, when the back-propagation technique was not yet known. He attempted to decompose the complexity of a process into many simpler relationships, each described by a low-order 2-variable polynomial processing function of a single neuron [2].

Fig. 1. A biological neural cell

II. DIFFERENTIAL POLYNOMIAL NEURAL NETWORK

The basic idea of the D-PNN is to create and approximate a differential equation (DE) (3), which is not known in advance [3], with a special type of root (power) fractional multi-parametric polynomials (5).

a + \sum_{i=1}^{n} b_i \frac{\partial u}{\partial x_i} + \sum_{i=1}^{n} \sum_{j=1}^{n} c_{ij} \frac{\partial^2 u}{\partial x_i \partial x_j} + \ldots = 0, \qquad u = \sum_{k=1}^{\infty} u_k    (3)

u(x1, x2, ..., xn) - searched function of all input variables
a, B(b1, b2, ..., bn), C(c11, c12, ...) - parameters

Fourier's method of partial DE solution searches for the solution in the form of a product of 2 functions, of which at least 1 depends on only 1 variable. A partial derivative of a function z(x, y) of 2 input variables x, y can be expressed by (4) [4].

\frac{\partial z(x, y)}{\partial x} = f_1(x) \cdot f_2(z)    (4)

Elementary methods of differential equation solution express the solution in special elementary functions – polynomials (e.g. Bessel functions, Fourier power series). Numerical integration of differential equations is based on their approximation through:

• rational integral functions
• trigonometric series

The first, and simpler, way has been selected, using the method of integral analogues, which replaces mathematical operators and symbols in the DE with ratios of the corresponding variables.
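The GMDH building block described above can be sketched as follows: a minimal illustration (the data and function names are hypothetical, not taken from the paper) of a single low-order 2-variable polynomial neuron, y = a0 + a1·x1 + a2·x2 + a3·x1·x2 + a4·x1² + a5·x2², whose six parameters are estimated by ordinary least squares.

```python
import numpy as np

# Hypothetical sketch (not code from the paper): a single GMDH neuron
# computes a low-order 2-variable polynomial
#   y = a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2
# with its six parameters fitted by ordinary least squares.
def design_matrix(x1, x2):
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

def fit_gmdh_neuron(x1, x2, y):
    coeffs, *_ = np.linalg.lstsq(design_matrix(x1, x2), y, rcond=None)
    return coeffs

def predict(coeffs, x1, x2):
    return design_matrix(x1, x2) @ coeffs

rng = np.random.default_rng(0)
x1, x2 = rng.uniform(0.0, 1.0, 100), rng.uniform(0.0, 1.0, 100)
y = 1.0 + 2.0 * x1 * x2 - 0.5 * x2**2   # target relation to be recovered
coeffs = fit_gmdh_neuron(x1, x2, y)
print(np.round(coeffs, 3))              # approx. [1, 0, 0, 2, 0, -0.5]
```

A GMDH network stacks such neurons in layers, keeping only the best-performing input-pair combinations; the D-PNN replaces this plain polynomial with the fractional, root-type terms of (5).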
Derivatives are replaced by their integral analogues, i.e. the derivative operators are removed and all operators are simultaneously replaced by similarity or proportion marks in the equations, while all vectors are replaced by their absolute values. Dimensional terms are divided by other terms, which results in the searched non-dimensional likeness criterions [5].

y_i = \frac{\left(a_0 + a_1 x_1 + a_2 x_2 + a_3 x_1 x_2 + \ldots + a_n x_n\right)^{\frac{m}{n}}}{b_0 + b_1 x_1 + \ldots + b_m x_m} = \frac{\partial^m f(x_1, x_2, \ldots, x_n)}{\partial x_1 \, \partial x_2 \ldots \partial x_m}    (5)

n – combination degree of the n-input-variable polynomial of the numerator
m – combination degree of the denominator (m