ICNPAA World Congress 2020

Multivariate Integral Kernel Separation Methodology: Tridiagonal Multivariate Integral Kernel Enhanced Multivariance Products Representation (TMIKEMPR)
Ayla Okan, Zeynep Gündoğar

Last modified: 2020-02-02

Abstract


In recent years, kernel functions have received major attention and have been used in many scientific research areas such as machine learning, neural networks, genetic algorithms, etc. They are mostly used within integration or summation procedures, depending on the domain structure. Kernels are multivariate functions involving two different types of independent variables. If a kernel is denoted by K(x,y), then the argument sets x and y contain the same number of scalar independent variables. The x and y variables play different roles in integration or summation over them: the y variables can be called “operand” or “operation” coordinates, while the x variables can be considered “image” variables. If the kernel satisfies the equality K(y,x) = K(x,y), then it is called symmetric. In this work, our purpose is to decompose a given kernel function so that it can be expressed as a (finite or denumerably infinite) linear combination of functions, each of which is a binary product of two functions depending only on either x or y. One way to this end is the eigenfunction expansion; we prefer instead to develop a new TKEMPR-like algorithm based on an appropriately defined four-component partially bivariate EMPR expansion. This is done in such a way that the scalar independent variables are replaced by two sets of scalars, x and y. This new bivariate EMPR is then constructed through the same philosophy as the usual bivariate EMPR, by replacing single-fold integrations with N-tuple integrations. The remaining steps follow exactly the tridiagonalization used in TKEMPR.
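
For concreteness, the separated form sought above can be written (in illustrative notation; the symbols α_k, f_k, and g_k below are not taken from the authors' derivation) as

    K(x,y) = \sum_{k} \alpha_k \, f_k(x) \, g_k(y),

where each f_k depends only on the x variables and each g_k only on the y variables. For a symmetric kernel, the eigenfunction expansion mentioned as an alternative route is a Mercer-type instance of this form,

    K(x,y) = \sum_{k} \lambda_k \, \varphi_k(x) \, \varphi_k(y), \qquad \int K(x,y) \, \varphi_k(y) \, dy = \lambda_k \, \varphi_k(x),

with the integral over y understood as an N-tuple integration over all scalar components of y.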
Both authors are very grateful to Professor Metin Demiralp for his invaluable supervisory support of this work during these studies.