Title

Approximation of kernel matrices in machine learning

Date of Award

2009

Degree Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Mathematics

Advisor(s)

Yuesheng Xu

Keywords

Kernel, Machine learning, Hilbert space

Subject Categories

Physical Sciences and Mathematics

Abstract

Kernels are popular in a variety of fields such as approximation, interpolation, meshless methods, neural networks, and machine learning. A common computational task in these kernel-based methods is to compute the inverses of the matrices generated by a kernel function and a set of points. This work focuses on developing fast algorithms for computing these inverses by approximating the kernel matrices with related multilevel circulant matrices, so that the fast Fourier transform can be applied to reduce the computational cost. In the first part of this thesis, we introduce two classes of matrices that contain the kernel matrices under different assumptions on the kernel function and the data points. We present properties of these two classes of matrices, such as the approximation behavior of their elements and of certain functions of their inverses. Moreover, we give a convergence analysis of approximating the kernel matrices by related multilevel circulant matrices, based on the properties of these two classes. The second part of this thesis addresses applications of this approximation technique in machine learning. After introducing the formulation of two common problems in machine learning, we present fast algorithms for these problems and give a convergence analysis based on the results obtained in Part I.
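The following is a minimal sketch, not taken from the thesis, of the idea summarized above: on a uniform grid, the matrix of a translation-invariant kernel is approximated by a circulant matrix, so a regularized kernel system can be solved with the FFT in O(n log n) rather than by a dense O(n^3) solve. The Gaussian kernel, grid size, spacing, and regularization parameter are illustrative assumptions.

import numpy as np

n, h, lam = 256, 0.05, 1e-3            # grid size, spacing, regularization (illustrative)
x = h * np.arange(n)                    # uniform 1-D grid
k = lambda r: np.exp(-r**2)             # a translation-invariant (Gaussian) kernel

# Exact kernel matrix K[i, j] = k(|x_i - x_j|)
K = k(np.abs(x[:, None] - x[None, :]))

# Circulant approximation: periodize the distances so the first column c
# defines a circulant matrix C whose eigenvalues are fft(c).
d = h * np.minimum(np.arange(n), n - np.arange(n))
c = k(d)

y = np.sin(x)                           # sample right-hand side

# Dense solve of the regularized system (K + lam*I) a = y: O(n^3)
a_direct = np.linalg.solve(K + lam * np.eye(n), y)

# FFT-based solve of the circulant surrogate (C + lam*I) a = y: O(n log n)
a_fft = np.fft.ifft(np.fft.fft(y) / (np.fft.fft(c) + lam)).real

# Relative difference between the two solutions
print(np.linalg.norm(a_direct - a_fft) / np.linalg.norm(a_direct))

The one-dimensional case shown here uses an ordinary circulant matrix; the thesis treats the multilevel (multidimensional) analogue, where the FFT is applied along each level.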

Comments

ISBN 9781109507003

http://libezproxy.syr.edu/login?url=http://proquest.umi.com/pqdweb?index=9&sid=2&srchmode=1&vinst=PROD&fmt=6&startpage=-1&clientid=3739&vname=PQD&RQT=309&did=1913184141&scaling=FULL&ts=1283264809&vtype=PQD&rqt=309&TS=1283272712&clientId=3739
