Constructions of reproducing kernel Banach spaces via generalized Mercer kernels

Yuesheng Xu and Qi Ye

Abstract: This article studies the construction of reproducing kernel Banach spaces (RKBSs), which generalize reproducing kernel Hilbert spaces (RKHSs). First, we establish several advanced properties of general RKBSs, such as density, continuity, implicit representation, embedding, compactness, and representer theorems for learning methods. Next, we introduce the concept of generalized Mercer kernels to construct $p$-norm RKBSs for $1\leq p\leq\infty$. These $p$-norm RKBSs preserve the same simple format as the Mercer representation of RKHSs. Moreover, the $p$-norm RKBSs are isometrically equivalent to the standard $p$-norm spaces of countable sequences, and hence possess richer geometrical structures than RKHSs, including sparsity. More precisely, suitable countable expansion terms of a generalized Mercer kernel can be used to represent a pair of Schauder bases and biorthogonal systems of the $p$-norm RKBSs, so that the generalized Mercer kernel becomes the reproducing kernel of the $p$-norm RKBSs. The theory of generalized Mercer kernels also covers many well-known kernels, for instance, min kernels, Gaussian kernels, and power series kernels. Finally, we propose to solve support vector machines in the $p$-norm RKBSs, that is, to minimize regularized empirical risks over the $p$-norm RKBSs. We show that the infinite-dimensional support vector machines in the $p$-norm RKBSs can be equivalently transformed into finite-dimensional convex optimization problems, so that we obtain finite-dimensional representations of their support vector machine solutions for practical applications and computer programs. In particular, we verify that certain typical support vector machines in the $1$-norm RKBSs are equivalent to classical $1$-norm sparse regressions.
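The kernel expansion and the sequence-space construction described in the abstract can be sketched as follows; the symbols $\phi_n$, $\psi_n$, the domains $\Omega$, $\Omega'$, and the exact summability conditions are illustrative assumptions for this sketch, not quotations from the paper:

```latex
% A generalized Mercer kernel: a kernel admitting a countable expansion
% in two (not necessarily identical) families of expansion terms,
%
%     K(x, x') = \sum_{n \in \mathbb{N}} \phi_n(x)\, \psi_n(x'),
%     \qquad x \in \Omega, \; x' \in \Omega'.
%
% A p-norm RKBS built from the left expansion terms, isometrically
% equivalent to the sequence space \ell^p (1 \le p \le \infty):
%
%     \mathcal{B}^p_K
%       = \Bigl\{ f = \sum_{n \in \mathbb{N}} a_n \phi_n
%           : (a_n)_{n \in \mathbb{N}} \in \ell^p \Bigr\},
%     \qquad
%     \| f \|_{\mathcal{B}^p_K} = \| (a_n) \|_{\ell^p}.
```

Under suitable conditions on $(\phi_n)$ and $(\psi_n)$, these families form a Schauder basis and a biorthogonal system, and $K$ reproduces point evaluations on $\mathcal{B}^p_K$; the $p=1$ case yields the sparsity-promoting geometry mentioned at the end of the abstract.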

ArXiv: 1412.8663