Kernel canonical correlation analysis (kernel CCA) is a nonlinear extension of CCA that aims to extract the information shared by two random variables. It has wide applications in many fields, such as information retrieval. This paper analyzes the convergence rate of kernel CCA under certain approximation conditions and offers suggestions on how to choose the regularization parameter. The result shows that the convergence rate depends on only two parameters: the decay rate of the regularization parameter and the decay rate of the eigenvalues of the compact operator V_{YX}; this gives a better understanding of kernel CCA.
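As a concrete illustration of the method the abstract analyzes, the following is a minimal sketch of regularized kernel CCA in Python/NumPy. It is not the paper's algorithm; the RBF bandwidth `gamma`, the regularization scaling `n*reg`, and the simplified eigenproblem formulation (top eigenvalue of (Kx + n reg I)^{-1} Ky (Ky + n reg I)^{-1} Kx approximating the squared first canonical correlation) are illustrative assumptions.

```python
import numpy as np

def center(K):
    """Center a Gram matrix in feature space: K -> H K H with H = I - 11^T/n."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def rbf_gram(X, gamma=1.0):
    """RBF (Gaussian) Gram matrix K_ij = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_cca_first_corr(X, Y, reg=0.1, gamma=1.0):
    """Estimate the first canonical correlation by regularized kernel CCA.

    Solves the (simplified, commonly used) eigenproblem of
        (Kx + n*reg*I)^{-1} Ky (Ky + n*reg*I)^{-1} Kx,
    whose top eigenvalue approximates the squared first canonical
    correlation; regularization keeps the problem well posed.
    """
    n = X.shape[0]
    Kx = center(rbf_gram(X, gamma))
    Ky = center(rbf_gram(Y, gamma))
    Rx = np.linalg.solve(Kx + n * reg * np.eye(n), Ky)
    Ry = np.linalg.solve(Ky + n * reg * np.eye(n), Kx)
    eigvals = np.linalg.eigvals(Rx @ Ry)
    return float(np.sqrt(max(eigvals.real.max(), 0.0)))
```

Without the `reg` term the empirical problem is degenerate (every correlation estimate collapses to 1), which is why the convergence analysis is stated in terms of the rate at which the regularization parameter decays.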
In regularized kernel methods, the solution of a learning problem is found by minimizing a functional consisting of an empirical risk term and a regularization term. In this paper, we study the existence of an optimal solution to multi-kernel regularization learning. First, we improve a previous result on this problem due to Micchelli and Pontil, proving that an optimal solution exists whenever the kernel set is compact. Second, we consider the problem for Gaussian kernels with variance σ ∈ (0, ∞) and give conditions under which an optimal solution exists.
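A hedged sketch of the setup in this abstract: for kernel ridge regression with a fixed kernel, the minimum of the regularized functional has the closed form λ yᵀ(K + nλI)⁻¹y, so optimizing over a finite (hence compact) set of Gaussian kernels reduces to comparing these values. The bandwidth grid, `lam`, and function names below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def gauss_gram(X, sigma):
    """Gaussian kernel Gram matrix with bandwidth sigma."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def krr_objective(X, y, sigma, lam):
    """Minimum over f in H_K of (1/n)||y - f||^2 + lam*||f||_K^2.

    For the squared loss this minimum has the closed form
    lam * y^T (K + n*lam*I)^{-1} y (kernel ridge regression).
    """
    n = X.shape[0]
    K = gauss_gram(X, sigma)
    return float(lam * y @ np.linalg.solve(K + n * lam * np.eye(n), y))

def best_sigma(X, y, sigmas, lam=0.1):
    """Select the Gaussian bandwidth minimizing the regularized risk
    over a finite grid of candidate kernels; a minimizer always exists
    here because the candidate set is finite (hence compact)."""
    vals = [krr_objective(X, y, s, lam) for s in sigmas]
    i = int(np.argmin(vals))
    return sigmas[i], vals[i]
```

Over the full open range σ ∈ (0, ∞) the candidate set is not compact, and the infimum may fail to be attained; that is exactly the gap the abstract's conditions address.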