One of the main goals of machine learning is to study the generalization performance of learning algorithms. Most previous results describing the generalization ability of learning algorithms are based on independent and identically distributed (i.i.d.) samples. However, independence is a very restrictive assumption for both theory and real-world applications. In this paper we go far beyond this classical framework by establishing bounds on the rate of relative uniform convergence for the empirical risk minimization (ERM) algorithm with uniformly ergodic Markov chain samples. We not only obtain generalization bounds for the ERM algorithm, but also show that the ERM algorithm with uniformly ergodic Markov chain samples is consistent. The established theory underlies the application of ERM-type learning algorithms.
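The ERM principle described above can be illustrated with a minimal sketch: draw training samples from a simple uniformly ergodic Markov chain (here, a random walk on [0, 1] with a uniform restart step, a standard construction that guarantees uniform ergodicity) and select, from a finite hypothesis class of threshold classifiers, the one minimizing the empirical 0/1 risk. All names and modeling choices below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def markov_samples(n):
    """Simulate a uniformly ergodic Markov chain on [0, 1]:
    with prob. 0.5 restart uniformly, else take a small local step."""
    x = np.empty(n)
    x[0] = rng.uniform()
    for t in range(1, n):
        if rng.uniform() < 0.5:
            x[t] = rng.uniform()  # uniform restart step
        else:
            x[t] = np.clip(x[t - 1] + rng.normal(0, 0.1), 0.0, 1.0)
    return x

n = 500
X = markov_samples(n)
y = (X > 0.6).astype(int)  # labels generated by a true threshold at 0.6

# Hypothesis class: threshold classifiers on a grid.
# ERM picks the threshold minimizing the empirical 0/1 risk.
thresholds = np.linspace(0.0, 1.0, 101)
emp_risk = [np.mean((X > t).astype(int) != y) for t in thresholds]
best_t = thresholds[int(np.argmin(emp_risk))]
print(f"ERM threshold: {best_t:.2f}")
```

Even though the samples are dependent, uniform ergodicity ensures the empirical risk concentrates around the expected risk, which is what the paper's relative uniform convergence bounds quantify.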
This paper addresses learning algorithms on the unit sphere. The main purpose is to present an error analysis for regression generated by regularized least squares algorithms with a spherical harmonics kernel. The excess error can be estimated by the sum of the sample error and the regularization error. Our study shows that, by introducing a suitable spherical harmonics kernel, the regularization parameter can decrease arbitrarily fast as the sample size grows.
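The regularized least squares scheme on the sphere can be sketched as follows. As a stand-in for the paper's spherical harmonics kernel, this sketch uses the simple zonal kernel k(x, y) = (1 + ⟨x, y⟩)^d, which is positive definite on the sphere; the target function, sample sizes, and regularization parameter are illustrative assumptions, not the paper's choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere_points(n):
    """Draw n points uniformly on the unit sphere S^2."""
    p = rng.normal(size=(n, 3))
    return p / np.linalg.norm(p, axis=1, keepdims=True)

def zonal_kernel(A, B, d=3):
    """Polynomial zonal kernel on the sphere (stand-in for a
    spherical harmonics kernel)."""
    return (1.0 + A @ B.T) ** d

n, lam = 200, 1e-3
X = sphere_points(n)
y = X[:, 0] ** 2 + 0.05 * rng.normal(size=n)  # noisy target f(x) = x_1^2

# Regularized least squares: solve (K + lam * n * I) alpha = y.
K = zonal_kernel(X, X)
alpha = np.linalg.solve(K + lam * n * np.eye(n), y)

# Evaluate the learned function on fresh test points.
Xt = sphere_points(500)
y_pred = zonal_kernel(Xt, X) @ alpha
mse = np.mean((y_pred - Xt[:, 0] ** 2) ** 2)
print(f"test MSE: {mse:.4f}")
```

The excess error of the estimator splits, as in the abstract, into a sample error (controlled by the number of samples n) and a regularization error (controlled by lam); the paper's point is that a suitable kernel lets lam shrink arbitrarily fast in n.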