Let $SF_d$ be the set of periodic, Lebesgue square-integrable functions and let $\Pi_{\varphi,n,d} = \bigl\{\sum_{j=1}^{n} b_j\,\varphi(w_j\cdot x+\theta_j) : b_j,\theta_j\in\mathbb{R},\ w_j\in\mathbb{R}^d\bigr\}$ be the set of feedforward neural network (FNN) functions. Denote by $\operatorname{dist}(SF_d,\Pi_{\varphi,n,d})$ the deviation of the set $SF_d$ from the set $\Pi_{\varphi,n,d}$. A main purpose of this paper is to estimate this deviation. In particular, using Fourier transforms and approximation theory, a lower estimate for $\operatorname{dist}(SF_d,\Pi_{\varphi,n,d})$ is proved, namely $\operatorname{dist}(SF_d,\Pi_{\varphi,n,d}) \ge C/(n\log_2 n)^{1/2}$. The obtained estimate depends only on the number of neurons in the hidden layer, and is independent of the approximated target functions and of the input dimension. This estimate also reveals the relationship between the approximation rate of FNNs and the topological structure of the hidden layer.
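For concreteness, the sketch below evaluates one element of the class $\Pi_{\varphi,n,d}$, i.e. a single-hidden-layer FNN with $n$ neurons on $d$ inputs. It is an illustration only: the function name, the use of NumPy, and the choice of tanh as the activation $\varphi$ are our own, not specified by the paper.

```python
import numpy as np

def fnn_eval(x, W, theta, b, phi=np.tanh):
    """Evaluate sum_{j=1}^{n} b_j * phi(w_j . x + theta_j).

    x     : input vector of shape (d,)
    W     : weight matrix of shape (n, d), rows are the w_j
    theta : thresholds of shape (n,)
    b     : outer coefficients of shape (n,)
    phi   : activation function (tanh here, purely as a placeholder)
    """
    return float(b @ phi(W @ x + theta))

# Example: a network with n = 3 hidden neurons on d = 2 inputs.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))
theta = rng.standard_normal(3)
b = rng.standard_normal(3)
print(fnn_eval(np.array([0.5, -1.0]), W, theta, b))
```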
In this paper, we investigate the radial function manifolds generated by linear combinations of radial functions. Let $W_p^r(B^d)$ be the usual Sobolev class of functions on the unit ball $B^d$. We study the deviation of $W_p^r(B^d)$ from the radial function manifolds. Our results show that the upper and lower bounds for approximation by linear combinations of radial functions are asymptotically identical. We also find that the radial function manifolds and the ridge function manifolds generated by linear combinations of ridge functions possess the same rate of approximation.
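As a rough illustration of the two function classes being compared, the following sketch evaluates a linear combination of radial functions, $\sum_{j} b_j\,\varphi(\lVert x-c_j\rVert)$, alongside a linear combination of ridge functions, $\sum_{j} b_j\,g(a_j\cdot x)$. The Gaussian profile, the tanh ridge profile, and all names here are placeholder assumptions, not the paper's choices.

```python
import numpy as np

def radial_combination(x, centers, b, phi=lambda r: np.exp(-r**2)):
    """sum_j b_j * phi(||x - c_j||): a linear combination of radial functions."""
    r = np.linalg.norm(centers - x, axis=1)  # distances ||x - c_j||
    return float(b @ phi(r))

def ridge_combination(x, A, b, g=np.tanh):
    """sum_j b_j * g(a_j . x): a linear combination of ridge functions."""
    return float(b @ g(A @ x))

# Tiny example on the unit ball in R^2 with n = 3 terms in each combination.
rng = np.random.default_rng(1)
x = np.array([0.3, -0.2])
centers = rng.standard_normal((3, 2))
A = rng.standard_normal((3, 2))
b = rng.standard_normal(3)
print(radial_combination(x, centers, b), ridge_combination(x, A, b))
```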