
It is easily verified that $F_0(x,z) = 0$ if $z \neq 1$, and $F_0(x,1) = +\infty$. Thus,
\[
\tau(x,z,h) = z^{\beta-1}\left(\beta + \frac{k}{z^k h^k}\right)\exp\left(h^{-k} - (zh)^{-k}\right) \quad \text{if } z \neq 1,
\]
and $\tau(x,1,h) = \beta + k h^{-k} \to +\infty$. Remark that the Dominated Convergence Theorem does not apply, as the functions $\tau(x,\cdot\,,h)$ are not uniformly bounded with respect to $h$. Now, if we assume $K$ is differentiable, it follows, using integration by parts, that
\[
\phi(K,x,h) = K(1) - \int_0^1 K'(z)\, z^{\beta} \exp\left(h^{-k} - (zh)^{-k}\right) dz.
\]
The popular choice $K(z) = I_{[0,1]}(z)$ leads to $\phi(K,x,h) = 1$, thus not converging to 0. However, this difficulty may be overcome by choosing $K(z) = 1 - z^{\theta}$, with $\theta \geq \max(0,\, 2 - (\beta + k))$, as in such case
\[
\phi(K,x,h) = \theta \int_0^1 z^{\theta+\beta-1} \exp\left(h^{-k} - (zh)^{-k}\right) dz
\leq \theta \int_0^1 \frac{1}{z^{k+1}} \exp\left(h^{-k} - (zh)^{-k}\right) dz = \frac{\theta h^k}{k}.
\]
That is, for this model we have a further argument for being interested in
kernels such that the weights approach 0.
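A minimal numerical sketch of the two computations above, with illustrative parameter values $\beta = 1$, $k = 2$, $\theta = 2$ (so that $\theta \geq \max(0, 2 - (\beta + k))$ holds; SciPy is assumed for the quadrature): $\tau(x,z,h)$ collapses to 0 for fixed $z < 1$ and blows up at $z = 1$ as $h$ decreases, while $\phi(K,x,h)$ for $K(z) = 1 - z^{\theta}$ stays below $\theta h^k / k$.

    from math import exp
    from scipy.integrate import quad

    beta, k, theta = 1.0, 2.0, 2.0     # illustrative values; theta >= max(0, 2 - (beta + k))

    def tau(z, h):
        # tau(x, z, h) = z^(beta - 1) * (beta + k / (z^k h^k)) * exp(h^(-k) - (z h)^(-k))
        return z ** (beta - 1) * (beta + k / (z ** k * h ** k)) * exp(h ** (-k) - (z * h) ** (-k))

    def phi(h):
        # phi(K, x, h) = theta * int_0^1 z^(theta + beta - 1) exp(h^(-k) - (z h)^(-k)) dz for K(z) = 1 - z^theta
        integrand = lambda z: z ** (theta + beta - 1) * exp(h ** (-k) - (z * h) ** (-k))
        return theta * quad(integrand, 0.0, 1.0, points=[1.0 - h ** k], limit=200)[0]

    for h in [0.5, 0.3, 0.2, 0.1]:
        print(f"h={h:4.2f}  tau(0.9,h)={tau(0.9, h):.3e}  tau(1,h)={tau(1.0, h):.3e}"
              f"  phi={phi(h):.5f}  theta*h^k/k={theta * h ** k / k:.5f}")
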
(4) Assume $S = L^2([0,1]^d)$ and $X$ is a Gaussian process on $S$, $\Gamma(u,v) = \mathrm{Cov}(X(u), X(v))$ is the auto-covariance function of $X$, and consider the operator $\Lambda f(u) = \int_{[0,1]^d} \Gamma(u,v) f(v)\, \lambda_d(dv)$, $f \in S$, and $\lambda_d$ the Lebesgue measure on $[0,1]^d$. If $x$ belongs to the reproducing kernel Hilbert space induced by $\Gamma$, then $P(\|x - X\| \leq h) \approx P(\|X\| \leq h)$ (see Li and Shao [13]), so we may shift the small-ball problem to the origin. Assume the operator $\Lambda$ has eigenvalues $\lambda_n$, $n \geq 1$, and $\#\{n : \lambda_n > t\} \sim \varphi(1/t)$, where $\varphi$ is a slowly varying function. In this case, it has been shown recently by Karol and Nazarov [12] that
\[
P(\|X\| \leq h) \sim \exp\left(-\frac{1}{2}\int_1^u \frac{\varphi(z)}{z}\, dz\right),
\]
where $u$ is such that $\varphi(u)/(2u) \sim h^2$. Assume that $\lambda_n \sim n^{-\alpha}$, $\alpha > 1$ (the Brownian motion corresponds to $\alpha = 2$). Then $\varphi(z) = z^{1/\alpha}$ and it is easily verified that
\[
P(\|X\| \leq h) \sim e^{\alpha/2} \exp\left(-\frac{\alpha}{2^{\alpha/(\alpha-1)}}\, h^{-2/(\alpha-1)}\right),
\]
which corresponds to model (3) with $\beta = 0$ and $k = 2/(\alpha-1)$.
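The last step can be checked numerically: with $\varphi(z) = z^{1/\alpha}$, solve $\varphi(u)/(2u) = h^2$ for $u$, evaluate $\exp\bigl(-\tfrac{1}{2}\int_1^u \varphi(z)/z\, dz\bigr)$ by quadrature, and compare with the closed form above. A sketch, assuming SciPy is available (the helper names p_integral and p_closed are chosen here only for illustration):

    from math import exp
    from scipy.integrate import quad

    def p_integral(alpha, h):
        # u solves u^((1 - alpha)/alpha) = 2 h^2, i.e. phi(u)/(2u) = h^2 with phi(z) = z^(1/alpha)
        u = (2.0 * h ** 2) ** (-alpha / (alpha - 1.0))
        val = quad(lambda z: z ** (1.0 / alpha - 1.0), 1.0, u)[0]
        return exp(-0.5 * val)

    def p_closed(alpha, h):
        # e^(alpha/2) * exp(-alpha * 2^(-alpha/(alpha-1)) * h^(-2/(alpha-1)))
        return exp(alpha / 2.0) * exp(-alpha * 2.0 ** (-alpha / (alpha - 1.0)) * h ** (-2.0 / (alpha - 1.0)))

    for alpha in (2.0, 3.0):
        for h in (0.5, 0.3, 0.2):
            print(f"alpha={alpha:.0f}  h={h:.1f}   integral form={p_integral(alpha, h):.6e}"
                  f"   closed form={p_closed(alpha, h):.6e}")
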
3. Convergence of the estimator
Using Lemma 2.1 in [10] it follows that, for $\varepsilon > 0$ small enough,
\[
\left\{\left|\hat r_n(x) - \frac{E\hat g_n(x)}{E\hat f_n(x)}\right| > \varepsilon\right\}
\subset
\left\{\left|\hat g_n(x) - E\hat g_n(x)\right| > \frac{\varepsilon\, E\hat f_n(x)}{4}\right\}
\cup
\left\{\left|\hat f_n(x) - E\hat f_n(x)\right| > \frac{\varepsilon\, (E\hat f_n(x))^2}{4\, E\hat g_n(x)}\right\}.
\tag{3.1}
\]
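For positive $E\hat f_n(x)$ and $E\hat g_n(x)$, the inclusion (3.1) reduces to the elementary implication that if neither event on the right-hand side occurs, and one reads "$\varepsilon$ small enough" as $\varepsilon \leq 2E\hat g_n(x)/E\hat f_n(x)$, then the event on the left cannot occur either. A randomized sanity check of this implication, with generic positive numbers standing in for the estimators and their means (all names below are illustrative):

    import random

    random.seed(0)
    for _ in range(100000):
        G = random.uniform(0.1, 5.0)                      # stands in for E g_n(x) > 0
        F = random.uniform(0.1, 5.0)                      # stands in for E f_n(x) > 0
        eps = random.uniform(0.0, 2.0) * G / F            # "eps small enough": eps <= 2 E g_n(x) / E f_n(x)
        g = G + random.uniform(-1.0, 1.0) * eps * F / 4.0               # |g - G| <= eps F / 4
        f = F + random.uniform(-1.0, 1.0) * eps * F ** 2 / (4.0 * G)    # |f - F| <= eps F^2 / (4 G)
        assert abs(g / f - G / F) <= eps + 1e-12          # hence |g/f - G/F| <= eps
    print("inclusion (3.1) not violated on 100000 random draws")
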