Minimax Risk and Uniform Convergence Rates for Nonparametric Dyadic Regression


Authors: Bryan S. Graham, Fengshi Niu, James L. Powell

Language: English

Classification: 003.75 Nonlinear systems

Publication information: 2020

Physical description:

Collection: Metadata

ID: 165806

Comment: 28 pages

Abstract: Let $i=1,\ldots,N$ index a simple random sample of units drawn from some large population. For each unit we observe the vector of regressors $X_{i}$ and, for each of the $N\left(N-1\right)$ ordered pairs of units, an outcome $Y_{ij}$. The outcomes $Y_{ij}$ and $Y_{kl}$ are independent if their indices are disjoint, but dependent otherwise (i.e., "dyadically dependent"). Let $W_{ij}=\left(X_{i}',X_{j}'\right)'$; using the sampled data we seek to construct a nonparametric estimate of the mean regression function $g\left(W_{ij}\right)\overset{def}{\equiv}\mathbb{E}\left[\left.Y_{ij}\right|X_{i},X_{j}\right]$. We present two sets of results. First, we calculate lower bounds on the minimax risk for estimating the regression function at (i) a point and (ii) under the infinity norm. Second, we calculate (i) pointwise and (ii) uniform convergence rates for the dyadic analog of the familiar Nadaraya-Watson (NW) kernel regression estimator. We show that the NW kernel regression estimator achieves the optimal rates suggested by our risk bounds when an appropriate bandwidth sequence is chosen. This optimal rate differs from the one available under iid data: the effective sample size is smaller and $d_W=\mathrm{dim}(W_{ij})$ influences the rate differently.
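The abstract describes the dyadic analog of the Nadaraya-Watson estimator without writing out its form. Below is a minimal Python sketch of that estimator, $\hat{g}(w)=\sum_{i\neq j}K_{h}(W_{ij}-w)Y_{ij}\,/\,\sum_{i\neq j}K_{h}(W_{ij}-w)$, evaluated at a single point $w$. The function name `dyadic_nw`, the Gaussian product kernel, the scalar bandwidth `h`, and the simulated data are illustrative assumptions, not the authors' implementation; in particular the simulation only crudely mimics dyadic dependence through the shared regressors $X_i$ and $X_j$.

```python
# Minimal sketch of a dyadic Nadaraya-Watson regression estimate at a point w,
# with W_ij = (X_i', X_j')' built from unit-level regressors. Kernel and
# bandwidth choices here are illustrative, not taken from the paper.
import numpy as np

def dyadic_nw(w, X, Y, h):
    """Dyadic NW estimate of g(w) = E[Y_ij | W_ij = w].

    w : (2*d,) evaluation point for W_ij = (X_i', X_j')'
    X : (N, d) array of unit-level regressors X_i
    Y : (N, N) array of dyadic outcomes Y_ij (diagonal ignored)
    h : scalar bandwidth
    """
    N, d = X.shape
    num, den = 0.0, 0.0
    for i in range(N):
        for j in range(N):
            if i == j:
                continue  # only the N(N-1) ordered pairs with i != j
            w_ij = np.concatenate([X[i], X[j]])
            # Gaussian product kernel K_h(W_ij - w); any standard kernel works
            k = np.exp(-0.5 * np.sum(((w_ij - w) / h) ** 2))
            num += k * Y[i, j]
            den += k
    return num / den if den > 0 else np.nan

# Illustrative simulated data: Y_ij = X_i + X_j + noise, so dyads sharing a
# unit are linked through the common regressor (a crude stand-in for the
# dyadic dependence structure described in the abstract).
rng = np.random.default_rng(0)
N = 100
X = rng.normal(size=(N, 1))
Y = X + X.T + rng.normal(size=(N, N))
print(dyadic_nw(np.array([0.0, 0.0]), X, Y, h=0.5))
```

The double loop over ordered pairs makes the $N(N-1)$ effective-sample-size point of the abstract concrete: the estimator averages over pairs, but pairs sharing a unit are not independent, which is what slows the attainable convergence rate relative to iid data.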