Kernel Ridge Riesz Representers: Generalization, Mis-specification, and the Counterfactual Effective Dimension

Author: Rahul Singh

Language: English

Classification number: 512.5 Linear algebra

Publication information: 2021

Physical description:

Collection: Newspapers, Journals

ID: 166329

Kernel balancing weights provide confidence intervals for average treatment effects, based on the idea of balancing covariates for the treated and untreated groups in feature space, often with ridge regularization. Previous work on classical kernel ridge balancing weights has certain limitations: (i) not articulating generalization error for the balancing weights, (ii) typically requiring correct specification of features, and (iii) justifying Gaussian approximation for only average effects. I interpret kernel balancing weights as kernel ridge Riesz representers (KRRR) and address these limitations via a new characterization of the counterfactual effective dimension. KRRR is an exact generalization of kernel ridge regression and kernel ridge balancing weights. I prove strong properties similar to kernel ridge regression: population $L_2$ rates controlling generalization error, and a standalone closed-form solution that can interpolate. The framework relaxes the stringent assumption that the underlying regression model is correctly specified by the features. It extends Gaussian approximation beyond average effects to heterogeneous effects, justifying confidence sets for causal functions. I use KRRR to quantify uncertainty for heterogeneous treatment effects, by age, of 401(k) eligibility on assets.
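
The closed-form solution mentioned in the abstract parallels kernel ridge regression. As a rough illustration only, the sketch below (Python with NumPy) computes a kernel ridge Riesz representer for the average treatment effect functional $m(w, \alpha) = \alpha(1, x) - \alpha(0, x)$ by minimizing a ridge-penalized quadratic loss over an RKHS. The function names (krrr_weights, rbf_kernel), the Gaussian kernel choice, and the tuning values are assumptions for illustration; this is not the paper's estimator or code, and it does not reproduce the paper's rates or inference procedures.

```python
# Illustrative sketch (assumptions, not the paper's implementation): a kernel
# ridge Riesz representer for the average treatment effect functional
# m(w, a) = a(1, x) - a(0, x), fit by minimizing the ridge-penalized loss
#   (1/n) sum_i [ a(W_i)^2 - 2 (a(1, X_i) - a(0, X_i)) ] + lam * ||a||_H^2
# over an RKHS with a Gaussian kernel on (d, x).
import numpy as np

def rbf_kernel(A, B, bandwidth=1.0):
    """Gaussian kernel matrix between the rows of A and the rows of B."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def krrr_weights(D, X, lam=1e-2, bandwidth=1.0):
    """Closed-form balancing weights a_hat(W_i) at the observed points.

    By the representer theorem the minimizer lies in the span of kernel
    features at the counterfactual points (1, X_i) and (0, X_i), which also
    contain every observed point W_i = (D_i, X_i) when D is binary.
    """
    n = len(D)
    W  = np.column_stack([D, X])             # observed (d, x)
    W1 = np.column_stack([np.ones(n), X])    # counterfactual d = 1
    W0 = np.column_stack([np.zeros(n), X])   # counterfactual d = 0
    P  = np.vstack([W1, W0])                 # 2n representer points

    K  = rbf_kernel(P,  P, bandwidth)        # Gram matrix on representer points
    Kw = rbf_kernel(W,  P, bandwidth)        # kernel evaluations at W_i
    K1 = rbf_kernel(W1, P, bandwidth)        # kernel evaluations at (1, X_i)
    K0 = rbf_kernel(W0, P, bandwidth)        # kernel evaluations at (0, X_i)

    # First-order condition: (Kw'Kw + n * lam * K) beta = (K1 - K0)' 1,
    # with a small jitter on the diagonal for numerical stability.
    A = Kw.T @ Kw + n * lam * K + 1e-8 * np.eye(2 * n)
    rhs = (K1 - K0).T @ np.ones(n)
    beta = np.linalg.solve(A, rhs)
    return Kw @ beta                         # a_hat(W_i), i = 1, ..., n

# Toy usage: the Riesz representation E[a_0(W) Y] = E[gamma_0(1, X) - gamma_0(0, X)]
# suggests the plug-in average effect estimate (1/n) sum_i a_hat(W_i) Y_i.
rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 1))
D = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-X[:, 0]))).astype(float)
Y = 2.0 * D + X[:, 0] + rng.normal(size=n)   # true average effect is 2.0
alpha_hat = krrr_weights(D, X)
print("Weighted average effect estimate:", (alpha_hat * Y).mean())
```

In this sketch the weights a_hat(W_i) play the role of balancing weights: averaging a_hat(W_i) Y_i targets the average effect. Per the abstract, the paper extends the same representer-based framework beyond average effects to inference for heterogeneous effects.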