A Simple and General Debiased Machine Learning Theorem with Finite Sample Guarantees

Authors: Victor Chernozhukov, Whitney K. Newey, Rahul Singh

Language: English

Classification (DDC): 511.4 Approximations and expansions (formerly also 513.24)

Publication information: 2021

Physical description:

Collection: Metadata

ID: 167083

Comment: Biometrika 2022. Debiased machine learning is a meta-algorithm based on bias correction and sample splitting to calculate confidence intervals for functionals, i.e., scalar summaries, of machine learning algorithms. For example, an analyst may desire the confidence interval for a treatment effect estimated with a neural network. We provide a nonasymptotic debiased machine learning theorem that encompasses any global or local functional of any machine learning algorithm that satisfies a few simple, interpretable conditions. Formally, we prove consistency, Gaussian approximation, and semiparametric efficiency by finite sample arguments. The rate of convergence is $n^{-1/2}$ for global functionals, and it degrades gracefully for local functionals. Our results culminate in a simple set of conditions that an analyst can use to translate modern learning theory rates into traditional statistical inference. The conditions reveal a general double robustness property for ill-posed inverse problems.
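
The abstract describes the debiased machine learning recipe at a high level: fit nuisance functions with any machine learner on one part of the sample, and evaluate a bias-corrected (Neyman-orthogonal) score on the other part. Below is a minimal sketch of that recipe for the average treatment effect, using a cross-fitted, doubly robust (augmented inverse propensity) score. The random-forest learners from scikit-learn, the function name dml_ate, and the simulated data are illustrative assumptions, not the paper's own construction; any machine learning regression and classifier (e.g., a neural network, as in the abstract's example) could be substituted.

# Minimal sketch of cross-fitted debiased ML for the average treatment effect.
# Illustrative only: learners, names, and data are assumptions, not the paper's method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.model_selection import KFold


def dml_ate(Y, D, X, n_folds=5, clip=1e-2, seed=0):
    """Cross-fitted, doubly robust estimate of the average treatment effect."""
    n = len(Y)
    psi = np.zeros(n)  # orthogonal score contributions, one per observation
    for train, test in KFold(n_folds, shuffle=True, random_state=seed).split(X):
        # Nuisance functions are fit on the complementary fold (sample splitting).
        mu1 = RandomForestRegressor(random_state=seed).fit(
            X[train][D[train] == 1], Y[train][D[train] == 1])   # E[Y | X, D=1]
        mu0 = RandomForestRegressor(random_state=seed).fit(
            X[train][D[train] == 0], Y[train][D[train] == 0])   # E[Y | X, D=0]
        e = RandomForestClassifier(random_state=seed).fit(X[train], D[train])  # propensity

        m1, m0 = mu1.predict(X[test]), mu0.predict(X[test])
        p = np.clip(e.predict_proba(X[test])[:, 1], clip, 1 - clip)

        # Bias-corrected (augmented inverse propensity) score on the held-out fold.
        psi[test] = (m1 - m0
                     + D[test] * (Y[test] - m1) / p
                     - (1 - D[test]) * (Y[test] - m0) / (1 - p))

    theta = psi.mean()
    se = psi.std(ddof=1) / np.sqrt(n)  # plug-in standard error
    return theta, (theta - 1.96 * se, theta + 1.96 * se)


if __name__ == "__main__":
    # Simulated data with a true treatment effect of 1.0 (illustration only).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 5))
    D = rng.binomial(1, 1.0 / (1.0 + np.exp(-X[:, 0])))
    Y = 1.0 * D + X[:, 0] + rng.normal(size=2000)
    theta, ci = dml_ate(Y, D, X)
    print(f"ATE estimate {theta:.3f}, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}]")

The 95% interval in this sketch relies on the Gaussian approximation that the paper establishes by finite-sample arguments; the cross-fitting and the orthogonal score are what allow slow machine-learning rates for the nuisance estimates to translate into the $n^{-1/2}$ rate for the scalar functional.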