Inference in high-dimensional regression models without the exact or $L^p$ sparsity


Authors: Jooyoung Cha, Harold D. Chiang, Yuya Sasaki

Language: English

Classification: 512.5 Linear algebra

Publication information: 2021

Physical description:

Collection: Metadata

ID: 167664

This paper proposes a new method of inference in high-dimensional regression models and high-dimensional IV regression models. Estimation is based on a combined use of the orthogonal greedy algorithm, the high-dimensional Akaike information criterion, and double/debiased machine learning. The method of inference for any low-dimensional subvector of the high-dimensional parameters is based on root-$N$ asymptotic normality, which is shown to hold without requiring the exact sparsity condition or the $L^p$ sparsity condition. Simulation studies demonstrate superior finite-sample performance of the proposed method over methods based on the LASSO or the random forest, especially under less sparse models. We illustrate an application to production analysis with a panel of Chilean firms.
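To make the estimation idea concrete, the following is a minimal sketch of the orthogonal greedy algorithm (OGA) for variable selection in a high-dimensional regression, paired with a generic AIC-style stopping rule. It is an illustration only, not the authors' implementation: the function names (`oga_path`, `select_by_aic`), the synthetic data, and the simple `n*log(sigma2) + 2k` penalty are assumptions for exposition, and the paper's high-dimensional AIC penalty and the subsequent double/debiased machine learning inference step are not reproduced here.

```python
# Illustrative sketch of the orthogonal greedy algorithm (OGA) with an
# AIC-style stopping rule.  NOT the paper's exact procedure: the HDAIC
# penalty and the double/debiased ML inference step are omitted.
import numpy as np

def oga_path(X, y, max_steps):
    """Greedily add the column most correlated with the current residual,
    then refit by least squares on all selected columns (orthogonal step)."""
    n, p = X.shape
    selected, path = [], []
    residual = y.copy()
    for _ in range(min(max_steps, p)):
        # Pick the unselected column with the largest absolute inner
        # product with the current residual.
        corr = X.T @ residual
        corr[selected] = 0.0
        j = int(np.argmax(np.abs(corr)))
        selected.append(j)
        # Orthogonal step: project y onto the span of the selected columns.
        beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        residual = y - X[:, selected] @ beta
        path.append((list(selected), beta, residual @ residual / n))
    return path

def select_by_aic(path, n):
    """Choose the number of greedy steps by a generic AIC-type criterion
    n*log(sigma2_hat) + 2*k (a placeholder for the paper's HDAIC)."""
    aic = [n * np.log(s2) + 2 * len(sel) for sel, _, s2 in path]
    return path[int(np.argmin(aic))]

# Usage with synthetic data (illustrative only).
rng = np.random.default_rng(0)
n, p = 200, 500
X = rng.standard_normal((n, p))
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] + rng.standard_normal(n)
sel, beta, _ = select_by_aic(oga_path(X, y, max_steps=20), n)
print(sel, beta)
```

In the paper's setting, a selection-plus-model-size rule of this kind would be used inside a double/debiased machine learning construction to obtain root-$N$ inference for a low-dimensional subvector; the sketch above only shows the greedy selection component.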