Estimation and Inference by Stochastic Optimization


Author: Jean-Jacques Forneron

Language: English

Classification: 003.76 Stochastic systems

Publication information: 2022

Physical description:

Collection: Metadata

ID: 195084

Comment: This paper and Forneron and Ng (2021) supersede the manuscript "Inference by Stochastic Optimization: A Free-Lunch Bootstrap" (Forneron and Ng, 2020).

Abstract: In non-linear estimation, it is common to assess sampling uncertainty by bootstrap inference. For complex models, this can be computationally intensive. This paper combines optimization with resampling, turning stochastic optimization into a fast resampling device. Two methods are introduced: a resampled Newton-Raphson (rNR) and a resampled quasi-Newton (rqN) algorithm. Both produce draws that can be used to compute consistent estimates, confidence intervals, and standard errors in a single run. The draws are generated by a gradient and Hessian (or an approximation) computed from batches of data that are resampled at each iteration. The proposed methods transition quickly from optimization to resampling when the objective is smooth and strictly convex. Simulated and empirical applications illustrate the properties of the methods on large-scale and computationally intensive problems. Comparisons with frequentist and Bayesian methods highlight the features of the algorithms.
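To make the abstract's description concrete, here is a minimal sketch of the resampled Newton-Raphson idea on a toy problem: at each iteration the data are resampled with replacement, the gradient and Hessian of the resampled objective are computed, and a Newton step is taken; post-burn-in iterates serve as draws for inference. The toy model (estimating a mean under quadratic loss), the function name, and all tuning constants are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (assumption, not from the paper): estimate the mean theta
# by minimizing the quadratic objective 0.5 * sum_i (x_i - theta)^2.
x = rng.normal(loc=2.0, scale=1.0, size=5000)

def resampled_newton(x, rng, iters=1500, burn_in=500):
    # Hypothetical sketch of the resampled Newton-Raphson (rNR) loop:
    # each iteration resamples the data with replacement, then takes a
    # Newton step using the gradient and Hessian of the resampled
    # objective; iterates after burn-in are kept as draws for inference.
    theta = 0.0
    draws = []
    for t in range(iters):
        batch = x[rng.integers(0, x.size, x.size)]  # bootstrap resample
        grad = np.sum(theta - batch)   # gradient: sum_i (theta - x_i)
        hess = float(batch.size)       # Hessian: constant for quadratic loss
        theta -= grad / hess           # Newton-Raphson update
        if t >= burn_in:
            draws.append(theta)
    return np.array(draws)

draws = resampled_newton(x, rng)
point_est = draws.mean()        # consistent point estimate
std_err = draws.std(ddof=1)     # bootstrap-style standard error
```

For this strictly convex quadratic loss, each Newton step jumps exactly to the resampled mean, so the draws coincide with ordinary bootstrap means and their spread approximates the estimator's sampling variability, illustrating the "single run" optimization-to-resampling transition the abstract describes.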

LIBRARY - HUTECH UNIVERSITY OF TECHNOLOGY, HO CHI MINH CITY

Tel: (028) 36225755 | Email: tt.thuvien@hutech.edu.vn

Copyright ©2024 HUTECH LIBRARY