Neural networks trained by weight permutation are universal approximators.


Authors: Yongqiang Cai, Gaohang Chen, Zhonghua Qiao

Language: English

Classification: 551.561 Electricity in stable atmosphere

Publication information: United States: Neural Networks: The Official Journal of the International Neural Network Society, 2025

Physical description:

Collection: NCBI

ID: 713521

Abstract: The universal approximation property is fundamental to the success of neural networks, and has traditionally been achieved by training networks without any constraints on their parameters. However, recent experimental research proposed a novel permutation-based training method, which achieved the desired classification performance without modifying the exact weight values. In this paper, we provide a theoretical guarantee for this permutation training method by proving its ability to guide a ReLU network to approximate one-dimensional continuous functions. Our numerical results further validate the method's efficiency in regression tasks with various initializations. Notable observations made during weight permutation suggest that permutation training can provide an innovative tool for describing network learning behavior.
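The abstract describes training that keeps the multiset of weight values fixed and only rearranges (permutes) their positions. The following is a minimal sketch of that constraint only, not the algorithm analyzed in the paper: a one-hidden-layer ReLU network fitting a 1D target, "trained" by greedy random swaps of entries in a flat parameter vector. The swap-based search, the hidden width H, and the target function are illustrative assumptions.

```python
# Hypothetical illustration of permutation-only training: the weight values are
# fixed at initialization and only their positions change via greedy random swaps.
# This is NOT the paper's algorithm, only a sketch of the constraint it studies.
import numpy as np

rng = np.random.default_rng(0)

# Target: a 1D continuous function on [-1, 1] (illustrative choice).
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(np.pi * x)

# One hidden layer of ReLU units; all parameters live in one flat vector so a
# permutation can move values within and across layers.
H = 64
sizes = [(1, H), (1, H), (H, 1), (1, 1)]              # shapes of W1, b1, W2, b2
counts = [int(np.prod(s)) for s in sizes]
theta = rng.normal(0.0, 1.0, size=sum(counts))        # fixed multiset of values

def unpack(flat):
    """Split the flat parameter vector back into W1, b1, W2, b2."""
    parts, i = [], 0
    for s, c in zip(sizes, counts):
        parts.append(flat[i:i + c].reshape(s))
        i += c
    return parts

def loss(flat):
    """Mean squared error of the ReLU network on the 1D regression task."""
    W1, b1, W2, b2 = unpack(flat)
    h = np.maximum(x @ W1 + b1, 0.0)                  # ReLU hidden layer
    return float(np.mean((h @ W2 + b2 - y) ** 2))

best = loss(theta)
for step in range(20000):
    i, j = rng.integers(0, theta.size, size=2)
    theta[i], theta[j] = theta[j], theta[i]           # propose swapping two entries
    cand = loss(theta)
    if cand <= best:
        best = cand                                   # keep swaps that do not hurt
    else:
        theta[i], theta[j] = theta[j], theta[i]       # otherwise undo the swap

print(f"final MSE after permutation-only training: {best:.4f}")
```

The sketch uses a simple hill-climbing swap search purely to show that the parameter multiset never changes; the paper's permutation training and its theoretical analysis are more sophisticated.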