Few-shot traffic classification based on autoencoder and deep graph convolutional networks.


Authors: Yijie Bai, Jijie Han, Haoran Liu, Yilong Liu, Shengwei Xu

Language: eng


Publication information: England : Scientific Reports, 2025


Collection: NCBI

ID: 714313

Traffic classification is a crucial technique in network management that aims to identify and manage data packets in order to optimize network efficiency, ensure quality of service, enhance network security, and implement policy management. Because graph convolutional networks (GCNs) take into account not only the features of the data itself but also the relationships among sets of data during classification, many researchers have proposed GCN-based traffic classification methods in recent years. However, most current approaches use a two-layer GCN, primarily because of the over-smoothing problem associated with deeper GCNs. In few-shot scenarios, a two-layer GCN may not adequately capture the relationships among traffic data, leading to limited classification performance. Additionally, during graph construction, traffic usually needs to be trimmed to a uniform length, and traffic of insufficient length is typically extended with zero-padding. This zero-padding strategy poses significant challenges for traffic classification with small samples. In this paper, we propose a method based on an autoencoder (AE) and deep graph convolutional networks (ADGCN) for traffic classification on few-shot datasets. ADGCN first utilizes an AE to reconstruct the traffic: the AE enables shorter traffic to learn abstract feature representations from longer traffic of the same class to replace the padded zeros, mitigating the adverse effects of zero-padding. The reconstructed traffic is then classified using GCNII, a deep GCN model, which addresses the challenge of insufficient data samples. ADGCN is an end-to-end traffic classification method applicable to various scenarios. According to the experimental results, ADGCN achieves a classification accuracy improvement of 3.5% to 24% compared to existing state-of-the-art methods. The code is available at https://github.com/han20011019/ADGCN.
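The abstract describes replacing the zero-padded tail of short flows with autoencoder reconstructions. The following is a minimal PyTorch sketch of that idea, not the authors' implementation: the flow length, layer sizes, and the helper `fill_padding` are illustrative assumptions.

```python
# Minimal sketch (assumed hyperparameters): an autoencoder over fixed-length
# traffic vectors; the zero-padded tail of each short flow is replaced with
# the AE's reconstruction so it carries learned features instead of zeros.
import torch
import torch.nn as nn

FLOW_LEN = 784  # assumed uniform flow length after trimming/padding

class FlowAE(nn.Module):
    def __init__(self, flow_len=FLOW_LEN, latent=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(flow_len, 256), nn.ReLU(),
            nn.Linear(256, latent),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent, 256), nn.ReLU(),
            nn.Linear(256, flow_len), nn.Sigmoid(),  # bytes normalized to [0, 1]
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def fill_padding(ae, flows, lengths):
    """Replace the zero-padded tail of each flow with the AE reconstruction."""
    with torch.no_grad():
        recon = ae(flows)
    idx = torch.arange(flows.size(1)).unsqueeze(0)   # (1, FLOW_LEN)
    pad_mask = idx >= lengths.unsqueeze(1)           # True where the flow was padded
    return torch.where(pad_mask, recon, flows)

# Usage: flows is a (batch, FLOW_LEN) tensor of normalized bytes;
# lengths holds each flow's true (unpadded) length.
ae = FlowAE()  # would be trained on same-class traffic before use
flows = torch.rand(8, FLOW_LEN)
lengths = torch.randint(100, FLOW_LEN, (8,))
filled = fill_padding(ae, flows, lengths)
```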
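The classifier named in the abstract is GCNII, a deep GCN variant (Chen et al., "Simple and Deep Graph Convolutional Networks") that counters over-smoothing with an initial residual connection and identity mapping. Below is a minimal dense-matrix sketch of one GCNII layer for reference; the dimension, `alpha`, and `lam` values are illustrative assumptions, not the paper's settings.

```python
# One GCNII layer: H' = relu(((1-a) * A_hat @ H + a * H0) @ ((1-b) * I + b * W)),
# where b = log(lam / layer_idx + 1) decays with depth.
import math
import torch
import torch.nn as nn

class GCNIILayer(nn.Module):
    def __init__(self, dim, alpha=0.1, lam=0.5, layer_idx=1):
        super().__init__()
        self.alpha = alpha
        self.beta = math.log(lam / layer_idx + 1)  # identity-mapping strength
        self.weight = nn.Linear(dim, dim, bias=False)

    def forward(self, h, h0, a_hat):
        # initial residual connection to the first-layer representation h0
        support = (1 - self.alpha) * (a_hat @ h) + self.alpha * h0
        # identity mapping: blend the support with its linear transformation
        return torch.relu((1 - self.beta) * support + self.beta * self.weight(support))
```

Stacking many such layers (each with its own `layer_idx`) is what allows a GCN deeper than two layers without severe over-smoothing, which is the property the abstract relies on for few-shot classification.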