EAPT: An encrypted traffic classification model via adversarial pre-trained transformers

Computer Networks · IF 4.6 · CAS Region 2 (Computer Science) · Q1, COMPUTER SCIENCE, HARDWARE & ARCHITECTURE · Pub Date: 2025-02-01 · Epub Date: 2024-12-09 · DOI: 10.1016/j.comnet.2024.110973
Mingming Zhan , Jin Yang , Dongqing Jia , Geyuan Fu
Computer Networks, Volume 257, Article 110973.
Cited by: 0

Abstract

Encrypted traffic classification plays a critical role in network traffic management and optimization, as it helps identify and differentiate between various types of traffic, thereby enhancing the quality and efficiency of network services. However, with the continuous evolution of traffic encryption and network applications, a large and diverse volume of encrypted traffic has emerged, presenting challenges for traditional feature extraction-based methods in identifying encrypted traffic effectively. This paper introduces an encrypted traffic classification model via adversarial pre-trained transformers (EAPT). The model utilizes SentencePiece to tokenize encrypted traffic data, effectively addressing the issue of coarse tokenization granularity and ensuring that the tokenization results more accurately reflect the characteristics of the encrypted traffic. During the pre-training phase, EAPT employs a disentangled attention mechanism and incorporates a pre-training task similar to generative adversarial networks, called Replaced BURST Detection. This approach not only enhances the model's ability to understand contextual information but also accelerates the pre-training process. Additionally, this method minimizes model parameters, thus improving the model's generalization capability. Experimental results show that EAPT can efficiently learn traffic features from small-scale unlabeled datasets and demonstrates excellent performance across multiple datasets with a relatively small number of model parameters.
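The Replaced BURST Detection task described above resembles ELECTRA-style replaced token detection: a generator corrupts some tokens in a sequence, and a discriminator learns to flag which positions were replaced. A minimal stdlib-only sketch of how such discriminator labels could be constructed is shown below; the function name and parameters are hypothetical illustrations, not the paper's actual implementation.

```python
import random


def make_rtd_example(tokens, vocab, replace_prob=0.15, seed=0):
    """Corrupt a token sequence (generator role) and emit binary labels
    (discriminator targets): 1 = token was replaced, 0 = original."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < replace_prob:
            # Swap in a different vocabulary token at this position.
            replacement = rng.choice([v for v in vocab if v != tok])
            corrupted.append(replacement)
            labels.append(1)
        else:
            corrupted.append(tok)
            labels.append(0)
    return corrupted, labels
```

In the full model, the discriminator would score each position of the corrupted sequence against these labels, which lets every token contribute to the loss rather than only the masked positions, one reason this objective can accelerate pre-training.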
Source journal: Computer Networks (Engineering & Technology – Telecommunications)
CiteScore: 10.80
Self-citation rate: 3.60%
Annual article count: 434
Review time: 8.6 months
Journal description: Computer Networks is an international, archival journal providing a publication vehicle for complete coverage of all topics of interest to those involved in the computer communications networking area. The audience includes researchers, managers and operators of networks as well as designers and implementors. The Editorial Board will consider any material for publication that is of interest to those groups.
Latest articles in this journal:
- From simulation to deep learning: Survey on network performance modeling approaches
- Eco-efficient task scheduling for MLLMs in edge-cloud continuum
- A multimodal and perturbation-aware learning approach for robust traffic classification
- DACC: Discerning and adaptive offloading for coarse-grained content-aware video analytics
- Preemption-aware online AoI scheduling over two-state Markov channels