Paper Title
NorBERT: NetwOrk Representations through BERT for Network Analysis and Management
Paper Authors
Paper Abstract
Deep neural network models have been applied with great success to Natural Language Processing (NLP) and image-based tasks. Their application to network analysis and management tasks has only recently begun to be pursued. Our interest is in producing deep models that generalize effectively, performing well on multiple network tasks in different environments. A major challenge is that traditional deep models often rely on categorical features but cannot handle unseen categorical values. One way to address this problem is to learn contextual embeddings for the categorical variables used by deep networks, improving their performance. In this paper, we adapt the NLP pre-training technique BERT and its associated deep model to learn semantically meaningful numerical representations (embeddings) for the Fully Qualified Domain Names (FQDNs) used in communication networks. We show through a series of experiments that this approach can be used to generate models that maintain their effectiveness when applied to environments other than the one in which they were trained.
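
To make the abstract's core idea concrete, the following is a minimal, hypothetical sketch of BERT-style masked-language-model pre-training over FQDNs, written against the Hugging Face transformers library. The dot-separated-label tokenization, the toy corpus, the model size, and all hyperparameters are assumptions chosen for illustration; they are not the paper's actual pipeline.

```python
# A minimal sketch (not the paper's actual pipeline) of the core idea:
# treat each FQDN as a "sentence" whose tokens are its dot-separated
# labels, pre-train a small BERT-style masked language model on a corpus
# of FQDNs, and read off contextual embeddings for downstream tasks.
import torch
from transformers import BertConfig, BertForMaskedLM

# Toy corpus of FQDNs; a real deployment would draw these from network logs.
fqdns = ["mail.example.com", "www.example.com", "cdn.video.net"]

# Build a label-level vocabulary including the special tokens BERT expects.
specials = ["[PAD]", "[CLS]", "[SEP]", "[MASK]"]
label_toks = sorted({lab for f in fqdns for lab in f.split(".")})
vocab = {tok: i for i, tok in enumerate(specials + label_toks)}

def encode(fqdn):
    """Map an FQDN to BERT-style input ids: [CLS] label ... label [SEP]."""
    toks = ["[CLS]"] + fqdn.split(".") + ["[SEP]"]
    return [vocab[t] for t in toks]

# A deliberately tiny BERT; real pre-training would use a larger model.
config = BertConfig(vocab_size=len(vocab), hidden_size=64,
                    num_hidden_layers=2, num_attention_heads=2,
                    intermediate_size=128)
model = BertForMaskedLM(config)
optim = torch.optim.AdamW(model.parameters(), lr=1e-3)

# One masked-LM pass over the corpus: hide one label per FQDN, predict it.
for fqdn in fqdns:
    input_ids = torch.tensor([encode(fqdn)])
    targets = torch.full_like(input_ids, -100)  # -100 = position ignored by loss
    mask_pos = 1                                # mask the first label (e.g. "mail")
    targets[0, mask_pos] = input_ids[0, mask_pos]
    input_ids[0, mask_pos] = vocab["[MASK]"]
    loss = model(input_ids=input_ids, labels=targets).loss
    loss.backward()
    optim.step()
    optim.zero_grad()

# After pre-training, the encoder's hidden states serve as FQDN embeddings
# that downstream network-analysis models can consume as numerical features.
with torch.no_grad():
    out = model.bert(torch.tensor([encode("mail.example.com")]))
    embedding = out.last_hidden_state[0, 0]  # [CLS] vector, shape (64,)
```

Because the encoder composes an FQDN's labels contextually rather than memorizing whole domain strings, it can produce embeddings even for FQDNs never seen during training, which is one plausible way such an approach could transfer across network environments as the abstract claims.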