Paper Title

Adversarial Attacks on Multivariate Time Series

Authors

Samuel Harford, Fazle Karim, Houshang Darabi

Abstract

Classification models for multivariate time series have gained significant importance in the research community, but little research has been done on generating adversarial samples for these models. Such adversarial samples could become a security concern. In this paper, we propose transforming the existing adversarial transformation network (ATN) on a distilled model to attack various multivariate time series classification models. The proposed attack uses the distilled model as a surrogate that mimics the behavior of the attacked classical multivariate time series classification models. The proposed methodology is tested on 1-Nearest Neighbor Dynamic Time Warping (1-NN DTW) and a Fully Convolutional Network (FCN), both trained on 18 University of East Anglia (UEA) and University of California Riverside (UCR) datasets. We show that both models are susceptible to attacks on all 18 datasets. To the best of our knowledge, adversarial attacks have only been conducted in the domain of univariate time series; such an attack on multivariate time series classification models has never been done before. Additionally, we recommend that future researchers who develop time series classification models incorporate adversarial data samples into their training data sets to improve resilience to adversarial samples, and that they consider model robustness as an evaluation metric.
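To make the surrogate-based attack concrete, the sketch below illustrates the general idea in a heavily simplified form: a non-differentiable classifier (here 1-NN DTW would be the real target) is approximated by a differentiable surrogate, and adversarial perturbations are crafted against that surrogate. The paper trains an adversarial transformation network (ATN) against a distilled model; this sketch instead uses a single FGSM-style gradient step on a hypothetical linear surrogate as a stand-in, so the surrogate, its weights, and the step size `eps` are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative stand-in (NOT the paper's ATN): a linear surrogate over a
# flattened multivariate time series, attacked with one FGSM-style step.
rng = np.random.default_rng(0)

T, C = 50, 3                        # time steps, channels (multivariate series)
W = rng.normal(size=(T * C, 2))     # hypothetical surrogate: logits for 2 classes


def surrogate_logits(x):
    """Differentiable surrogate mimicking the (black-box) target classifier."""
    return x.reshape(-1) @ W


def fgsm_attack(x, true_label, eps=0.1):
    """One signed-gradient step that lowers the margin of the true class.

    For a linear surrogate, the gradient of (logit_true - logit_other)
    w.r.t. the input is simply the difference of the weight columns.
    """
    grad = (W[:, true_label] - W[:, 1 - true_label]).reshape(T, C)
    return x - eps * np.sign(grad)


x = rng.normal(size=(T, C))
label = int(np.argmax(surrogate_logits(x)))
x_adv = fgsm_attack(x, label)

print("original prediction:", label)
print("adversarial prediction:", int(np.argmax(surrogate_logits(x_adv))))
print("max perturbation:", np.max(np.abs(x_adv - x)))
```

In the paper's setting, the same principle applies but the perturbation is produced by a trained ATN rather than a single gradient step, and the surrogate is a distilled model trained to mimic 1-NN DTW or the FCN; the adversarial series found on the surrogate are then transferred to the original classifier.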
