Title

Time Series Domain Adaptation via Sparse Associative Structure Alignment

Authors

Cai, Ruichu, Chen, Jiawei, Li, Zijian, Chen, Wei, Zhang, Keli, Ye, Junjian, Li, Zhuozhang, Yang, Xiaoyan, Zhang, Zhenjie

Abstract

Domain adaptation on time series data is an important but challenging task. Most existing works in this area are based on learning a domain-invariant representation of the data with the help of restrictions like MMD (Maximum Mean Discrepancy). However, extracting such a domain-invariant representation is a non-trivial task for time series data, due to the complex dependence among timestamps. In particular, in fully dependent time series, a small change in the time lags or the offsets may make domain-invariant extraction difficult. Fortunately, the stability of causality inspires us to explore the domain-invariant structure of the data. To reduce the difficulty of discovering the causal structure, we relax it to a sparse associative structure and propose a novel sparse associative structure alignment model for domain adaptation. First, we generate a segment set to remove the obstacle of offsets. Second, intra-variable and inter-variable sparse attention mechanisms are devised to extract the associative structure of time-series data while taking time lags into account. Finally, associative structure alignment is used to guide the transfer of knowledge from the source domain to the target one. Experimental studies not only verify the good performance of our method on three real-world datasets but also provide some insightful findings on the transferred knowledge.
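To make the alignment idea concrete, here is a toy sketch, not the authors' implementation: it builds a simple inter-variable attention matrix from normalized dot-product similarity and aligns the source and target structures with an L1 penalty. The function names, the similarity-based attention, and the L1 loss are illustrative assumptions; the paper's actual model learns sparse intra- and inter-variable attention end to end.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def inter_variable_attention(series):
    """series: (n_vars, length) array. Returns an (n_vars, n_vars)
    attention matrix whose row i gives how strongly variable i
    attends to every variable, based on trajectory similarity.
    (Illustrative stand-in for the learned sparse attention.)"""
    z = series - series.mean(axis=1, keepdims=True)
    z /= z.std(axis=1, keepdims=True) + 1e-8
    scores = z @ z.T / series.shape[1]   # pairwise similarity of variables
    return softmax(scores, axis=1)

def structure_alignment_loss(src, tgt):
    """L1 distance between source and target associative structures;
    minimizing it pulls the two attention matrices together."""
    return np.abs(inter_variable_attention(src)
                  - inter_variable_attention(tgt)).mean()

# Hypothetical source/target multivariate series (5 variables, 64 steps).
src = rng.normal(size=(5, 64))
tgt = src + 0.1 * rng.normal(size=(5, 64))  # target: perturbed source
```

In the full model this loss would be added to the classification objective on the source domain, so the shared encoder is pushed toward structures that transfer across domains.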
