Paper Title

Multi-Source Deep Domain Adaptation with Weak Supervision for Time-Series Sensor Data

Paper Authors

Garrett Wilson, Janardhan Rao Doppa, Diane J. Cook

Paper Abstract

Domain adaptation (DA) offers a valuable means to reuse data and models for new problem domains. However, robust techniques have not yet been considered for time series data with varying amounts of data availability. In this paper, we make three main contributions to fill this gap. First, we propose a novel Convolutional deep Domain Adaptation model for Time Series data (CoDATS) that significantly improves accuracy and training time over state-of-the-art DA strategies on real-world sensor data benchmarks. By utilizing data from multiple source domains, we increase the usefulness of CoDATS to further improve accuracy over prior single-source methods, particularly on complex time series datasets that have high variability between domains. Second, we propose a novel Domain Adaptation with Weak Supervision (DA-WS) method by utilizing weak supervision in the form of target-domain label distributions, which may be easier to collect than additional data labels. Third, we perform comprehensive experiments on diverse real-world datasets to evaluate the effectiveness of our domain adaptation and weak supervision methods. Results show that CoDATS for single-source DA significantly improves over the state-of-the-art methods, and we achieve additional improvements in accuracy using data from multiple source domains and weakly supervised signals. Code is available at: https://github.com/floft/codats
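As a rough illustration of the weak-supervision idea described above (not the authors' exact formulation; the official implementation is in the repository linked in the abstract), a known target-domain label distribution can be used as a training signal by penalizing the divergence between that distribution and the model's batch-averaged class predictions on unlabeled target data. The PyTorch sketch below uses hypothetical names and is only a minimal example of this kind of loss term.

```python
# Minimal sketch (assumed PyTorch formulation, not the paper's official code):
# pull the batch-averaged predicted class distribution on unlabeled target
# samples toward a known/estimated target-domain label distribution via KL.
import torch
import torch.nn.functional as F

def weak_label_distribution_loss(target_logits, target_label_dist):
    """KL(target_label_dist || mean predicted class distribution).

    target_logits:     (batch, num_classes) logits for unlabeled target samples.
    target_label_dist: (num_classes,) known or estimated class proportions.
    """
    probs = F.softmax(target_logits, dim=1)           # per-sample class probabilities
    mean_probs = probs.mean(dim=0).clamp_min(1e-8)    # batch-level predicted distribution
    p = target_label_dist.clamp_min(1e-8)
    return torch.sum(p * (torch.log(p) - torch.log(mean_probs)))

# Example usage with a 4-class problem and known target label proportions.
logits = torch.randn(32, 4)                           # stand-in classifier outputs
label_dist = torch.tensor([0.4, 0.3, 0.2, 0.1])
loss = weak_label_distribution_loss(logits, label_dist)
```

In practice such a term would be added to the usual source-domain classification loss and the domain-adaptation objective, with a weighting coefficient chosen on a validation set.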
