Paper Title

Whole MILC: generalizing learned dynamics across tasks, datasets, and populations

Authors

Usman Mahmood, Md Mahfuzur Rahman, Alex Fedorov, Noah Lewis, Zening Fu, Vince D. Calhoun, Sergey M. Plis

Abstract

Behavioral changes are the earliest signs of a mental disorder, but arguably, the dynamics of brain function are affected even earlier. Consequently, the spatio-temporal structure of disorder-specific dynamics is crucial for early diagnosis and for understanding disorder mechanisms. A common way of learning discriminatory features relies on training a classifier and evaluating feature importance. Classical classifiers based on handcrafted features are quite powerful, but they suffer from the curse of dimensionality when applied to the large input dimensions of spatio-temporal data. Deep learning algorithms can handle this problem, and model introspection can highlight discriminatory spatio-temporal regions, but they need far more samples to train. In this paper we present a novel self-supervised training scheme that reinforces whole-sequence mutual information local to context (whole MILC). We pre-train the whole MILC model on unlabeled and unrelated healthy-control data. We test our model on three different disorders, (i) schizophrenia, (ii) autism, and (iii) Alzheimer's, across four different studies. Our algorithm outperforms existing self-supervised pre-training methods and provides classification results competitive with classical machine learning algorithms. Importantly, whole MILC enables attribution of subject diagnosis to specific spatio-temporal regions in the fMRI signal.
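
To make the pre-training idea concrete, below is a minimal PyTorch sketch of an InfoNCE-style objective that contrasts local window embeddings with a whole-sequence summary, in the spirit of "whole-sequence mutual information local to context." This is an illustrative assumption, not the authors' implementation: the class and function names (`WholeMILCSketch`, `milc_infonce_loss`), the GRU encoders, and all dimensions are hypothetical stand-ins for whatever encoders the paper actually uses.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch of a "local window vs. whole sequence" contrastive
# objective. Not the paper's architecture; names and sizes are illustrative.

class WholeMILCSketch(nn.Module):
    def __init__(self, input_dim=53, embed_dim=256):
        super().__init__()
        # Local encoder: maps each fixed-length time window to an embedding.
        self.window_encoder = nn.GRU(input_dim, embed_dim, batch_first=True)
        # Sequence encoder: summarizes all window embeddings into one vector.
        self.sequence_encoder = nn.GRU(embed_dim, embed_dim, batch_first=True)

    def forward(self, windows):
        # windows: (batch, n_windows, window_len, input_dim)
        b, n, t, d = windows.shape
        _, h = self.window_encoder(windows.reshape(b * n, t, d))
        local = h[-1].reshape(b, n, -1)   # (batch, n_windows, embed)
        _, g = self.sequence_encoder(local)
        whole = g[-1]                     # (batch, embed)
        return local, whole

def milc_infonce_loss(local, whole):
    """InfoNCE-style loss: each window embedding should score highest
    against its own subject's whole-sequence summary (positive) versus
    the other subjects' summaries in the batch (negatives)."""
    b, n, e = local.shape
    # Similarity of every window to every whole-sequence vector: (b*n, b).
    logits = local.reshape(b * n, e) @ whole.t()
    # The correct "whole" for window (i, j) is subject i.
    targets = torch.arange(b, device=local.device).repeat_interleave(n)
    return F.cross_entropy(logits, targets)

if __name__ == "__main__":
    # Toy data: 8 subjects, 13 windows of 20 timepoints, 53 components.
    model = WholeMILCSketch()
    x = torch.randn(8, 13, 20, 53)
    local, whole = model(x)
    loss = milc_infonce_loss(local, whole)
    loss.backward()
    print(float(loss))
```

In the workflow the abstract describes, an objective like this would be optimized on unlabeled healthy-control data first; the pre-trained encoders would then be fine-tuned with a small classification head on each downstream disorder study.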
