Paper Title

Transfer Learning without Knowing: Reprogramming Black-box Machine Learning Models with Scarce Data and Limited Resources

Paper Authors

Yun-Yun Tsai, Pin-Yu Chen, Tsung-Yi Ho

Paper Abstract

Current transfer learning methods are mainly based on fine-tuning a pretrained model with target-domain data. Motivated by the techniques from adversarial machine learning (ML) that are capable of manipulating the model prediction via data perturbations, in this paper we propose a novel approach, black-box adversarial reprogramming (BAR), that repurposes a well-trained black-box ML model (e.g., a prediction API or proprietary software) for solving different ML tasks, especially in the scenario with scarce data and constrained resources. The rationale lies in exploiting high-performance but unknown ML models to gain learning capability for transfer learning. Using zeroth-order optimization and multi-label mapping techniques, BAR can reprogram a black-box ML model solely based on its input-output responses without knowing the model architecture or changing any parameter. More importantly, in the limited medical data setting, on autism spectrum disorder classification, diabetic retinopathy detection, and melanoma detection tasks, BAR outperforms state-of-the-art methods and yields comparable performance to the vanilla adversarial reprogramming method requiring complete knowledge of the target ML model. BAR also outperforms baseline transfer learning approaches by a significant margin, demonstrating cost-effective means and new insights for transfer learning.
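The core mechanism in the abstract can be illustrated with a minimal sketch: learn a universal input perturbation (the adversarial program) for a model we can only query, estimating gradients from finite differences of its input-output responses, and map several source labels to each target label. Everything below is an illustrative assumption rather than the authors' implementation: query_black_box, W, label_map, zo_gradient, the toy data, and all hyperparameters (q, mu, the D/q scaling, the learning rate) stand in for a real prediction API, the paper's multi-label mapping, and its averaged random gradient estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Stand-in for the black-box model (illustrative only). ---
# The reprogramming procedure below may only query it for output
# probabilities; the weights W are treated as hidden.
D, K = 64, 10                        # input dimension, number of source labels
W = rng.normal(size=(D, K))          # hidden parameters of the "API"

def query_black_box(x):
    """Return softmax probabilities over the K source labels."""
    z = x @ W
    e = np.exp(z - z.max())
    return e / e.sum()

# --- Multi-label mapping: several source labels vote for one target label ---
# (hypothetical assignment for a 2-class target task).
label_map = {0: [0, 1, 2, 3, 4], 1: [5, 6, 7, 8, 9]}

def target_loss(x, y):
    """Negative log of the aggregated probability of target class y."""
    p = query_black_box(x)
    return -np.log(p[label_map[y]].mean() + 1e-12)

def zo_gradient(delta, x, y, q=20, mu=0.01):
    """One-sided averaged random gradient estimate of the loss w.r.t.
    the adversarial program delta, using q extra model queries."""
    f0 = target_loss(x + delta, y)
    g = np.zeros_like(delta)
    for _ in range(q):
        u = rng.normal(size=delta.shape)
        u /= np.linalg.norm(u)                   # random unit direction
        g += (target_loss(x + delta + mu * u, y) - f0) / mu * u
    return (D / q) * g                           # assumed scaling for the estimator

# Train a single universal program delta on a toy target dataset
# (random data here; the real setting embeds small medical images into
# the input frame of a large pretrained image classifier).
delta = np.zeros(D)
data = [(rng.normal(size=D), int(rng.integers(0, 2))) for _ in range(32)]
for step in range(200):
    x, y = data[step % len(data)]
    delta -= 0.05 * zo_gradient(delta, x, y)
```

Note the design point this sketch is meant to surface: the only access to the model is through query_black_box, so the per-step cost is q + 1 forward queries instead of one backward pass, which is exactly the trade-off that lets BAR work against prediction APIs without gradients.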
