Paper Title

Towards Emergent Language Symbolic Semantic Segmentation and Model Interpretability

Authors

Alberto Santamaria-Pang, James Kubricht, Aritra Chowdhury, Chitresh Bhushan, Peter Tu

Abstract

Recent advances in methods focused on the grounding problem have resulted in techniques that can be used to construct a symbolic language associated with a specific domain. Inspired by how humans communicate complex ideas through language, we developed a generalized Symbolic Semantic ($\text{S}^2$) framework for interpretable segmentation. Unlike adversarial models (e.g., GANs), we explicitly model two agents, a Sender and a Receiver, that must cooperate to achieve a common goal. The Sender receives information from a high layer of a segmentation network and generates a symbolic sentence derived from a categorical distribution. The Receiver obtains the symbolic sentence and co-generates the segmentation mask. In order for the model to converge, the Sender and Receiver must learn to communicate using a private language. We apply our architecture to segment tumors in the TCGA dataset. A UNet-like architecture is used to generate input to the Sender network, which produces a symbolic sentence, and a Receiver network co-generates the segmentation mask based on that sentence. Our segmentation framework achieved similar or better performance compared with state-of-the-art segmentation methods. In addition, our results suggest that the symbolic sentences can be interpreted directly to discriminate between normal and tumor tissue, tumor morphology, and other image characteristics.
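To make the Sender–Receiver pipeline in the abstract concrete, here is a minimal numpy sketch of the communication round: the Sender maps high-layer segmentation features to a symbolic sentence drawn from a categorical distribution, and the Receiver decodes that sentence into a segmentation mask. All sizes (vocabulary, sentence length, feature and mask dimensions) and the linear Sender/Receiver maps are illustrative assumptions, not the paper's actual architecture; the paper uses learned networks, and training such a discrete channel end-to-end typically relies on techniques like Gumbel-Softmax relaxation or REINFORCE.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = 8      # hypothetical symbol vocabulary size
SENT_LEN = 4   # hypothetical sentence length
FEAT = 16      # hypothetical feature dim from the segmentation network's high layer
MASK = 6       # hypothetical (tiny) output mask side length

# Sender: maps high-level features to logits over a categorical
# distribution for each symbol position in the sentence.
W_send = rng.normal(size=(FEAT, SENT_LEN * VOCAB))

def sender(features: np.ndarray) -> np.ndarray:
    logits = (features @ W_send).reshape(SENT_LEN, VOCAB)
    # Greedy draw from each categorical distribution; during training one
    # would sample (e.g. with a Gumbel-Softmax relaxation) so gradients
    # can flow through the discrete channel.
    return logits.argmax(axis=1)

# Receiver: embeds the received symbols and decodes a binary mask.
E_recv = rng.normal(size=(VOCAB, FEAT))
W_recv = rng.normal(size=(SENT_LEN * FEAT, MASK * MASK))

def receiver(sentence: np.ndarray) -> np.ndarray:
    emb = E_recv[sentence].reshape(-1)          # look up and flatten symbol embeddings
    mask_logits = emb @ W_recv                  # decode to per-pixel logits
    return (mask_logits > 0).astype(np.uint8).reshape(MASK, MASK)

# One communication round: features -> sentence -> mask.
features = rng.normal(size=FEAT)
sentence = sender(features)
mask = receiver(sentence)
print(sentence, mask.shape)
```

Because the only information reaching the Receiver is the discrete sentence, the symbols are forced to encode segmentation-relevant content, which is what makes them interpretable (e.g., as tumor vs. normal tissue codes) once the two agents converge on a shared protocol.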
