Paper Title

A Data-driven Neural Network Architecture for Sentiment Analysis

Authors

Erion Çano, Maurizio Morisio

Abstract

The impressive results of convolutional neural networks in image-related tasks have attracted the attention of researchers in text mining, sentiment analysis, and other text analysis fields. It is, however, difficult to find enough data to feed such networks, optimize their parameters, and make the right design choices when constructing network architectures. In this paper we present the steps taken to create two big datasets of song emotions. We also explore the use of convolutional and max-pooling neural layers on song lyrics, product review, and movie review text datasets. Three variants of a simple and flexible neural network architecture are also compared. Our intention was to spot important patterns that can serve as guidelines for the parameter optimization of similar models. We also wanted to identify the architecture design choices that lead to high-performing sentiment analysis models. To this end, we conducted a series of experiments with neural architectures of various configurations. Our results indicate that parallel convolutions with filter lengths up to three are usually enough to capture relevant text features. Also, the max-pooling region size should be adapted to the length of the text documents to produce the best feature maps. Our top results were obtained with feature maps of length 6 to 18. A possible improvement for future neural network models for sentiment analysis could be to generate the sentiment polarity prediction of a document by aggregating predictions on smaller excerpts of the entire text.
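The two design findings in the abstract (parallel convolutions with filter widths up to three, and a pooling region tied to document length) can be illustrated with a minimal sketch. This is not the authors' code: the toy document, the filter values, and the pooling rule below are all illustrative assumptions, and real models would operate on word-embedding matrices rather than scalar sequences.

```python
def conv1d(seq, filt):
    """Valid 1-D convolution of a list of floats with a filter."""
    w = len(filt)
    return [sum(seq[i + j] * filt[j] for j in range(w))
            for i in range(len(seq) - w + 1)]

def max_pool(feature_map, region):
    """Non-overlapping max-pooling with the given region size."""
    return [max(feature_map[i:i + region])
            for i in range(0, len(feature_map), region)]

# Toy 1-D "document" of 12 scalar features (illustrative values only).
doc = [0.1, 0.4, -0.2, 0.9, 0.3, -0.5, 0.7, 0.0, 0.2, -0.1, 0.6, 0.8]

# Parallel branches with filter widths 1, 2, and 3, as the abstract
# reports filter lengths up to three being sufficient.
filters = {1: [1.0], 2: [0.5, 0.5], 3: [1.0, -1.0, 1.0]}
branches = {w: conv1d(doc, f) for w, f in filters.items()}

# Choose the pooling region from the document length, so each branch
# yields a short pooled vector; the abstract reports feature maps of
# length 6 to 18 working best.
region = max(1, len(doc) // 6)
pooled = {w: max_pool(fm, region) for w, fm in branches.items()}
```

In a full model, the pooled vectors from all branches would be concatenated and fed to a dense classification layer; the point of the sketch is only how the filter widths and the length-dependent pooling region shape the feature maps.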
