Paper Title

Relevance Vector Machine with Weakly Informative Hyperprior and Extended Predictive Information Criterion

Paper Authors

Murayama, Kazuaki; Kawano, Shuichi.

Paper Abstract

In the variational relevance vector machine, the gamma distribution is a representative hyperprior over the noise precision of the automatic relevance determination prior. Instead of the gamma hyperprior, we propose to use the inverse gamma hyperprior with a shape parameter close to zero and a scale parameter not necessarily close to zero. This hyperprior is associated with the concept of a weakly informative prior. The effect of this hyperprior is investigated through regression on non-homogeneous data. Because it is difficult to capture the structure of such data with a single kernel function, we apply the multiple kernel method, in which multiple kernel functions with different widths are arranged for the input data. We confirm that the degrees of freedom in a model are controlled by adjusting the scale parameter while keeping the shape parameter close to zero. A candidate criterion for selecting the scale parameter is the predictive information criterion. However, the model estimated with this criterion appears to over-fit, because the multiple kernel method puts the model in a situation where the model dimension is larger than the data size. To select an appropriate scale parameter even in this situation, we also propose an extended predictive information criterion. We confirm that a multiple kernel relevance vector regression model with good predictive accuracy can be obtained by selecting the scale parameter that minimizes the extended predictive information criterion.
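
To make the setup above concrete, here is a minimal sketch (not the authors' code) of the two ingredients described in the abstract: an inverse gamma hyperprior with shape close to zero and a free scale parameter, and a multiple-kernel design matrix built from Gaussian kernels with several widths. All function names, widths, and parameter values below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import invgamma

# Weakly informative hyperprior on the noise precision: inverse gamma with
# shape a close to zero and a scale b left free to be tuned
# (the values below are illustrative placeholders, not from the paper).
a, b = 1e-6, 1.0
noise_precision_hyperprior = invgamma(a, scale=b)

def multiple_kernel_design_matrix(x, centers, widths):
    """Stack Gaussian kernel blocks with different widths column-wise.

    x       : (n,) input locations
    centers : (m,) kernel centres (typically the training inputs)
    widths  : iterable of kernel widths; one n x m block per width
    """
    blocks = []
    for h in widths:
        diff = x[:, None] - centers[None, :]
        blocks.append(np.exp(-0.5 * (diff / h) ** 2))
    # The stacked matrix has n rows and m * len(widths) columns, so the
    # model dimension easily exceeds the data size n, which is the
    # situation the extended predictive information criterion targets.
    return np.hstack(blocks)

# Example: 50 inputs and three kernel widths give a 50 x 150 design matrix.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, size=50))
Phi = multiple_kernel_design_matrix(x, x, widths=[0.02, 0.1, 0.5])
print(Phi.shape)  # (50, 150)
```

In a full implementation, a design matrix of this kind would enter the variational relevance vector regression updates, with one weight per column and the automatic relevance determination prior pruning most of them; the sketch only illustrates why the model dimension can exceed the number of observations.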
