Paper Title
An Outer-approximation Guided Optimization Approach for Constrained Neural Network Inverse Problems
Paper Authors
Paper Abstract
This paper discusses an outer-approximation guided optimization method for constrained neural network inverse problems with rectified linear units. A constrained neural network inverse problem is an optimization problem that seeks the input values of a given trained neural network that produce a predefined desired output, subject to constraints on the input values. This paper analyzes the characteristics of optimal solutions of neural network inverse problems with rectified linear units and, by exploiting these characteristics, proposes an outer-approximation algorithm. The proposed outer-approximation guided optimization comprises a primal phase and a dual phase. The primal phase combines neighbor curvatures with neighbor outer-approximations to expedite the search. The dual phase identifies and exploits the structure of local convex regions to improve convergence to a locally optimal solution. Finally, computational experiments demonstrate the superiority of the proposed algorithm over a projected gradient method.
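To make the problem setting concrete, the following is a minimal sketch of the projected gradient baseline the paper compares against, applied to a constrained inverse problem on a tiny ReLU network. All weights, the target output, the box constraints, and the step size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical pre-trained 2-layer ReLU network (weights are illustrative).
W1 = np.array([[1.0, -0.5], [0.5, 1.0]])
b1 = np.array([0.0, 0.1])
W2 = np.array([[1.0, 1.0]])
b2 = np.array([0.0])

def forward(x):
    h = np.maximum(W1 @ x + b1, 0.0)        # ReLU hidden layer
    return W2 @ h + b2

def grad_loss(x, y_star):
    """Gradient of 0.5 * ||f(x) - y*||^2 w.r.t. x (a subgradient at ReLU kinks)."""
    z = W1 @ x + b1
    h = np.maximum(z, 0.0)
    r = (W2 @ h + b2) - y_star              # output residual
    dh = (W2.T @ r) * (z > 0)               # backprop through the ReLU
    return W1.T @ dh

y_star = np.array([1.5])                    # predefined desired output
lo, hi = np.array([0.0, 0.0]), np.array([2.0, 2.0])  # box constraints on the input

x = np.array([0.5, 0.5])                    # feasible starting point
for _ in range(500):
    # Gradient step followed by projection onto the box [lo, hi].
    x = np.clip(x - 0.1 * grad_loss(x, y_star), lo, hi)

print(x, forward(x))                        # input found and its network output
```

Because the ReLU network is piecewise linear, the loss landscape is piecewise quadratic with many local convex regions; projected gradient can stall at poor stationary points, which is the weakness the proposed outer-approximation guided method targets.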