Paper Title
Technical Report: Combining knowledge from Transfer Learning during training and Wide Resnets
Paper Authors
Paper Abstract
In this report, we combine the ideas of Wide ResNets and transfer learning to optimize the architecture of deep neural networks. The first improvement to the architecture is the use of all layers as an information source for the last layer. This idea comes from transfer learning, which uses networks pre-trained on other data and extracts different levels of the network as input for the new task. The second improvement is the use of wider layers instead of deeper sequences of blocks. This idea comes from Wide ResNets. With both optimizations, training with either heavy or standard data augmentation produces better results across different models. Link: https://github.com/wolfgangfuhl/PublicationStuff/tree/master/TechnicalReport1/Supp
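To make the two architectural ideas in the abstract concrete, the following is a minimal PyTorch sketch: each residual block is made wider (more channels) rather than stacking blocks deeper, and the pooled output of every block, not just the last one, is concatenated as input to the classifier, mirroring how transfer learning taps intermediate levels of a pre-trained network. All names here (WideBlock, AllLayersNet, the chosen widths) are illustrative assumptions, not the authors' actual implementation; see the linked repository for their code.

import torch
import torch.nn as nn
import torch.nn.functional as F


class WideBlock(nn.Module):
    """One wide residual block: more channels instead of more stacked blocks.
    (Hypothetical sketch; block layout is an assumption.)"""

    def __init__(self, in_ch: int, out_ch: int, stride: int = 1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        # 1x1 projection when the shape changes, identity otherwise.
        self.shortcut = (
            nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False)
            if (in_ch != out_ch or stride != 1)
            else nn.Identity()
        )

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + self.shortcut(x))


class AllLayersNet(nn.Module):
    """Feeds pooled features from every block into the classifier,
    so all layers serve as an information source for the last layer."""

    def __init__(self, num_classes: int = 10, widths=(64, 128, 256)):
        super().__init__()
        self.stem = nn.Conv2d(3, widths[0], 3, padding=1, bias=False)
        blocks, in_ch = [], widths[0]
        for w in widths:
            blocks.append(WideBlock(in_ch, w, stride=2))
            in_ch = w
        self.blocks = nn.ModuleList(blocks)
        # The classifier sees the concatenation of all block outputs.
        self.fc = nn.Linear(sum(widths), num_classes)

    def forward(self, x):
        x = self.stem(x)
        pooled = []
        for block in self.blocks:
            x = block(x)
            # Global average pooling turns each level into a feature vector.
            pooled.append(F.adaptive_avg_pool2d(x, 1).flatten(1))
        return self.fc(torch.cat(pooled, dim=1))


if __name__ == "__main__":
    model = AllLayersNet()
    logits = model(torch.randn(2, 3, 32, 32))
    print(logits.shape)  # torch.Size([2, 10])

In this sketch the classifier's input dimension is simply the sum of the block widths, so widening a block automatically enlarges the feature vector the last layer sees; deepening the network, by contrast, would only add more concatenated levels.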