Stable architectures for deep neural networks

Cited: 330
Authors
Haber, Eldad [1 ,3 ]
Ruthotto, Lars [2 ,3 ]
Affiliations
[1] Univ British Columbia, Dept Earth & Ocean Sci, Vancouver, BC, Canada
[2] Emory Univ, Dept Math & Comp Sci, Atlanta, GA 30322 USA
[3] Xtract Technol Inc, Vancouver, BC, Canada
Funding
US National Science Foundation;
Keywords
machine learning; deep neural networks; dynamic inverse problems; PDE-constrained optimization; parameter estimation; image classification; optimization methods;
DOI
10.1088/1361-6420/aa9a90
Chinese Library Classification
O29 [Applied Mathematics];
Discipline code
070104;
Abstract
Deep neural networks have become invaluable tools for supervised machine learning, e.g. classification of text or images. While often offering superior results over traditional techniques and successfully expressing complicated patterns in data, deep architectures are known to be challenging to design and train such that they generalize well to new data. Critical issues with deep architectures are numerical instabilities in derivative-based learning algorithms, commonly called exploding or vanishing gradients. In this paper, we propose new forward propagation techniques inspired by systems of ordinary differential equations (ODEs) that overcome this challenge and lead to well-posed learning problems for arbitrarily deep networks. The backbone of our approach is our interpretation of deep learning as a parameter estimation problem for nonlinear dynamical systems. Given this formulation, we analyze the stability and well-posedness of deep learning and use this new understanding to develop new network architectures. We relate the exploding and vanishing gradient phenomenon to the stability of the discrete ODE and present several strategies for stabilizing deep learning in very deep networks. While our new architectures restrict the solution space, several numerical experiments show their competitiveness with state-of-the-art networks.
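The ODE viewpoint described in the abstract can be illustrated with a minimal sketch (not the authors' code; all names and the step size are illustrative assumptions): a residual layer y ← y + h·σ(Ky + b) is one forward-Euler step of the ODE y'(t) = σ(K(t)y + b(t)), and choosing an antisymmetric weight matrix K = A − Aᵀ, one of the stabilization strategies discussed in this line of work, keeps the eigenvalues of K on the imaginary axis so features neither blow up nor collapse across many layers.

```python
import numpy as np

def forward_euler_net(y0, Ks, bs, h, act=np.tanh):
    """Propagate features through a residual network viewed as a
    forward-Euler discretization of y'(t) = act(K(t) y + b(t)).
    Illustrative sketch only; names and step size h are assumptions."""
    y = y0
    for K, b in zip(Ks, bs):
        y = y + h * act(K @ y + b)  # one residual layer = one Euler step
    return y

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
K = A - A.T          # antisymmetric: eigenvalues are purely imaginary
bs = [np.zeros(4)] * 20
y = forward_euler_net(rng.standard_normal(4), [K] * 20, bs, h=0.1)
```

With a bounded activation and a small step size, each layer perturbs the state by at most h per component, so the propagated features stay finite even for a deep (here, 20-layer) stack.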
Pages: 22