A new network with super-approximation power is introduced. This network is built with Floor ($\lfloor x \rfloor$) or ReLU ($\max\{0,x\}$) activation function in each neuron; hence, we call such networks Floor-ReLU networks. For any hyperparameters $N \in \mathbb{N}^{+}$ and $L \in \mathbb{N}^{+}$, we show that Floor-ReLU networks with width $\max\{d,\, 5N+13\}$ and depth $64dL+3$ can uniformly approximate a Hölder function $f$ on $[0,1]^d$ with an approximation error $3\lambda d^{\alpha/2} N^{-\alpha\sqrt{L}}$, where $\alpha \in (0,1]$ and $\lambda$ are the Hölder order and constant, respectively. More generally, for an arbitrary continuous function $f$ on $[0,1]^d$ with a modulus of continuity $\omega_f(\cdot)$, the constructive approximation rate is $\omega_f(\sqrt{d}\, N^{-\sqrt{L}}) + 2\omega_f(\sqrt{d}) N^{-\sqrt{L}}$. As a consequence, this new class of networks overcomes the curse of dimensionality in approximation power when the variation of $\omega_f(r)$ as $r \to 0$ is moderate (e.g., $\omega_f(r) \lesssim r^{\alpha}$ for Hölder continuous functions), since the major term to be considered in our approximation rate is essentially $\sqrt{d}$ times a function of $N$ and $L$ independent of $d$ within the modulus of continuity.
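To make the $N^{-\sqrt{L}}$ decay in the title concrete, here is a minimal sketch that evaluates the width, depth, and Hölder error-bound formulas quoted in the abstract for a few hyperparameter choices. It is illustrative only, not code from the paper; the function names `holder_error_bound` and `network_size` are ours, and the formulas are those stated above.

```python
# Sketch: evaluate the Floor-ReLU network size and Hölder error bound
# from the abstract (width max{d, 5N+13}, depth 64dL+3, error
# 3 * lambda * d^(alpha/2) * N^(-alpha * sqrt(L))). Names are illustrative.
import math

def holder_error_bound(d: int, N: int, L: int,
                       alpha: float = 1.0, lam: float = 1.0) -> float:
    """Error bound 3 * lam * d^(alpha/2) * N^(-alpha*sqrt(L)) (abstract)."""
    return 3.0 * lam * d ** (alpha / 2) * N ** (-alpha * math.sqrt(L))

def network_size(d: int, N: int, L: int) -> tuple:
    """Width max{d, 5N+13} and depth 64dL+3 of the Floor-ReLU network."""
    width = max(d, 5 * N + 13)
    depth = 64 * d * L + 3
    return width, depth

if __name__ == "__main__":
    d = 100  # input dimension
    for N, L in [(2, 4), (4, 16), (8, 64)]:
        w, dp = network_size(d, N, L)
        err = holder_error_bound(d, N, L)
        print(f"N={N:>2}, L={L:>3}: width={w:>4}, depth={dp:>7}, "
              f"bound={err:.3e}")
```

Running this shows the point of the title: because the error scales like $N^{-\sqrt{L}}$, increasing the depth parameter $L$ drives the bound down super-exponentially in $\sqrt{L}$ for fixed width parameter $N$, at the cost of a depth that grows only linearly in $d$ and $L$.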
March 2021 issue; published online March 26, 2021.
Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth
Zuowei Shen, Haizhao Yang, Shijun Zhang
Department of Mathematics, Purdue University, West Lafayette, IN 47907, USA
Received: June 21, 2020
Accepted: October 26, 2020
Online ISSN: 1530-888X
Print ISSN: 0899-7667
© 2021 Massachusetts Institute of Technology
Neural Computation (2021) 33 (4): 1005–1036.
Citation
Zuowei Shen, Haizhao Yang, Shijun Zhang; Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth. Neural Comput 2021; 33 (4): 1005–1036. doi: https://doi.org/10.1162/neco_a_01364