Special Session 125: Analysis, Algorithms, and Applications of Neural Networks

Expressivity and Approximation Properties of Deep Neural Networks with ReLU$^k$ Activation
Tong Mao
King Abdullah University of Science and Technology
Saudi Arabia
Co-Author(s):    J. He, J. Xu
Abstract:
Deep ReLU$^k$ networks can represent higher-degree polynomials exactly. We provide a comprehensive constructive proof of polynomial representation by deep ReLU$^k$ networks, which yields explicit upper bounds on the size of the network and the number of its parameters. Consequently, we demonstrate a suboptimal approximation rate for functions from Sobolev spaces as well as for analytic functions. Additionally, we show that deep ReLU$^k$ networks can approximate functions from a range of variation spaces beyond those generated by the ReLU$^k$ activation function itself, demonstrating their adaptability in approximating functions from a variety of function classes.