Communications in Mathematical Sciences
Volume 22 (2024)
Number 2
Reproducing activation function for deep learning
Pages: 285 – 314
DOI: https://dx.doi.org/10.4310/CMS.2024.v22.n2.a1
Abstract
We propose reproducing activation functions (RAFs), motivated by applied and computational harmonic analysis, to improve deep learning accuracy for applications ranging from computer vision to scientific computing. The idea is to take several basic functions and form a learnable linear combination of them, yielding a data-driven activation function for each neuron. Armed with RAFs, neural networks (NNs) can reproduce traditional approximation tools and therefore approximate target functions with fewer parameters than traditional NNs. As demonstrated by extensive numerical tests, RAFs can facilitate the convergence of deep learning optimization toward solutions of higher accuracy than existing deep learning solvers for audio/image/video reconstruction, PDEs, and eigenvalue problems. With RAFs, the errors of audio/video reconstruction, PDEs, and eigenvalue problems are decreased by over 14%, 73%, and 99%, respectively, compared with baselines, while the performance of image reconstruction increases by 58%. Numerically, in NN training, RAFs generate neural tangent kernels with better condition numbers than traditional activation functions, which offers a perspective for understanding the improved optimization convergence via the theory of the neural tangent kernel. The code is available at https://github.com/LeungSamWai/Reproducing-Activation-Function.
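The neuron-wise construction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the particular basis (ReLU, sine, Gaussian), the layer shapes, and all names are assumptions chosen to show the idea of a learnable per-neuron combination of basic functions.

```python
import numpy as np

def basis(x):
    """Evaluate the basic functions elementwise; output shape (..., 3).
    The choice of ReLU, sine, and a Gaussian here is illustrative."""
    return np.stack([np.maximum(x, 0.0),   # piecewise-linear (spline-like) component
                     np.sin(x),            # Fourier-type component
                     np.exp(-x ** 2)],     # radial-basis-type component
                    axis=-1)

class RAFLayer:
    """Dense layer whose activation is neuron-wise and data-driven:
    coeffs[j] weights the basic functions for neuron j and would be
    trained jointly with the weights W and bias b."""

    def __init__(self, d_in, d_out, rng):
        self.W = rng.standard_normal((d_in, d_out)) / np.sqrt(d_in)
        self.b = np.zeros(d_out)
        self.coeffs = rng.standard_normal((d_out, 3)) * 0.1  # learnable per neuron

    def __call__(self, x):
        z = x @ self.W + self.b                       # pre-activation, shape (n, d_out)
        # Combine the basic functions with each neuron's own coefficients.
        return np.einsum('nok,ok->no', basis(z), self.coeffs)

rng = np.random.default_rng(0)
layer = RAFLayer(4, 8, rng)
y = layer(rng.standard_normal((5, 4)))
print(y.shape)  # (5, 8)
```

In a training framework the entries of `coeffs` would simply be registered as additional trainable parameters, so the activation shape of every neuron adapts to the data alongside the usual weights.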
Keywords
deep neural network, activation function, neural tangent kernel, polynomials, Fourier basis, wavelets, radial basis functions
2010 Mathematics Subject Classification
62M45, 65M75, 65N75
Received 15 June 2022
Received revised 19 June 2023
Accepted 19 June 2023
Published 1 February 2024