Metric-entropy limits on nonlinear dynamical system learning
Authors
Yang Pan, Clemens Hutter, and Helmut Bölcskei
Reference
Information Theory, Probability and Statistical Learning: A Festschrift in Honor of Andrew Barron, Springer, June 2024, submitted (invited paper).
Abstract
This paper is concerned with the fundamental limits of learning nonlinear dynamical systems from input-output traces. Specifically, we show that recurrent neural networks (RNNs) can learn, in a metric-entropy-optimal manner, nonlinear systems that satisfy a Lipschitz property and forget past inputs sufficiently fast. As the sets of sequence-to-sequence maps realized by the dynamical systems under consideration are significantly more massive than the function classes usually studied in deep neural network approximation theory, a refined characterization of their metric entropy is needed, namely in terms of order, type, and generalized dimension. We compute these quantities for the classes of exponentially- and polynomially-decaying Lipschitz fading-memory systems and show that RNNs can achieve them.
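As background, the notion of metric entropy invoked above can be recalled as follows (standard Kolmogorov-Tikhomirov terminology; the notation here is illustrative and not taken from the paper itself): for a compact subset $\mathcal{C}$ of a metric space $(\mathcal{X},\rho)$ and resolution $\varepsilon>0$,
\[
  H(\varepsilon;\mathcal{C}) \;=\; \log_2 N(\varepsilon;\mathcal{C}),
\]
where $N(\varepsilon;\mathcal{C})$ denotes the smallest number of $\rho$-balls of radius $\varepsilon$ needed to cover $\mathcal{C}$. The growth of $H(\varepsilon;\mathcal{C})$ as $\varepsilon\to 0$ quantifies the massiveness of the class and thereby lower-bounds the number of bits any learning procedure must extract to identify an element of $\mathcal{C}$ to accuracy $\varepsilon$.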
Keywords
Nonlinear dynamical systems, recurrent neural networks, metric entropy, fading-memory systems, neural network theory, quantization
Copyright Notice: © 2024 Y. Pan, C. Hutter, and H. Bölcskei.
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.