Extracting formulae in many-valued logic from deep neural networks

Authors

Yani Zhang and Helmut Bölcskei

Reference

IEEE Transactions on Signal Processing, Mar. 2025, submitted.


Abstract

We propose a new perspective on deep rectified linear unit (ReLU) networks, namely as circuit counterparts of Łukasiewicz infinite-valued logic, a many-valued (MV) generalization of Boolean logic. An algorithm for extracting formulae in MV logic from trained deep ReLU networks is presented. The algorithm respects the network architecture, in particular compositionality, thereby honoring algebraic information present in the training data. We also establish the representation benefits of deep networks from a mathematical logic perspective.
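To make the correspondence between ReLU networks and Łukasiewicz logic concrete, the following sketch (an illustration of the general connection, not the paper's extraction algorithm) shows that the standard Łukasiewicz connectives on [0, 1] are exactly realizable by small ReLU computations: negation ¬x = 1 − x, strong conjunction x ⊙ y = max(0, x + y − 1), and strong disjunction x ⊕ y = min(1, x + y).

```python
def relu(t: float) -> float:
    """Rectified linear unit: max(0, t)."""
    return max(0.0, t)

def neg(x: float) -> float:
    """Łukasiewicz negation: 1 - x."""
    return 1.0 - x

def strong_conj(x: float, y: float) -> float:
    """Strong conjunction x ⊙ y = max(0, x + y - 1), one ReLU."""
    return relu(x + y - 1.0)

def strong_disj(x: float, y: float) -> float:
    """Strong disjunction x ⊕ y = min(1, x + y) = 1 - relu(1 - x - y)."""
    return 1.0 - relu(1.0 - x - y)
```

On Boolean inputs {0, 1} these reduce to ordinary AND and OR, which is the sense in which ReLU networks generalize Boolean circuits to many-valued truth degrees.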

Keywords

Mathematical logic, many-valued logic, McNaughton functions, deep neural networks



Copyright Notice: © 2025 Y. Zhang and H. Bölcskei.

This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.