Neural Network Theory
Offered in:
- Data Science Master: Information and Learning
- Electrical Engineering and Information Technology Master: Core Courses (Kernfächer)
- Electrical Engineering and Information Technology Master: Advanced Core Courses
- Electrical Engineering and Information Technology Master: Specialization Courses (Vertiefungsfächer)
- Electrical Engineering and Information Technology Master: Recommended Subjects (Empfohlene Fächer)
- Computer Science Master: Computer Science Elective Courses
- Mathematics Master: Selection: Further Realms (Auswahl: Weitere Gebiete)
- Physics Master: General Electives (Allgemeine Wahlfächer)
- Computational Science and Engineering Master: Electives (Wahlfächer)
- Statistics Master: Statistical and Mathematical Courses (Statistische und mathematische Fächer)
Basic information:
Lecture: | Tuesday, 10:15-12:00, HG F 5. |
Exercise session: | Tuesday, 12:15-13:00, HG F 5. |
Instructor: | Prof. Dr. Helmut Bölcskei |
Teaching assistant: | Valentin Abadie |
Office hours: | Thursday, 16:15-17:00 in ETF E 114. Please contact the TA if you are planning to attend. |
Lecture notes: | The download link is provided below. |
Recordings: | Please note that recordings from past years will not be made available. |
Credits: | 4 ECTS credits |
Course structure: | The class will be taught in English. There will be a 180-minute written exam, also in English. |
News
We will post important announcements, links, and other information here in the course of the semester.
- The first lecture takes place on Tuesday, Sept. 16, 10:15-12:00, followed by the first exercise session from 12:15 to 13:00.
Course Information
The class focuses on fundamental mathematical aspects of neural networks, with an emphasis on deep networks. Topics include:
- Universal approximation with single- and multi-layer networks (see the illustrative sketch after this list)
- Introduction to approximation theory: Fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, non-linear approximation theory
- Fundamental limits of deep neural network learning
- Geometry of decision surfaces
- Separating capacity of nonlinear decision surfaces
- Vapnik-Chervonenkis (VC) dimension
- VC dimension of neural networks
- Generalization error in neural network learning
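As a rough illustration of the first topic above, here is a minimal sketch (assuming only NumPy) of what universal approximation means in practice: a single-hidden-layer ReLU network, with hidden weights and biases drawn at random and only the output weights fit by least squares, can already approximate a smooth function such as sin(x) to small uniform error on a compact interval.

```python
# Minimal sketch: a single-hidden-layer ReLU network approximating sin(x) on [0, 2*pi].
# Hidden weights and biases are drawn at random; only the output layer is fit by least squares.
import numpy as np

rng = np.random.default_rng(0)

n_hidden = 200
x = np.linspace(0.0, 2.0 * np.pi, 500)                 # sampling grid
y = np.sin(x)                                           # target function

w = rng.normal(scale=2.0, size=n_hidden)                # hidden weights (random)
b = rng.uniform(-2.0 * np.pi, 2.0 * np.pi, n_hidden)    # hidden biases (random)

# Hidden-layer activations: ReLU(w_j * x + b_j), one column per hidden unit.
H = np.maximum(x[:, None] * w[None, :] + b[None, :], 0.0)

# Output weights by linear least squares.
c, *_ = np.linalg.lstsq(H, y, rcond=None)

y_hat = H @ c
print("max approximation error on the grid:", np.max(np.abs(y_hat - y)))
```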
Prerequisites
The course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular.
Lecture notes
Problem sets and solutions
There will be several problem sets for this course to help you better understand the lectures and prepare for the exam. All problem sets will be discussed in the exercise sessions, and the solutions will be uploaded afterwards.
Problems | Solutions |
Set 1 | Solution 1 |
Set 2 | |
Set 3 | |
Set 4 | |
Set 5 | |
Set 6 | |
Set 7 | |
Set 8 | |
Set 9 | |
Set 10 | |
Set 11 | |
Set 12 | |
Set 13 | |
Set 14 | |
Previous exams and solutions
Winter Exam 2020: | Problems | Solutions |
Summer Exam 2020: | Problems | Solutions |
Winter Exam 2021: | Problems | Solutions |
Summer Exam 2021: | Problems | Solutions |
Winter Exam 2022: | Problems | Solutions |
Summer Exam 2022: | Problems | Solutions |
Winter Exam 2024: | Problems | Solutions |