Johannes Schmidt-Hieber - Mathematics for Deep Neural Networks: Statistical theory for deep ReLU networks (Lecture 4/5)
From Katie Gentilello on March 28th, 2019
We outline the theory underlying recent bounds on the estimation risk of deep ReLU networks. In the lecture, we discuss specific properties of the ReLU activation function that relate to skip connections and to the efficient approximation of polynomials. Building on this, we show how risk bounds can be obtained for sparsely connected networks.
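The efficient approximation of polynomials by ReLU networks mentioned above can be illustrated with the well-known construction (due to Yarotsky, and used in Schmidt-Hieber's risk bounds) that approximates x^2 on [0, 1] by composing a triangular "hat" function with itself. The sketch below is an illustrative NumPy implementation, not taken from the lecture; the function names `hat` and `approx_square` are mine. After m compositions the sup-norm error is at most 2^(-2m-2), so depth buys exponential accuracy.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Triangle function on [0, 1]: 2x on [0, 1/2], 2(1 - x) on [1/2, 1].
    # It is an exact shallow ReLU network with three units.
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def approx_square(x, m):
    # Yarotsky-style approximation of x^2 on [0, 1]:
    #   x^2 ≈ x - sum_{k=1}^{m} g_k(x) / 4^k,
    # where g_k is the k-fold composition of the hat function.
    # Each composition is one more hidden layer, so this is a
    # depth-O(m) ReLU network with sup-norm error <= 2^(-2m-2).
    out = np.asarray(x, dtype=float).copy()
    g = np.asarray(x, dtype=float)
    for k in range(1, m + 1):
        g = hat(g)          # g_k = hat applied k times
        out = out - g / 4**k
    return out

x = np.linspace(0.0, 1.0, 1001)
err = np.max(np.abs(approx_square(x, 6) - x**2))
# With m = 6 the error is bounded by 2^(-14) ≈ 6.1e-5.
```

The key point for the statistical theory is that the network size needed for accuracy epsilon grows only logarithmically in 1/epsilon, which is what makes the sparse-network risk bounds possible.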