Efficient Algorithms for Learning One Convolutional Layer - Adam Klivans
From Scott Jacobson on October 22nd, 2018
Our algorithm, Convotron, is inspired by recent work applying isotonic regression to learning neural networks. Convotron uses a simple, stochastic, iterative update rule that is tolerant to noise. In contrast to gradient descent, Convotron requires no special initialization or learning-rate tuning to converge to the global optimum.
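To make the description concrete, here is a minimal sketch of an isotonic-regression-style update of the kind the abstract alludes to. This is an illustrative simplification, not the paper's exact algorithm: the ReLU activation, the zero initialization, the fixed step size `eta`, and the function name `convotron_sketch` are all assumptions made for the example. The key feature it illustrates is that the update uses the raw residual `y - pred` times the patch average, never the derivative of the activation.

```python
import numpy as np

def convotron_sketch(patches, labels, n_iters=2000, eta=0.05, seed=0):
    """Hypothetical Convotron-style learner for one convolutional filter.

    patches: array of shape (n_samples, n_patches, filter_dim); each row
             holds the patches extracted from one input.
    labels:  array of shape (n_samples,); assumed generated as the average
             of a monotone activation applied to each patch response.
    """
    rng = np.random.default_rng(seed)
    n, k, d = patches.shape
    w = np.zeros(d)                      # trivial initialization suffices
    relu = lambda z: np.maximum(z, 0.0)  # assumed monotone activation
    for _ in range(n_iters):
        i = rng.integers(n)              # stochastic: one example per step
        x = patches[i]                   # (k, d) patches of this example
        pred = relu(x @ w).mean()        # average activated patch response
        # Isotonic-regression-style rule: the residual multiplies the raw
        # patch average; no gradient of the activation is ever computed.
        w += eta * (labels[i] - pred) * x.mean(axis=0)
    return w

# Usage on synthetic realizable data with a hidden ground-truth filter:
rng = np.random.default_rng(1)
w_true = rng.standard_normal(5)
X = rng.standard_normal((200, 4, 5))               # 200 inputs, 4 patches each
y = np.maximum(X @ w_true, 0.0).mean(axis=1)       # labels from w_true
w_learned = convotron_sketch(X, y, n_iters=5000)
```

Starting from the all-zeros filter, repeated stochastic updates of this form drive the prediction error down on the synthetic data above, matching the abstract's point that no careful initialization or step-size schedule is needed.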