Optimizing neural networks is a highly nonconvex problem, and even optimizing a 2-layer neural network can be challenging. In recent years, many different approaches have been proposed for learning 2-layer neural networks under various assumptions. This talk will give a brief survey of these approaches and discuss some new results using spectral methods and the optimization landscape.