
Collaborating Authors

Arora et al.



Efficient Convex Relaxations for Streaming PCA

Raman Arora, Teodor Vanislavov Marinov

Neural Information Processing Systems

Theorem 4.2. The following holds for Algorithm 2: with probability at least $1-\delta$, for all $t \leq T$,
$$\langle P - P_t, C \rangle \leq \frac{32 \log(3e/\delta)}{\Delta(C)^2\,(t+1)},$$
where $\Delta = \Delta(C)$.

[Figure 1: Experiments on synthetic data (invariant to n).]
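The bound above concerns a streaming estimate $P_t$ of the top principal subspace of a covariance matrix $C$. As an illustrative baseline only (this is Oja's rule, not the authors' convex-relaxation Algorithm 2; the step size, dimensions, and synthetic stream are assumptions), a minimal sketch of streaming top-eigenvector estimation:

```python
import random

def oja_step(w, x, eta):
    # Oja's rule: move w toward x, scaled by the projection <x, w>,
    # then renormalize so w stays on the unit sphere.
    proj = sum(wi * xi for wi, xi in zip(w, x))
    w = [wi + eta * proj * xi for wi, xi in zip(w, x)]
    norm = sum(wi * wi for wi in w) ** 0.5
    return [wi / norm for wi in w]

random.seed(0)
w = [2 ** -0.5, 2 ** -0.5]  # arbitrary unit-norm starting vector
for _ in range(2000):
    # Synthetic stream: the first coordinate has 9x the variance of the
    # second, so the top principal direction is (1, 0) up to sign.
    x = [random.gauss(0.0, 3.0), random.gauss(0.0, 1.0)]
    w = oja_step(w, x, eta=0.01)

print(w)  # first coordinate dominates as w aligns with the top eigenvector
```

Each update costs O(d) per sample, which is the appeal of streaming PCA over batch eigendecomposition of the empirical covariance.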

Neural Information Processing Systems

The rectified linear unit (ReLU) [Fukushima, 1980, Nair and Hinton, 2010] activation has been by far the most widely used nonlinearity and one of the most successful building blocks in deep neural networks (DNNs).
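For reference, the ReLU is simply $\max(0, x)$ applied elementwise; a minimal pure-Python sketch (the function name and list-based input are illustrative choices, not from the paper):

```python
def relu(xs):
    # ReLU: identity on positive inputs, zero on negative inputs.
    return [max(0.0, v) for v in xs]

print(relu([-2.0, -0.5, 0.0, 1.5]))  # [0.0, 0.0, 0.0, 1.5]
```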



Neural Information Processing Systems

A motivating question behind the results in this paper is to understand the hierarchy of function classes exactly represented by neural networks of increasing depth.