Basic power method convergence

Let $\gamma = \frac{\sigma_1 - \sigma_2}{\sigma_1}$ be a parameter capturing the "gap" between the first and second largest singular values. If the power method is initialized with a random Gaussian vector, then with high probability, after $T = O\!\left(\frac{\log(d/\epsilon)}{\gamma}\right)$ steps, we have either:

$$\lVert \mathbf{v}_1 - \mathbf{z}^{(T)}\rVert_2 \leq \epsilon \qquad \textrm{or} \qquad \lVert \mathbf{v}_1 - (-\mathbf{z}^{(T)})\rVert_2 \leq \epsilon.$$
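
For concreteness, here is a minimal numpy sketch of the iteration the statement refers to: power iteration on $A^\top A$ starting from a random Gaussian vector. The function name `power_method` and the choice to iterate with $A^\top A$ (rather than, say, alternating multiplications with separate normalization) are assumptions made for illustration, not taken from the text.

```python
import numpy as np

def power_method(A, num_iters, rng=None):
    """Estimate the top right singular vector v_1 of A via power iteration on A^T A."""
    rng = np.random.default_rng() if rng is None else rng
    d = A.shape[1]
    z = rng.standard_normal(d)      # random Gaussian initialization
    z /= np.linalg.norm(z)
    for _ in range(num_iters):
        z = A.T @ (A @ z)           # multiply by A^T A without forming it explicitly
        z /= np.linalg.norm(z)      # renormalize so the iterate stays a unit vector
    return z                        # approximates v_1 up to sign
```

Because the guarantee only pins $\mathbf{z}^{(T)}$ down up to sign, a caller comparing against a known $\mathbf{v}_1$ should check both $\lVert \mathbf{v}_1 - \mathbf{z}^{(T)}\rVert_2$ and $\lVert \mathbf{v}_1 + \mathbf{z}^{(T)}\rVert_2$.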

The method won't converge if $\gamma$ is very small. Consider the extreme case when $\gamma = 0$, i.e. $\sigma_1 = \sigma_2$, so that the top singular vector $\mathbf{v}_1$ is not even uniquely determined.
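
A minimal sketch of the standard argument, assuming the initialization is expanded in the right singular basis: if $\mathbf{z}^{(0)} = \sum_i \alpha_i \mathbf{v}_i$, then each multiplication by $A^\top A$ scales the $i$-th coefficient by $\sigma_i^2$, so

$$\mathbf{z}^{(T)} \propto \sum_i \alpha_i\, \sigma_i^{2T}\, \mathbf{v}_i.$$

The iterate tips toward $\mathbf{v}_1$ only because $\sigma_1^{2T}$ outgrows $\sigma_2^{2T}$; when $\gamma = 0$, both coefficients grow at exactly the same rate and the $\mathbf{v}_2$ component never dies off, no matter how many steps are taken.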

#incomplete