We consider the estimation of a parameter $\mathbf{x}$ lying in a cone from nonlinear observations of the form $\{y_i=f_i(\langle\mathbf{a}_i,\mathbf{x}\rangle)\}_{i=1}^m$. We develop a unified approach that first constructs a gradient from the data and then establishes the restricted approximate invertibility condition (RAIC), which quantifies how well this gradient aligns with the ideal descent step. We show that the RAIC yields linear convergence guarantees for the standard projected gradient descent algorithm, a Riemannian gradient descent algorithm for low-Tucker-rank tensor estimation, and a factorized gradient descent algorithm for asymmetric low-rank matrix estimation. Under Gaussian designs, we establish sharp RAIC bounds for the canonical statistical estimation problems of single index models, generalized linear models, noisy phase retrieval, and one-bit compressed sensing. Combining the convergence guarantees with the RAIC bounds, we obtain a set of optimal statistical estimation results, including, to our knowledge, the first minimax-optimal and computationally efficient algorithms for tensor single index models, tensor logistic regression, (local) noisy tensor phase retrieval, and one-bit tensor sensing. Several of our other results are new or match the best known guarantees. We also provide simulations and a real-data experiment to illustrate the theoretical results.
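As a concrete illustration of the algorithmic template described above (and not the paper's own construction), the following minimal sketch runs projected gradient descent onto a cone in the special case of the linear model $f_i(u)=u$ with the cone of $s$-sparse vectors; the gradient construction, step size, and function names are illustrative assumptions, not the paper's RAIC-based method.

```python
import numpy as np

def project_sparse(x, s):
    """Project onto the cone of s-sparse vectors (hard thresholding)."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]  # indices of the s largest magnitudes
    out[idx] = x[idx]
    return out

def projected_gradient_descent(A, y, s, step=1.0, iters=100):
    """Hypothetical PGD sketch: descend along a gradient built from the
    data, then project back onto the cone after every step."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        g = A.T @ (A @ x - y) / m            # gradient constructed from the data
        x = project_sparse(x - step * g, s)  # ideal descent step, then cone projection
    return x

# Usage: recover a 5-sparse signal from m = 200 Gaussian measurements.
rng = np.random.default_rng(0)
n, m, s = 500, 200, 5
x_true = np.zeros(n)
x_true[:s] = rng.standard_normal(s)
A = rng.standard_normal((m, n))          # Gaussian design
y = A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = projected_gradient_descent(A, y, s)
print(np.linalg.norm(x_hat - x_true))    # small recovery error
```

In this sparse linear special case the iteration reduces to iterative hard thresholding; the paper's nonlinear settings (e.g., one-bit or phase-retrieval observations) would replace the residual $A\mathbf{x}-\mathbf{y}$ with a gradient tailored to $f_i$, which is exactly what the RAIC controls.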