Projected Gradient Descent Algorithms for Solving Nonlinear Inverse Problems with Generative Priors

Zhaoqiang Liu, Jun Han

Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 3271-3277. https://doi.org/10.24963/ijcai.2022/454

In this paper, we propose projected gradient descent (PGD) algorithms for signal estimation from noisy nonlinear measurements. We assume that the unknown signal lies near the range of a Lipschitz continuous generative model with bounded inputs. In particular, we consider the two cases in which the nonlinear link function is unknown or known. When the nonlinearity is unknown, we assume sub-Gaussian observations and propose a linear least-squares estimator. We show that when there is no representation error, the sensing vectors are Gaussian, and the number of samples is sufficiently large, a PGD algorithm converges linearly, with high probability, to a point achieving the optimal statistical rate from an arbitrary initialization. When the nonlinearity is known, we assume monotonicity, make much weaker assumptions on the sensing vectors, and allow for representation error. We propose a nonlinear least-squares estimator that is guaranteed to enjoy the optimal statistical rate, and we provide a corresponding PGD algorithm that is also shown to converge linearly to this estimator from an arbitrary initialization. In addition, we present experimental results on image datasets to demonstrate the performance of our PGD algorithms.
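As a toy illustration of the unknown-link setting, the sketch below runs PGD on the linear least-squares objective while projecting each iterate back onto the range of a generative model. All specifics are illustrative assumptions, not the paper's setup: a linear map stands in for the Lipschitz generator (making the projection exact), the dimensions and step size are arbitrary, and tanh plays the role of the unknown link, which the estimator itself never uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumed): latent k, ambient n, measurements m.
k, n, m = 10, 100, 60

# Stand-in generative model: a fixed linear map G(z) = Bz. The paper
# allows general Lipschitz generators; a linear one keeps the
# projection onto its range exact and cheap.
B = rng.standard_normal((n, k)) / np.sqrt(n)

# Ground-truth signal in the range of the generator (no representation error).
z_star = rng.standard_normal(k)
x_star = B @ z_star

# Gaussian sensing vectors and noisy nonlinear observations. The link
# (tanh here) is treated as unknown: the estimator below never uses it.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = np.tanh(A @ x_star) + 0.01 * rng.standard_normal(m)

def project(v):
    """Project v onto range(B) by solving min_z ||B z - v||_2."""
    z, *_ = np.linalg.lstsq(B, v, rcond=None)
    return B @ z

# PGD on the linear least-squares loss f(x) = ||A x - y||^2 / 2,
# started from an arbitrary point (here, zero).
eta = 0.3
x = np.zeros(n)
for _ in range(300):
    x = project(x - eta * A.T @ (A @ x - y))

# With an unknown link, x_star is only identifiable up to scaling,
# so we report alignment rather than Euclidean error.
cos = abs(x @ x_star) / (np.linalg.norm(x) * np.linalg.norm(x_star))
print(f"cosine similarity with ground truth: {cos:.3f}")
```

With a nonlinear generator, the closed-form projection above would typically be replaced by gradient descent in the latent space, which is the computationally expensive step in practice.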
Keywords:
Machine Learning: Unsupervised Learning
Machine Learning: Learning Theory
Machine Learning: Probabilistic Machine Learning
Machine Learning: Theory of Deep Learning