Spike deconvolution is the problem of recovering point sources from their convolution with a known point spread function; it plays a fundamental role in many sensing and imaging applications. In this paper, we investigate the local geometry of recovering the parameters of the point sources—including both amplitudes and locations—by minimizing a natural nonconvex least-squares loss function that measures the observation residuals. We propose preconditioned variants of gradient descent (GD), where the search direction is scaled via carefully designed preconditioning matrices. We begin with a simple fixed preconditioner design, which adjusts the learning rates of the locations at a different scale from those of the amplitudes, and show that it achieves a linear rate of convergence—in terms of entrywise errors—when initialized close to the ground truth, as long as the separation between the true spikes is sufficiently large. However, the convergence rate slows down significantly when the dynamic range of the source amplitudes is large. To address this issue, we introduce an adaptive preconditioner design, which compensates for the learning rates of different sources in an iteration-varying manner based on the current estimate. The adaptive design provably leads to an accelerated convergence rate that is independent of the dynamic range, highlighting the benefit of adaptive preconditioning in nonconvex spike deconvolution. Numerical experiments are provided to corroborate the theoretical findings.
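To make the setup concrete, the following is a minimal numerical sketch of the two preconditioning schemes described above, under illustrative assumptions not taken from the paper: a Gaussian point spread function, a hand-picked fixed scaling of `sigma**2` for the location steps, and a hypothetical `1/a_i**2` amplitude-dependent rescaling for the adaptive variant. It is meant only to illustrate the structure of preconditioned GD on the least-squares loss, not to reproduce the paper's exact preconditioners or guarantees.

```python
import numpy as np

sigma = 0.05                          # PSF width (illustrative choice)
grid = np.linspace(0.0, 1.0, 400)     # sampling grid for the observation

def psf(t):
    # Gaussian point spread function g(t)
    return np.exp(-t**2 / (2 * sigma**2))

def dpsf(t):
    # derivative g'(t)
    return -t / sigma**2 * psf(t)

a_true = np.array([1.0, 0.1])         # amplitudes with a large dynamic range
x_true = np.array([0.3, 0.7])         # well-separated spike locations
y = psf(grid[:, None] - x_true) @ a_true   # noiseless observation

def grads(a, x):
    # gradients of the least-squares loss 0.5 * || sum_i a_i g(. - x_i) - y ||^2
    G = psf(grid[:, None] - x)        # (grid, spikes) design matrix
    r = G @ a - y                     # residual
    g_a = G.T @ r                     # d loss / d a_i
    g_x = -a * (dpsf(grid[:, None] - x).T @ r)   # d loss / d x_i
    return g_a, g_x

def precond_gd(adaptive, iters=500, eta=0.002):
    # initialize close to the ground truth, as the local analysis assumes
    a = a_true * 1.1
    x = x_true + 0.01
    for _ in range(iters):
        g_a, g_x = grads(a, x)
        # fixed preconditioner: step the locations at the scale sigma**2,
        # different from the amplitude scale
        step_x = eta * sigma**2 * g_x
        if adaptive:
            # adaptive preconditioner: additionally rescale each location
            # step by the current amplitude estimate (hypothetical 1/a_i^2)
            step_x = step_x / a**2
        a = a - eta * g_a
        x = x - step_x
    return a, x

a_fix, x_fix = precond_gd(adaptive=False)
a_ada, x_ada = precond_gd(adaptive=True)
```

With this toy configuration, the small spike's location is updated far more slowly under the fixed preconditioner (its effective step scales with its squared amplitude), while the adaptive rescaling equalizes the per-spike rates, mirroring the dynamic-range-independent rate claimed above.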