We consider the problem of sampling from a target probability distribution and computing expectations when only stochastic (noisy) estimates of the potential's gradient are available, rather than the true gradient. We analyze the accuracy, measured by the mean square error (MSE), of time averages obtained with the Stochastic Gradient UBU (SG-UBU) algorithm, a discretization of underdamped Langevin dynamics. A novel discrete Poisson equation framework is proposed to decompose the sampling error into bias and variance components. Using this framework, we establish bounds on the MSE of SG-UBU, quantifying its dependence on the step size, the number of iterations, the problem dimension, properties of the potential landscape, and the noise level. The analysis confirms the long-time stability of the numerical solutions generated by SG-UBU. Furthermore, we give an explicit characterization of the numerical bias introduced by the algorithm, showing that it decreases linearly with the step size, with the dominant term proportional to the variance of the stochastic gradient estimate. These findings are extended to the common mini-batch setting, where we examine the implications for computational cost and efficiency, especially in parallel computing environments. Numerical experiments confirm the theoretical analysis of the bias.
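To make the setting concrete, here is a minimal Python sketch of one plausible SG-UBU iteration: an exact Ornstein–Uhlenbeck half-step (U), a full velocity kick using a noisy gradient (B), and a second OU half-step (U), applied to a one-dimensional standard Gaussian target with artificially noised gradients. All parameter values, the noise model, and the function names below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sg_ubu_chain(grad_est, x0, v0, h, gamma, n_steps, rng):
    """SG-UBU sketch for dX = V dt, dV = -grad U(X) dt - gamma V dt + sqrt(2 gamma) dW.

    Each step: exact OU half-step (U), stochastic-gradient kick (B), OU half-step (U).
    """
    t = h / 2.0
    eta = np.exp(-gamma * t)
    # Joint covariance of the exact OU noise in (position, velocity) over time t
    var_v = 1.0 - eta ** 2
    var_x = (2.0 / gamma) * (t - 2.0 * (1.0 - eta) / gamma
                             + (1.0 - eta ** 2) / (2.0 * gamma))
    cov_xv = (2.0 / gamma) * ((1.0 - eta) - 0.5 * (1.0 - eta ** 2))
    L = np.linalg.cholesky(np.array([[var_x, cov_xv], [cov_xv, var_v]]))

    def ou_half(x, v):
        # Exact solve of dX = V dt, dV = -gamma V dt + sqrt(2 gamma) dW over h/2
        z = L @ rng.standard_normal((2,) + x.shape)
        return (x + (1.0 - eta) / gamma * v + z[0],
                eta * v + z[1])

    x, v = np.asarray(x0, float), np.asarray(v0, float)
    xs = np.empty((n_steps,) + x.shape)
    for k in range(n_steps):
        x, v = ou_half(x, v)
        v = v - h * grad_est(x, rng)   # B: kick with the noisy gradient
        x, v = ou_half(x, v)
        xs[k] = x
    return xs

# Hypothetical test problem: U(x) = x^2 / 2, so grad U(x) = x and E[X^2] = 1;
# sigma models the standard deviation of the stochastic gradient noise.
sigma = 0.5
def noisy_grad(x, rng):
    return x + sigma * rng.standard_normal(x.shape)

rng = np.random.default_rng(0)
n_chains, n_steps, burn = 1000, 2000, 500
xs = sg_ubu_chain(noisy_grad, np.zeros(n_chains), np.zeros(n_chains),
                  h=0.05, gamma=1.0, n_steps=n_steps, rng=rng)
second_moment = np.mean(xs[burn:] ** 2)  # time average over chains and steps
print(second_moment)  # close to E[X^2] = 1, up to the O(h) bias discussed above
```

Running many independent chains in parallel, as the vectorized code does, mirrors the mini-batch and parallel-computing discussion in the abstract: the per-step cost is dominated by the gradient estimate, and the O(h) bias persists no matter how many chains are averaged.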
Zhennan Zhou (周珍楠) is an Associate Professor of Mathematics at Westlake University. He received his Ph.D. from the University of Wisconsin–Madison in 2014, was an Assistant Research Professor at Duke University from 2014 to 2017, joined the Beijing International Center for Mathematical Research at Peking University as an Assistant Professor in 2017, and moved in 2024 to the Institute for Theoretical Sciences and the Center for Interdisciplinary Studies at Westlake University. His research interests include applied analysis of differential equations, numerical methods for differential equations, applied stochastic analysis, and stochastic simulation, with particular attention to applied mathematical problems arising from the natural sciences. He has been selected for a national-level talent program and currently serves on the editorial boards of leading international journals including Studies in Applied Mathematics, Communications in Mathematical Sciences, and Mathematical Medicine and Biology: A Journal of the IMA.