Research Interests:

          • Applied analysis and partial differential equations
          • Probability and random dynamical systems
          • Tensors and applied algebraic geometry
          • Optimization theory and algorithms
          • Mathematics of machine learning

My Google Scholar profile

Preprints:

          • Eli Chien, Haoyu Wang, Ziang Chen, and Pan Li, Stochastic gradient Langevin unlearning. [ArXiv]
          • Ziang Chen, Jianfeng Lu, Yulong Lu, and Xiangxiong Zhang, Fully discretized Sobolev gradient flow for the Gross-Pitaevskii eigenvalue problem. [ArXiv]
          • Ziang Chen and Rong Ge, Mean-field analysis for learning subspace-sparse polynomials with Gaussian input. [ArXiv]
          • Ziang Chen, Jialin Liu, Xiaohan Chen, Xinshang Wang, and Wotao Yin, Rethinking the capacity of graph neural networks for branching strategy. [ArXiv]
          • Lisang Ding, Ziang Chen, Xinshang Wang, and Wotao Yin, Efficient algorithms for sum-of-minimum optimization. [ArXiv]
          • Eli Chien, Haoyu Wang, Ziang Chen, and Pan Li, Langevin unlearning: a new perspective of noisy gradient descent for machine unlearning. [ArXiv] Short version accepted by ICLR 2024 PML Workshop (spotlight).
          • Ziang Chen and Jianfeng Lu, Exact and efficient representation of totally anti-symmetric functions. [ArXiv]

Refereed Journal Papers:

          • Ziang Chen, Jianfeng Lu, and Anru R. Zhang, One-dimensional tensor network recovery, accepted by SIAM J. Matrix Anal. Appl. [ArXiv]
          • Ziang Chen, Jianfeng Lu, Yulong Lu, and Xiangxiong Zhang, On the convergence of Sobolev gradient flow for the Gross-Pitaevskii eigenvalue problem, SIAM J. Numer. Anal., 62(2), 667-691 (2024). [Journal] [ArXiv]
          • Chongyao Chen, Ziang Chen, and Jianfeng Lu, Representation theorem for multivariable totally symmetric functions, accepted by Commun. Math. Sci. [ArXiv]
          • Ziang Chen, Yingzhou Li, and Jianfeng Lu, On the global convergence of randomized coordinate gradient descent for nonconvex optimization, SIAM J. Optim., 33(2), 713-738 (2023). [Journal] [ArXiv]
          • Ziang Chen, Jianfeng Lu, Yulong Lu, and Shengxuan Zhou, A regularity theory for static Schrödinger equations on $\mathbb{R}^d$ in spectral Barron spaces, SIAM J. Math. Anal., 55(1), 557-570 (2023). [Journal] [ArXiv]
          • Ziang Chen, Andre Milzarek, and Zaiwen Wen, A trust-region method for nonsmooth nonconvex optimization, J. Comp. Math., 41(4), 683-716 (2023). [Journal] [ArXiv]
          • Ziang Chen, Yingzhou Li, and Jianfeng Lu, Tensor ring decomposition: optimization landscape and one-loop convergence of alternating least squares, SIAM J. Matrix Anal. Appl., 41(3), 1416-1442 (2020). [Journal] [ArXiv]

Refereed Conference Papers:

          • Ziang Chen, Jialin Liu, Xinshang Wang, Jianfeng Lu, and Wotao Yin, On representing mixed-integer linear programs by graph neural networks, ICLR 2023. [Proceedings] [ArXiv]
          • Ziang Chen, Jialin Liu, Xinshang Wang, Jianfeng Lu, and Wotao Yin, On representing linear programs by graph neural networks, ICLR 2023 (spotlight). [Proceedings] [ArXiv]
          • Ziang Chen, Jianfeng Lu, Huajie Qian, Xinshang Wang, and Wotao Yin, HeteRSGD: tackling heterogeneous sampling costs via optimal reweighted stochastic gradient descent, AISTATS 2023. [Proceedings]
          • Ziang Chen, Jianfeng Lu, and Yulong Lu, On the representation of solutions to elliptic PDEs in Barron spaces, NeurIPS 2021 (spotlight). [Proceedings] [ArXiv]

Ph.D. Dissertation:

          • Mathematical analysis of high-dimensional algorithms and models, Duke University, 2023. [DukeSpace] [ProQuest]