
Decentralized optimization on compact submanifolds: Linear consensus and gradient-type methods

Published: 2024-06-04
Speaker: 户将 (Jiang Hu)    DateTime: Thursday, June 6, 2024, 9:00-10:00 AM
Brief Introduction to Speaker:

户将 (Jiang Hu) is currently a postdoctoral researcher at the University of California, Berkeley. He received his PhD from Peking University in 2020 under the supervision of Prof. 文再文 (Zaiwen Wen). From 2021 to 2022 he was a postdoctoral researcher at The Chinese University of Hong Kong, and from March 2022 to May 2024 he held postdoctoral positions at Harvard Medical School and Massachusetts General Hospital. His main research interests include smooth and nonsmooth optimization, decentralized optimization and federated learning, and their applications in machine learning and signal processing. He has published a number of papers in journals such as the SIAM journals, JMLR, and IEEE TSP, and has published the textbooks 《最优化:建模、算法与理论》 and 《最优化计算方法》. His paper received the Best Paper Award at ICASSP 2024, IEEE's flagship signal processing conference.

Place: Tencent Meeting (Meeting ID: 860837114)
Abstract: Due to its wide-ranging applications and growing concerns over privacy and robustness, decentralized manifold optimization has captured significant attention. In this talk, we consider the problem of decentralized nonconvex optimization over a compact submanifold, where each local agent's objective function, defined by its local dataset, is smooth. First, by leveraging the powerful tool of proximal smoothness of the compact submanifold, we show the local linear convergence of the projected gradient descent method and the Riemannian gradient method, both with a unit step size, for solving the consensus problem over the compact submanifold, which plays a central role in designing and analyzing decentralized algorithms. Subsequently, we propose several decentralized gradient-type methods, including the decentralized projected Riemannian gradient descent (DPRGD), decentralized projected Riemannian gradient tracking (DPRGT), and decentralized Douglas-Rachford splitting (DDRS) methods. W...
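For readers new to the setting, below is a minimal NumPy sketch of one decentralized gradient-type iteration of the kind described above, using the Stiefel manifold St(n, p) as a concrete compact submanifold. It is an illustration only, not the speaker's implementation: the helper names proj_stiefel and dprgd_step, the ring mixing matrix, and all parameters are assumptions made for the example. Each agent averages its variable with its neighbors' (the consensus step), takes a step along the Riemannian gradient of its local objective, and projects back onto the manifold; with the local objectives set to zero, the loop reduces to the projected consensus iteration whose local linear convergence the talk analyzes.

import numpy as np

def proj_stiefel(Y):
    # Nearest point on St(n, p) = {X : X^T X = I_p} in Frobenius norm,
    # given by the polar factor U V^T of the SVD Y = U S V^T.
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def dprgd_step(X, W, grads, alpha):
    # One illustrative DPRGD-style update for N agents on St(n, p).
    #   X:     (N, n, p) array; agent i holds the manifold point X[i]
    #   W:     (N, N) doubly stochastic mixing matrix of the network
    #   grads: length-N list of Euclidean gradients of the local objectives f_i
    #   alpha: step size
    N = X.shape[0]
    X_new = np.empty_like(X)
    for i in range(N):
        # Consensus step: mix with neighbors in the ambient Euclidean space.
        mix = sum(W[i, j] * X[j] for j in range(N))
        # Riemannian gradient of f_i: project the Euclidean gradient
        # onto the tangent space of St(n, p) at X[i].
        G = grads[i]
        rgrad = G - X[i] @ ((X[i].T @ G + G.T @ X[i]) / 2)
        # Gradient step, then project back onto the submanifold.
        X_new[i] = proj_stiefel(mix - alpha * rgrad)
    return X_new

# Toy run: with f_i = 0 this reduces to the projected consensus iteration
# x_i <- P_M(sum_j W_ij x_j); the agents' disagreement shrinks (locally at
# a linear rate, per the abstract) until all local copies agree.
rng = np.random.default_rng(0)
N, n, p = 4, 8, 2
X = np.stack([proj_stiefel(rng.standard_normal((n, p))) for _ in range(N)])
# Symmetric ring topology: weight 1/2 on self, 1/4 on each neighbor.
W = 0.5 * np.eye(N) + 0.25 * (np.roll(np.eye(N), 1, 0) + np.roll(np.eye(N), -1, 0))
zero_grads = [np.zeros((n, p))] * N
for _ in range(200):
    X = dprgd_step(X, W, zero_grads, alpha=1.0)
print(max(np.linalg.norm(X[i] - X[0]) for i in range(N)))  # ~0 at consensus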