Prof. Hong-Kun Xu: Proximal Methods for Convex and Nonconvex Optimization

Dear Optimisation group members,

You are cordially invited to attend a talk by Prof. Hong-Kun Xu at RMIT next Monday 14 August at 3:30pm. Please note the unusual time. This is the same talk that was cancelled last Monday.

Speaker: Prof. Hong-Kun Xu
Hangzhou Dianzi University

Title: Proximal Methods for Convex and Nonconvex Optimization

Date and time: Monday 14 August 2017, 3:30–4:30pm (note the unusual day and time)
Location: Building 8, Level 9, Room 66 (AGR), RMIT City campus

Abstract: Proximal operators were introduced by Moreau (1962) to generalize projections in Hilbert spaces.
They have recently been found to be quite powerful for solving optimization problems arising in image recovery/reconstruction, signal processing, machine learning, and compressed sensing. A common feature of these problems is that the objective function can be written as the sum of two (or more) convex (or even nonconvex) functions, one of which plays the role of regularization.
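As a quick reminder of the standard definition behind this setting (the notation below is ours, not taken from the abstract): for a proper function f on a Hilbert space H and a parameter \lambda > 0, the proximal operator is

\[
\operatorname{prox}_{\lambda f}(x) \;=\; \arg\min_{y \in H} \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\]

and when f is the indicator function of a closed convex set C it reduces to the metric projection P_C(x), which is the sense in which proximal operators generalize projections. The composite problems mentioned above take the form \min_x\, f(x) + g(x), where g is the regularizer whose proximal operator is assumed easy to evaluate.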

In this talk, we will first focus on the convex case, where we will prove the convergence of the prox-grad algorithm, establish its sublinear convergence rate, and present its Nesterov acceleration. In the second part of the talk, we extend part of the convex results to the case of nonconvex optimization.
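For readers unfamiliar with the method, the following is a minimal Python sketch of the prox-grad iteration with optional Nesterov (FISTA-style) acceleration, applied to the standard l1-regularized least-squares example. The problem data, step-size choice, and function names are illustrative assumptions, not material from the talk.

import numpy as np

def soft_threshold(v, t):
    # prox of t*||.||_1: shrink each coordinate toward zero by t
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad(A, b, lam, n_iter=500, accelerate=True):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient steps.
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth part's gradient
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)              # gradient of the smooth part at y
        x_new = soft_threshold(y - step * grad, step * lam)  # proximal (shrinkage) step
        if accelerate:                        # Nesterov/FISTA momentum
            t_new = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
            y = x_new + ((t - 1) / t_new) * (x_new - x)
            t = t_new
        else:
            y = x_new
        x = x_new
    return x

# Example usage on a small synthetic sparse-recovery problem
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true
x_hat = prox_grad(A, b, lam=0.1)

Without acceleration this is the plain prox-grad (ISTA) scheme with its sublinear O(1/k) rate; the momentum step gives the O(1/k^2) accelerated variant discussed in the first part of the talk.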

Bio: Prof. Hong-Kun Xu is currently a Distinguished Professor at Hangzhou Dianzi University. In 2014 he was selected by the Zhejiang "1000 Talents" program. He has addressed many international conferences as an invited and keynote speaker. Xu is a winner of several awards, including the 2004 South African Mathematical Society Research Distinction. He was elected a Fellow of the Academy of Science of South Africa in 2005 and of TWAS, The World Academy of Sciences, in 2012. He has been a Thomson Reuters Highly Cited Researcher since 2013.

Download Poster
Download slides
