ProfOptimization2016

Seminar 2025-2

Organized by Glaydston Bento
------------------------------------------------------------------------------------------------------------------------------------------

The seminars will be held in the Lecture Room of IME/UFG. All those interested are very welcome to attend.

------------------------------------------------------------------------------------------------------------------------------------------

Date: August 21

Speaker: Claudemir Rodrigues Santiago

Title: On the Convergence of Proximal Gradient Methods with Explicit Linesearch for Composite Optimization Problems

Abstract: In this talk, we will present results based on the work of Bello Cruz, J. Y., and Nghia, T. T. A., On the convergence of the forward–backward splitting method with linesearches, Optimization Methods and Software, vol. 31, no. 6, 2016. DOI: 10.1080/10556788.2016.1214959. Specifically, we will discuss convergence results for one of the methods proposed in the paper, designed to solve composite optimization problems where both component functions $f$ and $g$ are convex. The method incorporates explicit linesearch procedures to remove the commonly imposed Lipschitz continuity assumption on the gradient of $f$, thereby ensuring weak convergence of the generated sequences to optimal solutions.
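
For orientation, the following minimal Python sketch implements one variant of a forward–backward step with an explicit backtracking linesearch of the kind analyzed in the paper; grad_f and prox_g are user-supplied callables, and the parameter names are illustrative rather than the paper’s notation.

import numpy as np

def fbs_linesearch(grad_f, prox_g, x0, sigma=1.0, theta=0.5, delta=0.4,
                   max_iter=500, tol=1e-8):
    # Forward-backward splitting for min f(x) + g(x), f and g convex.
    # The linesearch shrinks beta until
    #   beta * ||grad_f(z) - grad_f(x)|| <= delta * ||z - x||,
    # so no global Lipschitz constant for grad_f is required.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        beta = sigma
        z = prox_g(x - beta * g, beta)           # forward-backward trial point
        while beta * np.linalg.norm(grad_f(z) - g) > delta * np.linalg.norm(z - x):
            beta *= theta                        # backtrack the stepsize
            z = prox_g(x - beta * g, beta)
        if np.linalg.norm(z - x) <= tol:         # (near) fixed point: stop
            break
        x = z
    return x

For example, with $g = \|\cdot\|_1$ one may take prox_g(y, beta) = np.sign(y) * np.maximum(np.abs(y) - beta, 0.0), the usual soft-thresholding operator.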

------------------------------------------------------------------------------------------------------------------------------------------

Date: August 28

Speaker: Jurandir Lopes

Title: A Refined Proximal Algorithm for Nonconvex Multiobjective Optimization in Hilbert Spaces

Abstract: This paper is devoted to general nonconvex problems of multiobjective optimization in Hilbert spaces. Based on Mordukhovich’s limiting subgradients, we define a new notion of Pareto critical points for such problems, establish necessary optimality conditions for them, and then employ these conditions to develop a refined version of the vectorial proximal point algorithm together with a detailed convergence analysis. The obtained results largely extend those initiated by Bonnel, Iusem and Svaiter [SIAM J. Optim., 15 (2005), pp. 953–970] for convex vector optimization problems, specifically in the case where the codomain is an m-dimensional space, and by Bento et al. [SIAM J. Optim., 28 (2018), pp. 1104–1120] for nonconvex finite-dimensional problems in terms of Clarke’s generalized gradients.
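
For reference, the convex scheme of Bonnel et al. that the talk refines generates, from a vector function $F$, parameters $\alpha_k > 0$, and a fixed $e$ in the interior of the ordering cone, the iteration

$$x^{k+1} \in \operatorname*{argmin}\Big\{ F(x) + \tfrac{\alpha_k}{2}\,\|x - x^k\|^2\, e \;:\; F(x) \preceq F(x^k) \Big\},$$

with the argmin understood in the weak Pareto sense; the refined version discussed in the talk works instead with Pareto critical points defined via limiting subgradients, which is what makes the nonconvex Hilbert-space setting tractable.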

------------------------------------------------------------------------------------------------------------------------------------------

Date: September 04

Speaker: Orizon Ferreira

Title: On the Frank–Wolfe method for DC optimization with piecewise star-convex objectives

Abstract: We present a projection-free Frank–Wolfe method for difference-of-convex optimization with piecewise-star-convex objectives. The algorithm calls a linear minimization oracle, uses adaptive backtracking, preserves feasibility, and provides a standard Frank–Wolfe dual-gap certificate. Under mild assumptions, it matches the iteration complexity of the convex setting.
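
As a rough illustration, here is a generic Frank–Wolfe loop with backtracking in Python; the callables f, grad, and lmo, as well as the Armijo-type decrease test, are stand-ins for exposition, not the specific rules of the talk (for a DC objective $f = f_1 - f_2$, grad(x) might return a gradient of $f_1$ minus a subgradient of $f_2$).

import numpy as np

def frank_wolfe(f, grad, lmo, x0, gamma0=0.99, eta=0.5, max_iter=200, tol=1e-6):
    # lmo(g) solves the linear minimization oracle argmin_{p in C} <g, p>.
    # Feasibility is preserved: each iterate is a convex combination of
    # points of C, so no projection is ever needed.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        p = lmo(g)
        gap = float(np.dot(g, x - p))            # FW dual-gap certificate
        if gap <= tol:
            break
        d, gamma, fx = p - x, gamma0, f(x)
        # backtrack until an Armijo-type sufficient-decrease test holds
        while f(x + gamma * d) > fx - 0.5 * gamma * gap and gamma > 1e-12:
            gamma *= eta
        x = x + gamma * d
    return x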

------------------------------------------------------------------------------------------------------------------------------------------

Date: September 25

Speaker: Max Leandro

Title: Sub-sampled Trust-Region Methods with Deterministic Worst-Case Complexity

Abstract: In this talk, we discuss and analyze sub-sampled trust-region methods for solving finite-sum optimization problems. These methods employ subsampling strategies to approximate the gradient and Hessian of the objective function, significantly reducing the overall computational cost. We propose a novel adaptive procedure for deterministically adjusting the sample size used for gradient (or gradient and Hessian) approximations. Furthermore, we establish worst-case iteration complexity bounds for obtaining approximate stationary points. More specifically, for given $\varepsilon_g, \varepsilon_H \in (0,1)$, it is shown that an $\varepsilon_g$-approximate first-order stationary point is reached in at most $\mathcal{O}(\varepsilon_{g}^{-2})$ iterations, whereas an $(\varepsilon_g,\varepsilon_H)$-approximate second-order stationary point is reached in at most $\mathcal{O}(\max\{\varepsilon_{g}^{-2}\varepsilon_{H}^{-1},\varepsilon_{H}^{-3}\})$ iterations. Finally, numerical experiments illustrate the effectiveness of our new subsampling technique.
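
A schematic loop may help fix ideas. In the sketch below, a Cauchy-point model minimizer and a plain sample-size doubling rule are simplifying stand-ins for the adaptive procedure proposed in the talk; grad_s(x, idx) and hess_s(x, idx) are assumed to return the gradient and Hessian averaged over the sampled indices idx.

import numpy as np

def subsampled_tr(f, grad_s, hess_s, x0, n, delta=1.0, eta=0.1,
                  s=8, max_iter=100, eps_g=1e-4):
    # Trust-region loop for f(x) = (1/n) * sum_i f_i(x): the model uses
    # sub-sampled derivatives, while f itself is evaluated for the ratio test.
    x = np.asarray(x0, dtype=float)
    rng = np.random.default_rng(0)
    for _ in range(max_iter):
        idx = rng.choice(n, size=min(s, n), replace=False)   # draw a sub-sample
        g, H = grad_s(x, idx), hess_s(x, idx)
        gn = np.linalg.norm(g)
        if gn <= eps_g:
            break
        # Cauchy point: minimize the quadratic model along -g in the region
        t, gHg = delta / gn, g @ H @ g
        if gHg > 0:
            t = min(t, (g @ g) / gHg)
        step = -t * g
        pred = -(g @ step + 0.5 * step @ H @ step)           # model decrease
        rho = (f(x) - f(x + step)) / pred if pred > 0 else -1.0
        if rho >= eta:
            x, delta = x + step, 2.0 * delta                 # accept, expand
        else:
            delta, s = 0.5 * delta, min(2 * s, n)            # shrink, enlarge sample
    return x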

------------------------------------------------------------------------------------------------------------------------------------------

Date: October 02

Speaker: Maurício Louzeiro

Title: A Riemannian AdaGrad-Norm Method

Abstract: We propose a manifold AdaGrad-Norm method (MAdaGrad), which extends the norm version of AdaGrad (AdaGrad-Norm) to Riemannian optimization. In contrast to line-search schemes, which may require several exponential map computations per iteration, MAdaGrad requires only one. Assuming the objective function $f$ has Lipschitz continuous Riemannian gradient, we show that the method requires at most $\mathcal{O}(\varepsilon^{-2})$ iterations to compute a point $x$ such that $\|\operatorname{grad} f(x)\|\leq \varepsilon$. Under the additional assumptions that $f$ is geodesically convex and the manifold has sectional curvature bounded from below, we show that the method takes at most $\mathcal{O}(\varepsilon^{-1})$ iterations to find $x$ such that $f(x)-f_{\mathrm{low}}\leq\varepsilon$, where $f_{\mathrm{low}}$ is the optimal value. Moreover, if $f$ satisfies the Polyak–Łojasiewicz (PL) condition globally on the manifold, we establish a complexity bound of $\mathcal{O}(\log(\varepsilon^{-1}))$, provided that the norm of the initial Riemannian gradient is sufficiently large. For the manifold of symmetric positive definite matrices, we construct a family of nonconvex functions satisfying the PL condition. Numerical experiments illustrate the remarkable performance of MAdaGrad in comparison with Riemannian steepest descent equipped with Armijo line-search.
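
The core recursion of AdaGrad-Norm transplants to a manifold in a few lines. The sketch below assumes user-supplied grad (Riemannian gradient) and exp_map callables and uses the Euclidean norm of the tangent-vector representation as a stand-in for the Riemannian norm.

import numpy as np

def madagrad(grad, exp_map, x0, eta=1.0, b0=1e-8, max_iter=1000, tol=1e-6):
    # AdaGrad-Norm on a manifold: accumulate squared Riemannian gradient
    # norms in b2 and take one exponential-map step per iteration.
    x, b2 = x0, b0 ** 2
    for _ in range(max_iter):
        v = grad(x)                               # tangent vector at x
        nrm = np.linalg.norm(v)
        if nrm <= tol:
            break
        b2 += nrm ** 2                            # running sum of ||grad f||^2
        x = exp_map(x, -(eta / np.sqrt(b2)) * v)  # single exp map per step
    return x

Note that, unlike an Armijo line-search, no trial steps are taken: the single exponential map per iteration is exactly the point of the method.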

------------------------------------------------------------------------------------------------------------------------------------------

Date: October 16

Speaker: Jurandir Lopes/Glaydston Bento

Title: Necessary Conditions for Pareto Optimality in Nonconvex Multiobjective Programs

Abstract: This paper focuses on general nonconvex multiobjective optimization problems in finite-dimensional spaces. Utilizing limiting/Mordukhovich subgradients, we establish necessary optimality conditions for such problems.
Our main result is a version of the Fritz–John type conditions for weak Pareto optimal solutions proposed by Minami [J. Optim. Theory Appl., 41 (1983), pp. 451–461], now applied to nonconvex functions that are not necessarily locally Lipschitzian, namely, directionally Lipschitzian functions.
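
Schematically, a Fritz–John type condition of this kind asserts that if $\bar{x}$ is a weak Pareto optimal solution of $\min\,(f_1,\dots,f_m)$ over a set $\Omega$, then there exist multipliers $\lambda_1,\dots,\lambda_m \ge 0$, not all zero, such that

$$0 \in \sum_{i=1}^{m} \lambda_i\, \partial f_i(\bar{x}) + N(\bar{x};\Omega),$$

where $\partial$ denotes the limiting/Mordukhovich subdifferential and $N(\bar{x};\Omega)$ the corresponding normal cone; the contribution of the talk is to establish a version of this under directional Lipschitz continuity in place of local Lipschitz continuity.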

------------------------------------------------------------------------------------------------------------------------------------------

Date: October 23

Speaker: Alejandra Muñoz González

Title: Riemannian Optimization via Frank–Wolfe Methods (RFW)

Abstract: In this seminar, we present a detailed analysis of Riemannian optimization via Frank–Wolfe methods (RFW). The goal is to extend the classical Frank–Wolfe algorithm to the setting of Riemannian manifolds, making it possible to solve constrained optimization problems without the need for costly projections. We discuss the theoretical formulation of the optimization problem on manifolds and the relevant smoothness and geodesic convexity properties, and we show how the Frank–Wolfe method can be adapted to this setting. We also present an important application on the manifold of Hermitian positive definite (HPD) matrices. Beyond the theoretical formulation, we show that the proposed algorithm enjoys non-asymptotic convergence guarantees, attaining a rate of $O(1/k)$ in the g-convex case and $O(1/\sqrt{k})$ in the nonconvex case.
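
To make the adaptation concrete, a minimal sketch follows; the callables grad, log_map, geodesic, and rlmo are assumptions of this sketch, supplied by the user for the manifold at hand, and the Euclidean pairing below stands in for the Riemannian metric.

import numpy as np

def riemannian_fw(grad, log_map, geodesic, rlmo, x0, max_iter=200, tol=1e-6):
    # rlmo(x, g) solves the Riemannian linear oracle
    #   argmin_{p in C} <g, log_x(p)>,
    # and geodesic(x, p, t) returns the point a fraction t along the
    # geodesic from x to p, so iterates stay feasible without projections.
    x = x0
    for k in range(max_iter):
        g = grad(x)                              # Riemannian gradient at x
        p = rlmo(x, g)
        gap = -float(np.sum(g * log_map(x, p)))  # FW gap on the manifold
        if gap <= tol:
            break
        x = geodesic(x, p, 2.0 / (k + 2.0))      # classical FW step size
    return x
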
------------------------------------------------------------------------------------------------------------------------------------------

Date: October 30

Speaker: Layane Rodrigues

Title: 

Abstract: 

------------------------------------------------------------------------------------------------------------------------------------------

Date: November 13

Speaker: Jose Roberto Ribeiro Junior

Title: 

Abstract: 

------------------------------------------------------------------------------------------------------------------------------------------

Date: November 27

Speaker: Vilmar Gehlen Filho

Title: 

Abstract: 

------------------------------------------------------------------------------------------------------------------------------------------

Date: December 04

Speaker: Iago Victor Pires de Souza Nunes

Title: 

Abstract: