Seminar 2021.2
The seminars will be held remotely on Thursdays at 8:00 am via Google Meet, unless otherwise stated.
All interested parties are very welcome. To attend, please contact Prof. Orizon P. Ferreira by e-mail at orizon@ufg
------------------------------------------------------------------------------------------------------------------------
DATE: 09/12/2021
Title: The DC Algorithm on Hadamard manifolds
Speaker: Elianderson M. Santos (PhD student, IME/UFG)
Abstract: The DC Algorithm (DCA), originally proposed by Tao and Souad in [Algorithms for solving a class of nonconvex optimization problems. Methods of subgradients. North-Holland Mathematics Studies, vol. 129, pp. 249-271. North-Holland, Amsterdam (1986)], was the first method devoted to solving DC problems, that is, problems that require minimizing an objective function defined as a difference of two convex functions. Over the last 35 years, several versions of the DCA for solving practical problems that can be modeled as DC problems have been developed; in addition, several other methods for DC programming have emerged from the DCA ideas, e.g., the Proximal DCA (PDCA) and the Boosted DCA (BDCA). In this seminar, we present an extension of the DCA to the context of DC problems on Hadamard manifolds. We show that the method is well defined and, additionally, that every cluster point of the sequence generated by the DCA is a critical point of the problem under consideration, which recovers a property of the original DCA in Euclidean space.
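For intuition, here is a minimal sketch of the classical DCA in the Euclidean case (not the Hadamard-manifold extension presented in the talk): at each iteration the concave part h is linearized at the current point and the resulting convex subproblem is solved. The test function and the use of SciPy's general-purpose solver for the subproblem are illustrative assumptions of this sketch.

```python
import numpy as np
from scipy.optimize import minimize

def dca(g, grad_h, x0, max_iter=100, tol=1e-8):
    """Classical DCA for f = g - h: x_{k+1} in argmin_x g(x) - <grad h(x_k), x>."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        gh = grad_h(x)  # gradient of the subtracted convex part at x_k
        # Convex subproblem: h is replaced by its linearization at x_k.
        x_new = minimize(lambda y: g(y) - gh @ y, x).x
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x

# Example: f(x) = x^4 - 2x^2 with DC decomposition g(x) = x^4, h(x) = 2x^2.
x_star = dca(g=lambda y: np.sum(y**4), grad_h=lambda x: 4.0 * x, x0=[0.5])
print(x_star)  # approaches the critical point x = 1
```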
------------------------------------------------------------------------------------------------------------------------
DATE: 16/12/2021
No seminar on this date.
------------------------------------------------------------------------------------------------------------------------
DATE: 23/12/2021
Title: On the BFGS method for unconstrained multiobjective optimization
Speaker: Danilo Rodrigues de Souza (PhD student, IME/UFG)
Abstract: In this talk, we propose an extension of the BFGS method to multiobjective optimization that, like the classical method, uses a Wolfe line search to compute the step lengths. As in the scalar case, if the step lengths satisfy the multiobjective Wolfe conditions, then the proposed update remains positive definite. The method is well defined even when the objective functions are nonconvex. Under convexity assumptions, we establish superlinear convergence to Pareto optimal points. The assumptions considered are direct extensions of those used in the scalar BFGS method. We will also discuss globalization strategies for the multiobjective BFGS method in the nonconvex setting.
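As background, the sketch below shows the standard scalar-case BFGS update of the Hessian approximation; the curvature condition s^T y > 0, guaranteed by the Wolfe conditions, is what keeps the update positive definite. The multiobjective version discussed in the talk is not reproduced here.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard scalar-case BFGS update of the Hessian approximation B.

    s = x_{k+1} - x_k and y = grad f(x_{k+1}) - grad f(x_k).
    If the Wolfe conditions hold at the new point, then s @ y > 0,
    and the updated matrix stays symmetric positive definite.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)

# Example: one update starting from the identity approximation.
B = bfgs_update(np.eye(2), s=np.array([0.1, 0.0]), y=np.array([0.2, 0.1]))
```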
------------------------------------------------------------------------------------------------------------------------
DATE: 20/01/2022 (This seminar will be held online; to attend, follow the link meet.google.com/kyn-gxcm-ftj)
Title: Computing inexact K-steepest descent directions and a new line search for Vector Optimization
Speaker: Flavio P. Vieira (PhD student, IME/UFG)
Abstract: In this talk, two results are presented. In 2004, Iusem and Graña Drummond introduced the concept of $\sigma$-approximate $\mathcal{K}$-steepest descent direction. They showed that if the Cauchy direction is replaced by these directions, the convergence result for the generated sequence is the same: every accumulation point is critical. We will present an efficient procedure for computing these directions when the cone $\mathcal{K}$ is finitely generated. In 2010 and 2012, Yunda Dong introduced a new line search procedure for conjugate gradient methods that uses only first-order information, i.e., without evaluating functional values. We extended his work to vector optimization and studied conjugate gradient methods, showing convergence when the following choices of $\beta_k$ are used: Fletcher-Reeves, conjugate descent, Dai-Yuan, Polak-Ribière-Polyak, and Hestenes-Stiefel.
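For reference, these are the classical scalar-case formulas for the $\beta_k$ choices named above; the talk concerns their vector-optimization extensions. The array-based signatures are assumptions of this sketch.

```python
import numpy as np

def beta_fr(g, g_old, d_old):   # Fletcher-Reeves
    return (g @ g) / (g_old @ g_old)

def beta_cd(g, g_old, d_old):   # conjugate descent
    return -(g @ g) / (d_old @ g_old)

def beta_dy(g, g_old, d_old):   # Dai-Yuan
    return (g @ g) / (d_old @ (g - g_old))

def beta_prp(g, g_old, d_old):  # Polak-Ribiere-Polyak
    return (g @ (g - g_old)) / (g_old @ g_old)

def beta_hs(g, g_old, d_old):   # Hestenes-Stiefel
    return (g @ (g - g_old)) / (d_old @ (g - g_old))

# In all cases the next search direction is d = -g + beta * d_old.
```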
------------------------------------------------------------------------------------------------------------------------
DATE: 27/01/2022
Title: A Semilocal Convergence of a Secant–Type Method for Solving Generalized Equations
Speaker: Paulo César da Silva Júnior (PhD student, IME/UFG)
Abstract: In this seminar, the paper "A Semilocal Convergence of a Secant–Type Method for Solving Generalized Equations" (https://link.springer.com/article/10.1007/s11117-006-0044-3) will be presented.
"Abstract: In this paper we present a study of the existence and the convergence of a secant–type method for solving abstract generalized equations in Banach spaces. With different assumptions for divided differences, we obtain a procedure that has superlinear convergence. This study follows the recent results of semilocal convergence related to the resolution of nonlinear equations (see https://www.sciencedirect.com/science/article/pii/S0898122102001475)"
------------------------------------------------------------------------------------------------------------------------
Abstract: We first study properties of the intrinsic distance; for instance, we present the spectral decomposition of its Hessian. Next, we study the concept of convex sets and the intrinsic projection onto these sets. We also study the concept of convex functions and present first- and second-order characterizations of these functions, as well as some optimization concepts related to them. An extensive study of the hyperbolically convex quadratic functions is also presented.
------------------------------------------------------------------------------------------------------------------------
DATE: 17/03/2022
Title: Numerical performance of the BFGS method for multiobjective optimization
Speaker: Danilo Rodrigues de Souza (PhD student, IME/UFG)
Abstract: In this seminar, we will give a brief presentation of the BFGS method for multiobjective optimization proposed by L. F. Prudente and D. R. Souza, together with the numerical tests carried out. The main novelty of the proposed method is the use of Wolfe step lengths combined with updating the approximate Hessian at every iteration. For comparison purposes, we use a multiobjective BFGS method that takes steps satisfying the Armijo condition and updates the approximate Hessian only when a cautious condition is satisfied. For this same method, we also propose the use of Wolfe step lengths, thus obtaining a third method. The numerical tests were carried out with these three methods, and the results show that using Wolfe step lengths with the approximate Hessian updated at every iteration provides a potential advantage over the other two.
------------------------------------------------------------------------------------------------------------------------
DATE: 24/03/2022 (This seminar will exceptionally be at 10:00 am)
Title: Computing inexact K-steepest descent directions and a new line search procedure for Vector Optimization
Speaker: Flavio P. Vieira (PhD student, IME/UFG)
Abstract: In this work, we propose a new line search and a procedure for computing $\sigma$-approximate directions. In 2010 and 2012, Yunda Dong introduced a new line search procedure for conjugate gradient methods that uses only first-order information, i.e., without evaluating functional values. We extended his work to vector optimization and studied conjugate gradient methods, showing convergence when the following choices of $\beta_k$ are used: Fletcher-Reeves, conjugate descent, Dai-Yuan, Polak-Ribière-Polyak, and Hestenes-Stiefel. We also use this line search in the gradient method, showing its convergence. In 2004, Iusem and Graña Drummond introduced the concept of $\sigma$-approximate $\mathcal{K}$-steepest descent direction. They showed that if the Cauchy direction is replaced by these directions, the convergence result for the generated sequence is the same: every accumulation point is critical. We will present an efficient procedure for computing these directions when the cone $\mathcal{K}$ is finitely generated.
------------------------------------------------------------------------------------------------------------------------
DATE: 31/03/2022
Title: On the computation of approximated steepest descent direction in vector optimization
Speaker: Prof. Luis Roman Lucâmbio Perez (IME-UFG)
Abstract: When classical optimization procedures are extended to the context of vector optimization, one of the main issues is how to efficiently compute the direction along which the line search will be performed. This talk addresses that issue. Approximate steepest descent directions can be computed via a convex inclusion problem, which can be solved by minimizing a convex quadratic function over a simplex. The proposed procedure has finite termination.
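A minimal sketch of this reduction in the usual multiobjective setting with the Euclidean norm: the steepest descent direction is d = -J^T lambda*, where the rows of J are the objective gradients and lambda* minimizes a convex quadratic over the unit simplex. The use of SciPy's SLSQP solver here is an illustrative choice, not the finite-termination procedure of the talk.

```python
import numpy as np
from scipy.optimize import minimize

def steepest_descent_direction(J):
    """Rows of J are the gradients grad f_i(x); returns d = -J^T lambda*."""
    m = J.shape[0]
    lam0 = np.full(m, 1.0 / m)  # start at the barycenter of the simplex
    res = minimize(
        lambda lam: 0.5 * np.sum((J.T @ lam) ** 2),  # 0.5 * ||J^T lam||^2
        lam0,
        method="SLSQP",
        bounds=[(0.0, 1.0)] * m,
        constraints=[{"type": "eq", "fun": lambda lam: np.sum(lam) - 1.0}],
    )
    return -J.T @ res.x

# Example with two objectives in R^2 whose gradients are e_1 and e_2:
print(steepest_descent_direction(np.eye(2)))  # approximately [-0.5, -0.5]
```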
------------------------------------------------------------------------------------------------------------------------
DATE: 07/04/2022
Title: Cancelled
Speaker: Leandro da Fonseca Prudente (IME/UFG)
Abstract:
------------------------------------------------------------------------------------------------------------------------
DATE: 14/04/2022
Title: Postponed; this will be the first seminar of next semester
Speaker: Paulo César da Silva Júnior (PhD student, IME/UFG)
Abstract:
------------------------------------------------------------------------------------------------------------------------
09/12 Elianderson
16/12 No seminar on this date.
23/12 Danilo
Year 2022 (seminars preferably in person)
20/01 Flavio
27/01 Paulo
03/02 Fernando (online)
10/02 Glaydston
17/02 Jeffeson
24/02 Fabiano Boaventura de Miranda -UEG-Anápolis
10/03 Ademir
17/03 Danilo
24/03 Flavio
31/03 Luís
07/04 Leandro
14/04 Paulo