Seminar 2019.1

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Organized by Luis Román Lucambio Pérez
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------

The seminars this semester will be held in the Lecture Room of IME/UFG, unless otherwise stated. Anyone interested is very welcome to attend.

------------------------------------------------------------------------------------------------------------------------------------------

Date: March 21

Speaker: Prof. Reinier Diaz Millan, IFG

Title:  Comparing Averaged Relaxed Cutters and Projection Methods: Theory and Examples.

Abstract: We focus on the convergence analysis of averaged relaxations of cutters, specifically for variants that, depending upon how the parameters are chosen, resemble alternating projections, the Douglas--Rachford method, relaxed reflect-reflect, or the Peaceman--Rachford method. Such methods are frequently used to solve convex feasibility problems. The standard convergence analysis of projection algorithms is based on the firm nonexpansivity property of the relevant operators. However, if the projections onto the constraint sets are replaced by cutters (which may be thought of as maps that project onto separating hyperplanes), the firm nonexpansivity is lost. We provide a proof of convergence for a family of related averaged relaxed cutter methods under reasonable assumptions, relying on a simple geometric argument. This allows us to clarify fine details related to the allowable choice of the relaxation parameters, highlighting the distinction between the exact (firmly nonexpansive) and approximate (strongly quasi-nonexpansive) settings. We provide illustrative examples and discuss practical implementations of the method.
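
As a minimal illustration of the relaxation structure (a toy example of ours, not the speaker's implementation): the Python sketch below runs relaxed alternating projections for two Euclidean balls; the cutter variants of the talk would replace the exact projections by projections onto separating hyperplanes.

    import numpy as np

    def proj_ball(x, center, radius):
        """Exact projection onto a Euclidean ball."""
        d = x - center
        n = np.linalg.norm(d)
        return x if n <= radius else center + radius * d / n

    def relaxed_alternating_projections(x, alpha=1.0, iters=100):
        """Relaxed alternating projections for two balls (illustration only).
        alpha = 1 gives classical alternating projections; other values in
        (0, 2) give relaxed variants."""
        A = (np.array([0.0, 0.0]), 1.0)  # ball 1: center, radius
        B = (np.array([1.5, 0.0]), 1.0)  # ball 2: center, radius
        for _ in range(iters):
            x = x + alpha * (proj_ball(x, *A) - x)  # relaxed step toward ball 1
            x = x + alpha * (proj_ball(x, *B) - x)  # relaxed step toward ball 2
        return x

    print(relaxed_alternating_projections(np.array([3.0, 2.0])))  # lands in the intersection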

_______________________________________________________________________

Date: March 28

Speaker: Ademir Alves Aguiar (PhD student-IME/UFG)

Title: Subgradient method with feasible inexact projections for constrained convex optimization

Abstract: We will present a new method that combines the $\epsilon$-subgradient method with a procedure for obtaining feasible inexact projections, in order to find solutions of constrained convex optimization problems.
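
As a rough sketch of the ingredients (illustrative only, not the speaker's method): a projected subgradient iteration for $f(x) = \|x\|_1$ over a Euclidean ball, where the exact projection below stands in for the feasible inexact projection of the talk.

    import numpy as np

    def subgrad_l1(x):
        """A subgradient of f(x) = ||x||_1 (the sign vector; 0 is valid at zeros)."""
        return np.sign(x)

    def proj_ball(y, radius=1.0):
        """Exact projection onto the Euclidean ball; in the talk's setting any
        feasible point sufficiently close to this would be acceptable."""
        n = np.linalg.norm(y)
        return y if n <= radius else radius * y / n

    x = np.array([2.0, -3.0])
    for k in range(1, 200):
        x = proj_ball(x - (1.0 / k) * subgrad_l1(x))  # diminishing step, then project

    print(x)  # approaches the origin, the minimizer of ||x||_1 over the ball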

__________________________________________________________________

Date: April 4

Speaker: Ray Victor Guimarães Serra (PhD student-IME/UFG)

Title: Full convergence of a proximal gradient method for convex composite multiobjective optimization.

Abstract: Consider a multiobjective optimization problem whose objective is split into two summands: one is continuously differentiable and convex, and the other is convex, closed, and proper, that is, possibly non-differentiable. We will present a proximal gradient method to solve it, show that the sequence generated by the algorithm converges to a weak Pareto point, and establish an iteration-complexity bound for obtaining an approximate weak Pareto point of the aforementioned problem.
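
For orientation, in the scalar-valued case the method reduces to the classical proximal gradient (forward-backward) iteration; writing the smooth summand as $g$ and the nonsmooth one as $h$ (our notation), the step is

$$x^{k+1} = \operatorname{prox}_{\lambda h}\bigl(x^k - \lambda \nabla g(x^k)\bigr), \qquad \operatorname{prox}_{\lambda h}(y) = \operatorname*{argmin}_x \Bigl\{ h(x) + \tfrac{1}{2\lambda}\|x - y\|^2 \Bigr\}.$$

The multiobjective version discussed in the talk replaces this step by a subproblem involving all objectives at once.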

______________________________________________________________________

Date: April 11

Speaker: Tiago da Costa Menezes (PhD student-IME/UFG)

Title:  On the analysis of a Gauss-Newton method with approximate projections

Abstract: In this talk we will discuss an algorithm based on the Gauss-Newton method with a nonmonotone step search technique for solving constrained nonlinear least squares problems. Convergence, convergence-rate results, and the numerical performance of the method will also be discussed.
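
For background, the classical (unconstrained, full-step) Gauss-Newton iteration that the talk builds on solves the normal equations of the linearized residual at each step; a toy Python sketch with a residual function of our own choosing:

    import numpy as np

    def F(x):
        """Toy residual for the least squares problem min (1/2)||F(x)||^2."""
        return np.array([x[0] ** 2 + x[1] - 1.0, x[0] - x[1]])

    def J(x):
        """Jacobian of F at x."""
        return np.array([[2.0 * x[0], 1.0], [1.0, -1.0]])

    x = np.array([2.0, 2.0])
    for _ in range(20):
        Jx, Fx = J(x), F(x)
        d = np.linalg.solve(Jx.T @ Jx, -Jx.T @ Fx)  # Gauss-Newton direction
        x = x + d  # full step; the talk adds constraints and a nonmonotone search

    print(x, F(x))  # the residual should be near zero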

_______________________________________________________________________

Date: April 18

Speaker: Pedro Bonfim de Assunção Filho (PhD student-IME/UFG)

Title: Simplified versions of the conditional gradient method

Abstract: In this talk we will present the paper by I. V. Konnov, "Simplified versions of the conditional gradient method," arXiv preprint arXiv:1801.05251 (2018). The paper presents modifications of the conditional gradient method for smooth optimization problems; the modifications maintain the basic convergence properties of the conditional gradient method while reducing the implementation cost of each iteration.
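
For reference, the baseline method these modifications start from is the classical conditional gradient (Frank-Wolfe) iteration; a Python sketch on an illustrative simplex-constrained problem of our own choosing:

    import numpy as np

    def conditional_gradient(grad, lin_min, x, iters=200):
        """Classical conditional gradient (Frank-Wolfe) iteration.
        lin_min(g) minimizes <g, s> over the feasible set; 2/(k+2) is the
        standard open-loop step length."""
        for k in range(iters):
            s = lin_min(grad(x))  # linear minimization oracle
            x = x + (2.0 / (k + 2)) * (s - x)
        return x

    # Example: minimize ||x - c||^2 over the unit simplex.
    c = np.array([0.2, 0.5, 0.9])
    grad = lambda x: 2.0 * (x - c)
    lin_min = lambda g: np.eye(len(g))[np.argmin(g)]  # best simplex vertex

    print(conditional_gradient(grad, lin_min, np.array([1.0, 0.0, 0.0])))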

________________________________________________________________________

Date: April 25

Speaker: Flávio Pinto Vieira (PhD student-IME/UFG)

Title: New step lengths in conjugate gradient methods

Abstract: We will present a proposal of Yuan Dong, published as "New step lengths in conjugate gradient methods" in Computers and Mathematics with Applications 60 (2010). The author suggested new step-length computations for conjugate gradient algorithms that use only gradient information, avoiding evaluations of the objective function. Some convergence results were presented, in particular under Lipschitz continuity of the gradient and for the Fletcher-Reeves parameter.
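
As background (Dong's new step lengths are not reproduced here), a baseline Fletcher-Reeves iteration on a convex quadratic of our own choosing, where the exact step length happens to be computable from gradient and matrix information alone:

    import numpy as np

    # f(x) = 0.5 x^T A x - b^T x, with gradient g(x) = A x - b.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])

    x = np.zeros(2)
    g = A @ x - b
    d = -g
    for _ in range(10):
        if np.linalg.norm(g) < 1e-12:
            break
        alpha = -(g @ d) / (d @ (A @ d))  # exact step length for a quadratic
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves parameter
        d = -g_new + beta * d
        g = g_new

    print(x, np.linalg.norm(A @ x - b))  # x should solve Ax = b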

________________________________________________________________________

Date:  May 9

Speaker:  Ademir Alves Aguiar (PhD student-IME/UFG)

Title: Inexactly projected gradient algorithm

Abstract: We will present a variant of the projected gradient method for constrained convex optimization problems using feasible inexact projections.

________________________________________________________________________

Date: May 16

Speaker: Prof. Leandro Prudente, IME/UFG

Title: On the extension of the Hager-Zhang conjugate gradient method for vector optimization

Abstract: The extension of the Hager-Zhang (HZ) nonlinear conjugate gradient method for vector optimization is discussed in this talk. In the scalar minimization case, this method generates descent directions whenever, for example, the line search satisfies the standard Wolfe conditions. We first show that, in general, the direct extension of the HZ method for vector optimization does not yield descent (in the vector sense) even when an exact line search is performed. By using a sufficiently accurate line search, we then propose a self-adjusting HZ method which possesses the descent property. The proposed HZ method with suitable parameters reduces to the classical one in the scalar minimization case. Global convergence of the new scheme is proved without regular restarts or any convexity assumption. Finally, numerical experiments illustrating the practical behavior of the approach are presented, and comparisons with the Hestenes-Stiefel conjugate gradient method are discussed.
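
For reference, the scalar Hager-Zhang direction being extended is, as we recall it from the scalar literature,

$$d_{k+1} = -g_{k+1} + \beta_k^{HZ} d_k, \qquad \beta_k^{HZ} = \frac{1}{d_k^{\top} y_k}\Bigl(y_k - 2\,\frac{\|y_k\|^2}{d_k^{\top} y_k}\, d_k\Bigr)^{\top} g_{k+1}, \qquad y_k = g_{k+1} - g_k,$$

where $g_k$ denotes the gradient at the $k$-th iterate.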

________________________________________________________________________

Date: May 30

Speaker: Fernando Lima (PhD student-IME/UFG)

Title: Nonmonotone line searches for unconstrained multiobjective optimization problems

Abstract: In this work, we consider nonmonotone line searches, i.e., we allow an increase of the objective function values at some iterations. Two types of nonmonotone line searches are considered here: one takes the maximum of recent function values, and the other takes their average. Under reasonable assumptions, we prove that every accumulation point of the sequence produced by the nonmonotone versions of the steepest descent and Newton methods is Pareto critical.
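
A scalar-objective Python sketch of the max-type search (the averaging variant replaces max(hist) by a weighted running average); the test function and constants below are illustrative only, not taken from the talk:

    import numpy as np

    def nonmonotone_armijo(f, g, x, d, hist, sigma=1e-4, tau=0.5):
        """Backtracking Armijo test against the max of recent f-values."""
        ref = max(hist)  # nonmonotone reference value
        t = 1.0
        while f(x + t * d) > ref + sigma * t * (g @ d):
            t *= tau
        return t

    f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
    grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])

    x, M = np.array([5.0, 5.0]), 5
    hist = [f(x)]
    for _ in range(50):
        g = grad(x)
        d = -g  # steepest descent direction
        t = nonmonotone_armijo(f, g, x, d, hist)
        x = x + t * d
        hist = (hist + [f(x)])[-M:]  # keep the last M function values

    print(x)  # approaches the minimizer (1, -2)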

________________________________________________________________________

Date: June 06

Speaker: Ruby Yohana Cuero Zuñiga (MSc student-IME/UFG)

Title: Generalized Convex Quadratic Functions.

Abstract: In this work, we will study a class of real symmetric matrices that we will call subdefinite; these matrices include the positive and negative semidefinite ones. For our purpose we will focus on the merely positive subdefinite matrices, that is, those that are positive subdefinite but not positive semidefinite. We will treat quadratic functions with nonnegative variables and show that these functions are quasiconvex but not convex when their matrix representation is given by a merely positive subdefinite matrix. In addition, we will present a result of great importance in quadratic programming, since it allows one to reduce the quasiconvexity of these nonconvex quadratic functions to pseudoconvexity on the semipositive orthant. Finally, we will study the conditional gradient method for solving the quadratic programming problem whose objective function is of this type.
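
A numerical illustration (the matrix below is what we believe is a standard merely positive subdefinite example, and the sampling check is ours): $q(x) = x^{\top} A x = -2x_1x_2$ is not convex, yet sampled segments in the nonnegative orthant satisfy the quasiconvexity inequality $q(\lambda x + (1-\lambda)y) \le \max\{q(x), q(y)\}$.

    import numpy as np

    rng = np.random.default_rng(0)

    # Indefinite (eigenvalues +1 and -1), hence not positive semidefinite:
    A = np.array([[0.0, -1.0], [-1.0, 0.0]])
    q = lambda x: x @ A @ x  # q(x) = -2 x1 x2

    ok = True
    for _ in range(10000):
        x, y = rng.uniform(0.0, 5.0, 2), rng.uniform(0.0, 5.0, 2)
        lam = rng.uniform()
        if q(lam * x + (1 - lam) * y) > max(q(x), q(y)) + 1e-12:
            ok = False
    print(ok)  # expected True: q is quasiconvex on the nonnegative orthant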

________________________________________________________________________

Date: June 13

Speaker: Tiago da Costa Menezes (PhD student-IME/UFG)

Title:  Nonmonotone spectral projected gradient methods on convex sets.

Abstract: In this talk we will present the paper: Birgin, E. G., Martínez, J. M., Raydan, M., "Nonmonotone spectral projected gradient methods on convex sets," SIAM Journal on Optimization 10(4) (2000), 1196-1211. Our main objective will be to present an extension of the classical projected gradient method.
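
A stripped-down sketch of the spectral projected gradient idea (Barzilai-Borwein spectral step plus projection; the paper's nonmonotone line search safeguard is omitted for brevity, and the box-constrained quadratic is our own toy example):

    import numpy as np

    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([4.0, 3.0])
    grad = lambda x: A @ x - b  # gradient of 0.5 x^T A x - b^T x
    proj = lambda x: np.clip(x, 0.0, 1.0)  # projection onto the box [0,1]^2

    x = proj(np.array([5.0, -5.0]))
    g = grad(x)
    lam = 1.0
    for _ in range(100):
        x_new = proj(x - lam * g)  # spectral projected step
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 0:
            lam = (s @ s) / (s @ y)  # Barzilai-Borwein spectral step length
        x, g = x_new, g_new

    print(x)  # the minimizer over the box (here (1, 1))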

_______________________________________________________________________

Date: June 27

Speaker:  Ray Victor Guimarães Serra (PhD student-IME/UFG)

Title: An inertial proximal method for finding zeros of maximal monotone operators.

Abstract: In this talk, we will present the paper: Alvarez, F., Attouch, H., "An inertial proximal method for maximal monotone operators via discretization of a nonlinear oscillator with damping," Set-Valued Analysis 9 (2001), no. 1-2, 3-11. The paper analyzes the asymptotic convergence of an implicit iterative method for finding zeros of maximal monotone operators in a Hilbert space, a method which generalizes the classical proximal point method.
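
The iteration analyzed there is $x_{k+1} = (I + \lambda_k A)^{-1}\bigl(x_k + \alpha_k (x_k - x_{k-1})\bigr)$; a Python sketch for the simple choice $A = \partial|\cdot|$, whose resolvent is soft thresholding (the parameter values are ours):

    import numpy as np

    def soft_threshold(y, lam):
        """Resolvent (I + lam A)^{-1} for A = subdifferential of |x|."""
        return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

    lam, alpha = 0.5, 0.3  # alpha < 1/3, as we recall the paper's sufficient condition
    x_prev, x = 4.0, 4.0
    for _ in range(30):
        x_prev, x = x, soft_threshold(x + alpha * (x - x_prev), lam)

    print(x)  # converges to 0, the unique zero of the operator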

_______________________________________________________________________

Date: July 04

Speaker:  Ruby Yohana Cuero Zuñiga (PhD student-IME/UFG)

Title: Conditional gradient method and its application in quadratic programming.

Abstract:  In this talk we will present the conditional gradient method, study its convergence properties and use it to solve a quadratic programming problem whose objective function is pseudoconvex on the semipositive orthant.

_______________________________________________________________________

Date: July 11

Speaker: Pedro Bonfim de Assunção Filho (PhD student-IME/UFG)

Title: Conditional gradient method.

Abstract: In this talk we present some new results about the conditional gradient algorithm for minimizing a differentiable convex function with Lipschitz continuous gradient on a compact convex set.
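
For context, the classical complexity bound in this setting, with the standard step length $2/(k+2)$, is usually stated as

$$f(x_k) - f^{*} \le \frac{2 L D^{2}}{k + 2},$$

where $L$ is the Lipschitz constant of the gradient and $D$ is the diameter of the compact feasible set.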