Abstracts and Talk Materials
Algebraic Algorithms in Optimization
January 12-13, 2007


Hartwig Bosse - Center for Mathematics and Computer Science (CWI)

Demonstration of software
January 13, 2007 10:20 am - 11:10 am


Jesus De Loera - University of California, Davis
http://www.math.ucdavis.edu/~deloera/

Generating functions for integer optimization (part I)
January 13, 2007 11:20 am - 12:10 pm

Barvinok's algorithm for counting
  1. Original approach
  2. Homogenized algorithm
  3. Integer linear programming in fixed dimension
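A brief sketch of the object behind this outline (notation chosen here for illustration, not taken from the talk materials): for a rational polyhedron P in R^n, the lattice points of P are encoded in the generating function

\[ f(P;x) \;=\; \sum_{\alpha \in P \cap \mathbb{Z}^n} x^{\alpha}, \qquad x^{\alpha} = x_1^{\alpha_1} \cdots x_n^{\alpha_n}. \]

Barvinok's theorem states that, for fixed dimension n, this sum admits a representation as a short sum of rational functions,

\[ f(P;x) \;=\; \sum_{i \in I} \epsilon_i \, \frac{x^{a_i}}{\prod_{j=1}^{n} \bigl(1 - x^{b_{ij}}\bigr)}, \qquad \epsilon_i \in \{\pm 1\}, \]

computable in polynomial time; evaluating it (with care, because of the poles) at x = (1, ..., 1) yields the number of lattice points of P.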

 

Generating functions for integer optimization (part I) (continued)
January 13, 2007 1:30 pm - 2:20 pm


 

Generating functions for integer optimization (part II)
January 13, 2007 3:00 pm - 3:50 pm

Barvinok's algorithm for counting
  1. Generating functions and non-linear integer optimization
  2. Other rational functions (Vergne, Lasserre, Nesterov)
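A hedged illustration of how the counting machinery feeds into optimization (item 1 above); the notation continues the sketch given for part I and is not quoted from the talk: to maximize c·α over P ∩ Z^n one may substitute x_i = t^{c_i} into the generating function,

\[ f(P;\, t^{c_1}, \dots, t^{c_n}) \;=\; \sum_{\alpha \in P \cap \mathbb{Z}^n} t^{\,c \cdot \alpha}, \]

so the optimal value is the largest exponent of t appearing with a nonzero coefficient; it can be recovered, for instance, by binary search on k combined with counting the lattice points satisfying c·α ≥ k.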

 

Demo and introduction to LattE
January 13, 2007 4:00 pm - 5:00 pm
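For readers who want to follow along with the demo, a minimal sketch of the classical LattE input convention (the file name is hypothetical, and the exact format should be checked against the LattE manual): a polytope {x : Ax <= b} with m inequalities in n variables is described by a file whose first line is "m n+1" and whose rows list b_i followed by -a_i. For the unit square 0 <= x_1, x_2 <= 1 the file square.latte would read

  4 3
  1 -1  0
  1  0 -1
  0  1  0
  0  0  1

and running "count square.latte" should report its 4 lattice points.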


Jean Lasserre - Centre National de la Recherche Scientifique (CNRS)
http://www.laas.fr/~lasserre/

Optimization over polynomials with moment matrices and sums of squares

Remarks: Due to visa problems, Monique Laurent was not able to attend the tutorial. Jean Bernard Lasserre substituted for her.


January 12, 2007 2:40 pm - 3:30 pm

We consider the problem (P) of minimizing a polynomial function over a semialgebraic set defined by polynomial inequalities and equations. This is a hard problem, which includes well known NP-hard instances such as 0/1 linear programming, the partition problem for integer sequences, or matrix copositivity.

While it is hard to test whether a polynomial is nonnegative, one can test efficiently whether it can be written as a sum of squares of polynomials using semidefinite programming. Based on this paradigm, one can formulate tractable semidefinite programming relaxations for (P) by replacing the hard 'nonnegativity' condition by the tractable 'sum of squares' condition. The corresponding dual semidefinite programs involve positive semidefinite moment matrices, which reflects the classical duality theory between positive polynomials and moment sequences.
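As a concrete rendering of this paradigm (notation introduced here only for the sketch): write problem (P) as

\[ p^{*} \;=\; \min_{x \in \mathbb{R}^n} f(x) \quad \text{s.t.} \quad g_1(x) \ge 0, \;\dots,\; g_m(x) \ge 0, \]

with equations handled as pairs of inequalities. A polynomial p of degree 2t is a sum of squares if and only if

\[ p(x) \;=\; z_t(x)^{\mathsf T} Q \, z_t(x) \quad \text{for some } Q \succeq 0, \]

where z_t(x) is the vector of all monomials of degree at most t; comparing coefficients on both sides gives linear constraints on the entries of Q, so the existence of such a Q is a semidefinite feasibility problem.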

The objective of this tutorial is to present in detail the main properties of these semidefinite programming relaxations: asymptotic/finite convergence, optimality certificates, and extraction of globally optimal solutions for (P), and to review the underlying mathematical tools: representation theorems for positive polynomials from real algebraic geometry, results about the truncated moment problem, and the algebraic eigenvalue method for solving systems of polynomial equations. These characteristic features are implemented in GloptiPoly, a solver for polynomial optimization developed by Henrion and Lasserre, and will be demonstrated on examples. Additional topics that may be covered if time allows include: various algebraic approaches to unconstrained polynomial minimization, links to combinatorial methods for 0/1 polynomial optimization, and techniques for exploiting symmetry, sparsity, etc.
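For orientation, a minimal sketch of the order-t relaxation underlying the tutorial (the notation continues the sketch above and is not quoted from the talk): for a sequence y = (y_α) indexed by monomials, let L_y(∑_α p_α x^α) = ∑_α p_α y_α. The moment relaxation of order t reads

\[ \rho_t \;=\; \min_{y} \; L_y(f) \quad \text{s.t.} \quad y_0 = 1, \;\; M_t(y) \succeq 0, \;\; M_{t-d_j}(g_j\,y) \succeq 0 \;\; (j = 1, \dots, m), \]

where M_t(y) = (y_{\alpha+\beta})_{|\alpha|,|\beta| \le t} is the truncated moment matrix, M_{t-d_j}(g_j y) the localizing matrix of g_j, and d_j = \lceil \deg(g_j)/2 \rceil. This yields a nondecreasing sequence of lower bounds ρ_t ≤ p^*, whose convergence, optimality certificates, and minimizer extraction (via a rank condition on the moment matrix) are the topics listed above.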

 

Optimization over polynomials with moment matrices and sums of squares (continued)
January 13, 2007 9:00 am - 9:50 am


Edwin O'Shea - University of Kentucky

Gröbner basis methods in integer programming (Lecture Part I)
January 12, 2007 8:40 am - 9:30 am

Part I
  1. Linear programming and triangulations.
  2. The integer program and toric ideals.
  3. Test sets, Gröbner bases and initial ideals.
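A short illustration of the objects in items 2 and 3 (formulation chosen here for concreteness): for A ∈ Z^{d×n}, the toric ideal is

\[ I_A \;=\; \bigl\langle\, x^{u} - x^{v} \;:\; u, v \in \mathbb{N}^{n}, \; Au = Av \,\bigr\rangle \;\subseteq\; k[x_1, \dots, x_n], \]

and for the integer program min{ c·x : Ax = b, x ∈ N^n } the reduced Gröbner basis of I_A with respect to a term order refining the cost c serves as a test set: reducing the monomial x^α of a feasible point α to its normal form modulo this basis yields x^{α*} for an optimal solution α* in the same fiber.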

 

Hands-on exercises assisted by Tristram Bogart (Tutorial Part I)
January 12, 2007 10:00 am - 10:50 am


 

Gröbner basis methods in integer programming (Lecture Part II)
January 12, 2007 11:00 am - 11:50 am

Part II
  1. The Gröbner fan.
  2. Total dual integrality.
  3. Group relaxations.
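A hedged reminder of what item 3 refers to (the formulation below is the classical Gomory version, written out here only for illustration): for min{ c·x : Ax = b, x ∈ N^n } and an optimal LP basis B with nonbasic part N, the group relaxation keeps integrality but drops nonnegativity of the basic variables,

\[ \min \; \bar c_N^{\mathsf T} x_N \quad \text{s.t.} \quad B^{-1} N\, x_N \;\equiv\; B^{-1} b \pmod{\mathbb{Z}^{d}}, \quad x_N \in \mathbb{N}^{|N|}, \]

where \bar c_N is the vector of reduced costs and d the number of rows of A; adding back the constant c_B^{\mathsf T} B^{-1} b gives a lower bound on the integer program, which is exact whenever the basic variables recovered from an optimal x_N turn out to be nonnegative.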

 

Hands-on exercises assisted by Tristram Bogart (Tutorial Part II)
January 12, 2007 1:30 pm - 2:20 pm

