Polynomial optimization considers optimization problems defined by polynomials. In contrast to classical nonlinear optimization, it aims at finding global optimizers. Tensors are natural higher-order generalizations of matrices and are closely related to polynomials and moments. Polynomial optimization techniques are powerful tools for studying tensors, and many tensor problems can be formulated as polynomial optimization problems.
We propose a complete semidefinite relaxation algorithm for detecting the copositivity of a symmetric tensor. We show that, for every symmetric tensor, the detection can be done by solving a finite number of semidefinite relaxations.
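As a flavor of how such detections are posed as semidefinite programs, the following is a minimal sketch of the simplest sufficient test in the matrix (order-2) case: a symmetric matrix A is copositive if it decomposes as A = P + N with P positive semidefinite and N entrywise nonnegative (a test known to be exact for n <= 4). This is only an illustration of the semidefinite programming formulation, not the complete relaxation hierarchy described above; the cvxpy modeling is an assumed implementation choice.

```python
import numpy as np
import cvxpy as cp

def spn_copositivity_test(A):
    """Sufficient SDP test for copositivity of a symmetric matrix A:
    feasibility of A = P + N with P PSD and N entrywise nonnegative."""
    n = A.shape[0]
    P = cp.Variable((n, n), PSD=True)      # positive semidefinite part
    N = cp.Variable((n, n), nonneg=True)   # entrywise nonnegative part
    prob = cp.Problem(cp.Minimize(0), [A == P + N])
    prob.solve()
    return prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)

# Example: the identity matrix is copositive.
print(spn_copositivity_test(np.eye(3)))  # True
```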
For the saddle point problem of polynomials, we give an algorithm for computing saddle points. We show that: i) if a saddle point exists, our algorithm can compute one by solving a finite number of Lasserre-type semidefinite relaxations; ii) if there is no saddle point, our algorithm can detect its nonexistence.
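For concreteness, a saddle point (x*, y*) of f satisfies f(x*, y) <= f(x*, y*) <= f(x, y*) for all feasible x and y. The snippet below numerically checks these two inequalities for the toy polynomial f(x, y) = x^2 - y^2, whose saddle point is (0, 0); the example polynomial and the grid are illustrative only and are not the algorithm described above.

```python
import numpy as np

# Toy polynomial with saddle point (0, 0): f(0, y) <= f(0, 0) <= f(x, 0).
f = lambda x, y: x**2 - y**2
grid = np.linspace(-1.0, 1.0, 201)

assert np.all(f(0.0, grid) <= f(0.0, 0.0) + 1e-12)  # maximal in y at y = 0
assert np.all(f(grid, 0.0) >= f(0.0, 0.0) - 1e-12)  # minimal in x at x = 0
print("saddle point inequalities hold on the grid")
```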
Hermitian tensors are generalizations of Hermitian matrices, but they have very different properties. We study canonical basis Hermitian tensors, real Hermitian tensors, special matrix flattenings, positive semidefiniteness, and separability. We further study how to detect the separability of Hermitian tensors. We formulate the detection as a truncated moment problem and then provide a semidefinite relaxation algorithm to solve it.
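To illustrate the objects involved, a separable Hermitian tensor is a sum of rank-one terms of the form u u* (x) v v*. The sketch below builds such a tensor as a 4-way array and checks the Hermitian symmetry H[a,b,c,d] = conj(H[b,a,d,c]); the dimensions and random construction are assumptions for illustration only, and deciding separability of a given tensor is the harder problem addressed by the algorithm above.

```python
import numpy as np

# Build a separable Hermitian tensor H = sum_i u_i u_i^* (x) v_i v_i^*.
rng = np.random.default_rng(0)
m, n, r = 2, 3, 2  # illustrative dimensions and number of rank-one terms
H = np.zeros((m, m, n, n), dtype=complex)
for _ in range(r):
    u = rng.normal(size=m) + 1j * rng.normal(size=m)
    v = rng.normal(size=n) + 1j * rng.normal(size=n)
    H += np.einsum('a,b,c,d->abcd', u, u.conj(), v, v.conj())

# Hermitian symmetry in the index pairs (a, b) and (c, d).
assert np.allclose(H, H.transpose(1, 0, 3, 2).conj())
```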
The problem of learning diagonal Gaussian mixture models can be formulated as computing incomplete symmetric tensor decompositions. We use generating polynomials to compute incomplete symmetric tensor decompositions and approximations, and the resulting tensor approximation is then used to learn diagonal Gaussian mixture models. When the first- and third-order moments are sufficiently accurate, we show that the obtained parameters of the Gaussian mixture model are also highly accurate.
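The connection to incomplete tensors comes from a standard moment identity: for a diagonal Gaussian mixture, the third-order moment E[x_a x_b x_c] with distinct indices a, b, c depends only on the weights and means, E[x_a x_b x_c] = sum_i w_i mu_i[a] mu_i[b] mu_i[c], since distinct coordinates are independent within each component. Only these off-diagonal entries of the moment tensor are structurally simple, which is exactly an incomplete symmetric tensor. Below is a Monte Carlo check of this identity; the mixture parameters are illustrative assumptions chosen for the example.

```python
import numpy as np

# Illustrative diagonal Gaussian mixture: weights w, means mu, diagonal stds sig.
rng = np.random.default_rng(1)
w = np.array([0.4, 0.6])
mu = np.array([[1.0, -2.0,  0.5],
               [0.3,  1.5, -1.0]])
sig = np.array([[0.5, 1.0, 0.8],
                [1.2, 0.4, 0.9]])

# Sample from the mixture.
N = 1_000_000
comp = rng.choice(2, size=N, p=w)
x = mu[comp] + sig[comp] * rng.normal(size=(N, 3))

# E[x_0 x_1 x_2] should match sum_i w_i mu_i[0] mu_i[1] mu_i[2]
# because, within each component, distinct coordinates are independent.
empirical = np.mean(x[:, 0] * x[:, 1] * x[:, 2])
exact = np.sum(w * mu[:, 0] * mu[:, 1] * mu[:, 2])
print(empirical, exact)  # the two values agree up to sampling error
```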