Abstract
We present a modified version of the perceptron learning algorithm (PLA) which solves semidefinite programs (SDPs) in polynomial time. The algorithm is based on the following three observations: (i) Semidefinite programs are linear programs with infinitely many (linear) constraints; (ii) every linear program can be solved by a sequence of constraint satisfaction problems with linear constraints; (iii) in general, the perceptron learning algorithm solves a constraint satisfaction problem with linear constraints in finitely many updates. Combining the PLA with a probabilistic rescaling algorithm (which, on average, increases the size of the feasible region) results in a probabilistic algorithm for solving SDPs that runs in polynomial time. We present preliminary results which demonstrate that the algorithm works, but is not competitive with state-of-the-art interior point methods.
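Observation (iii) — that the perceptron learning algorithm finds a point satisfying a finite set of homogeneous linear constraints in finitely many updates — can be sketched as follows. This is a minimal NumPy illustration of the classic PLA feasibility step, not the paper's full SDP algorithm; the function name and example constraints are illustrative assumptions.

```python
import numpy as np

def perceptron_feasibility(A, max_updates=10000):
    """Find x with A @ x > 0 for all rows of A (homogeneous linear
    feasibility), using perceptron updates: whenever a constraint is
    violated, add its normal vector to x. Returns None if no feasible
    point is found within max_updates. Illustrative sketch only."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(max_updates):
        margins = A @ x
        violated = np.where(margins <= 0)[0]
        if violated.size == 0:
            return x  # all constraints strictly satisfied
        x = x + A[violated[0]]  # classic perceptron update
    return None

# Example: three constraints all satisfied by directions near (1, 1)
A = np.array([[1.0, 0.2],
              [0.3, 1.0],
              [0.9, 0.9]])
x = perceptron_feasibility(A)
```

The number of updates is bounded in terms of the margin of the feasible region, which is why the rescaling step (enlarging the feasible region on average) is needed to obtain a polynomial-time algorithm.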
| Original language | English |
| --- | --- |
| Title of host publication | Advances in Neural Information Processing Systems 16 |
| Editors | Sebastian Thrun, Lawrence Saul, Bernhard Schölkopf |
| Publisher | MIT Press |
| Pages | 457-464 |
| Number of pages | 8 |
| Publication status | Published (in print/issue) - 2004 |
| Event | International Conference on Neural Information Processing Systems - Vancouver, Canada; 9 Dec 2003 → 14 Dec 2003; Conference number: 16 |
Conference
| Conference | International Conference on Neural Information Processing Systems |
| --- | --- |
| Abbreviated title | NIPS |
| Country/Territory | Canada |
| City | Vancouver |
| Period | 9/12/03 → 14/12/03 |
Keywords
- semidefinite programming
- perceptron learning
- optimization
- probabilistic algorithm
- MAXCUT