Parametric Polynomial Time Perceptron Rescaling Algorithm

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Let us consider a linear feasibility problem with a possibly infinite number of inequality constraints, posed in an online setting: an algorithm suggests a candidate solution, and an oracle either confirms its feasibility or outputs a violated constraint vector. This model can be solved by subgradient optimisation algorithms for non-smooth functions, known in the machine learning community as perceptron algorithms, and its solvability depends on the problem dimension and the radius of the constraint set. The classical perceptron algorithm may have exponential complexity in the worst case, when the radius is infinitesimal [1]. To overcome this difficulty, the space dilation technique was exploited in the ellipsoid algorithm to make its running time polynomial [3]. A special case of space dilation, the rescaling procedure, is utilised in the perceptron rescaling algorithm [2] with a probabilistic approach to choosing the direction of dilation. A parametric version of the perceptron rescaling algorithm is the focus of this work. It is demonstrated that certain fixed parameters of the latter algorithm (the initial estimate of the radius and the relaxation parameter) may be modified and adapted for particular problems. The generalised theoretical framework allows one to determine convergence of the algorithm for any chosen set of values of these parameters, and suggests a potential way of decreasing the complexity of the algorithm, which remains the subject of current research.
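The oracle model and the rescaling idea described above can be sketched in a few lines. The following is a minimal illustrative sketch, not the paper's algorithm: it runs the classical perceptron update against an oracle for a homogeneous feasibility problem (find x with a·x > 0 for every constraint a), and, if an inner phase fails, applies a Dunagan–Vempala-style dilation along the current iterate. The function name, loop bounds, and the specific rescaling transform are assumptions made for this sketch.

```python
import numpy as np

def perceptron_rescaling(oracle, dim, max_outer=20, max_inner=1000):
    """Illustrative sketch of a perceptron loop with a rescaling step.

    `oracle(y)` returns a violated constraint vector a (with a @ y <= 0),
    or None when y is strictly feasible.  All names and parameters here
    are hypothetical, for illustration only.
    """
    B = np.eye(dim)  # accumulated space-dilation transform
    for _ in range(max_outer):
        x = np.zeros(dim)
        for _ in range(max_inner):
            a = oracle(B @ x)            # query the oracle in the original space
            if a is None:
                return B @ x             # feasible point found
            a = B.T @ a                  # pull the constraint back to rescaled space
            x += a / np.linalg.norm(a)   # classical perceptron (subgradient) update
        if np.linalg.norm(x) > 0:
            u = x / np.linalg.norm(x)
            # Dilate the space along the current direction u (doubles that component),
            # a simplified stand-in for the probabilistic rescaling procedure of [2].
            B = B @ (np.eye(dim) + np.outer(u, u))
    return None  # no feasible point found within the iteration budget
```

For a well-conditioned constraint set the inner perceptron loop already succeeds, and the rescaling branch is never reached; its purpose is to enlarge an infinitesimal feasibility radius so that the inner loop terminates in polynomially many steps.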
Original language: Undefined
Title of host publication: Algorithms and Complexity in Durham 2006: Proceedings of the Second ACiD Workshop
Editors: Hajo Broersma, Stefan Dantchev, Matthew Johnson, Stefan Szeider
Number of pages: 1
Publication status: Published (in print/issue) - 2006

Bibliographical note

18–20 September 2006. Commentary on: Texts in Algorithmics 7


  • linear programming
  • perceptron algorithm
  • subgradient descent method
  • online learning
  • oracle algorithm
  • space dilation
