Abstract:
Shape-constrained problems in statistics have attracted much attention in recent decades. One of them is the task of finding the best-fitting monotone regression. The problem of constructing a monotone regression (also called isotonic regression) is to find the best-fitting non-decreasing vector for a given vector. Convex regression is the extension of monotone regression to the case of $2$-monotonicity (i.e. convexity). Both isotonic and convex regression have applications in many fields, including non-parametric mathematical statistics and empirical data smoothing. The paper proposes an iterative algorithm for constructing a sparse convex regression, i.e. for finding a convex vector $z\in \mathbb{R}^n$ with the lowest squared error of approximation to a given vector $y\in \mathbb{R}^n$ (not necessarily convex). The problem can be rewritten as a convex programming problem with linear constraints. Using the Karush–Kuhn–Tucker optimality conditions, it is proved that the optimal points lie on a piecewise linear function. It is proved that the proposed dual active-set algorithm for convex regression has polynomial complexity and attains the optimal solution (i.e. the Karush–Kuhn–Tucker conditions are fulfilled).
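As a sketch of the formulation referred to above, using the abstract's $y$ and $z$ together with standard discrete second-difference constraints (the exact constraint form used in the paper is assumed, not quoted), the convex regression problem can be written as
\[
\min_{z \in \mathbb{R}^n} \; \sum_{i=1}^{n} (z_i - y_i)^2
\quad \text{subject to} \quad
z_{i-1} - 2 z_i + z_{i+1} \ge 0, \qquad i = 2, \dots, n-1,
\]
where the linear constraints require the second differences of $z$ to be non-negative, i.e. $z$ to be a convex vector.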
Keywords: dual active-set algorithm, pool-adjacent-violators algorithm, isotonic regression, monotone regression, convex regression.