The problem of detecting and removing redundant constraints is fundamental in optimization. We focus on linear programs (LPs) in normal form with \(d\) variables and \(n\) inequality constraints. A constraint is called redundant if removing it leaves the feasible region of the LP unchanged. The fastest known method for detecting all redundancies is Clarkson's: it solves \(n\) linear programs, but each of them has at most \(s\) constraints, where \(s\) is the number of nonredundant constraints.
In this talk, we study the special case where every constraint has at most two variables with nonzero coefficients. This family, denoted by LI(2), has some nice properties. Namely, as shown by Aspvall and Shiloach, given a variable \(x_i\) and a value \(c\), we can test in time \(O(nd)\) whether there is a feasible solution with \(x_i = c\).
Hochbaum and Naor present an \(O(nd^2 \log n)\) algorithm for solving feasibility (and finding a solution) in LI(2). Their technique makes use of the Fourier-Motzkin elimination method and the aforementioned result by Aspvall and Shiloach.
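As a rough illustration of the elimination ingredient (not the Hochbaum–Naor algorithm itself), a single Fourier–Motzkin step can be sketched as follows. A constraint is a coefficient list plus a right-hand side, meaning \(a \cdot x \le b\); eliminating variable \(j\) combines every upper bound on \(x_j\) with every lower bound.

```python
def fm_eliminate(constraints, j):
    """One Fourier-Motzkin step: eliminate variable j from a system of
    linear inequalities given as (coeffs, rhs) pairs meaning a.x <= b."""
    pos, neg, rest = [], [], []
    for a, b in constraints:
        (pos if a[j] > 0 else neg if a[j] < 0 else rest).append((a, b))
    out = list(rest)
    # Each pair (upper bound, lower bound) on x_j yields one new
    # inequality in which the coefficient of x_j cancels to zero.
    for ap, bp in pos:
        for an, bn in neg:
            s, t = ap[j], -an[j]      # positive scaling factors
            a = [t * ap[k] + s * an[k] for k in range(len(ap))]
            out.append((a, t * bp + s * bn))
    return out

# x + y <= 4,  -x + y <= 2,  -x <= 0  (i.e. x >= 0)
sys2 = [([1, 1], 4), ([-1, 1], 2), ([-1, 0], 0)]
for a, b in fm_eliminate(sys2, 0):
    print(a, "<=", b)      # 2y <= 6 and y <= 4
```

Note that a single step can square the number of constraints, which is why plain Fourier–Motzkin is not polynomial in general; controlling this growth in the LI(2) setting is a key point of Hochbaum and Naor's technique.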
We present a strongly polynomial algorithm that solves redundancy detection in LI(2) in time \(O(nd^2 s \log s)\). It combines a modified version of Clarkson's algorithm with a revised version of Hochbaum and Naor's technique.