Convex Programming in Modern Fortran

Following the short discussion here about linear programming, i.e. solving convex optimization problems of the form

$$\begin{aligned} \mathrm{maximize} & \quad c^T x \\ \mathrm{subject~to} & \quad Ax = b \\ & \quad x \geq 0, \end{aligned}$$

and realizing that I could actually use such a solver for one of my research problems, I’ve decided to bite the bullet and start LightConvex, a lightweight package written in Modern Fortran for convex programming. While I have already modernized QuadProg for dense strictly convex quadratic programming, I’d like LightConvex to be more general (i.e. not just QP, not just dense problems, and not just strictly convex problems).
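
For readers who want to play with a problem in this standard equality form right away, here is a small, self-contained Python example using scipy's `linprog` (the data is made up for illustration; `linprog` minimizes, so the objective is negated to maximize):

```python
import numpy as np
from scipy.optimize import linprog

# Standard equality form:  maximize c^T x  s.t.  A x = b,  x >= 0.
# linprog minimizes, so we pass -c to maximize c^T x.
c = np.array([3.0, 2.0, 0.0])    # objective coefficients
A = np.array([[1.0, 1.0, 1.0]])  # single equality constraint: x1 + x2 + x3 = 4
b = np.array([4.0])

res = linprog(-c, A_eq=A, b_eq=b, bounds=[(0, None)] * 3, method="highs")
print(res.x)     # optimal vertex
print(-res.fun)  # maximum of c^T x
```

Since the feasible region is a simplex, the optimum sits at a vertex, which is exactly what a simplex-type method returns.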

There are fantastic packages available in other languages. The ones I’m most familiar with are those from the cvx family, e.g. cvxpy in Python or Convex.jl in Julia. Both provide a complete domain-specific language for disciplined convex programming, enabling the user to define the problem in a language very close to the actual mathematical one. The problem is then pre-processed behind the scenes into a standard conic form, which can be passed to various high-quality convex optimizers (e.g. Clarabel.jl or SCS, to name only the open-source ones). As much as I’d like to see an equivalent in Fortran, my goal (for now) with LightConvex is not to replicate these features. Instead, I would like to focus on the most common problems (i.e. linear and quadratic programs) and provide a decent selection of high-quality (and ideally high-performance) specialized solvers written in pure Modern Fortran. In that regard, LightConvex would be closer to HiGHS, which is now used behind the scenes in scipy to actually solve linear programs.

I’ve actually just started working on this new package, so it currently provides only the strict minimum (i.e. the standard simplex method for dense linear programs); what’s in the repo is hardly the embryo of a prototype. I figured it was better to announce it sooner rather than later: to probe the community on how useful such a package would be, to get suggestions and feedback as early as possible so as to make it as easy to use as possible, and to see whether other convex enthusiasts with the know-how would be interested in helping develop it. Here is a tentative list of features I’d like to see included before officially releasing a first version of this prototype:

  • Support for dense LP
    • Standard primal simplex algorithm
    • Standard dual simplex algorithm
    • Revised dual simplex algorithm
  • Support for sparse LP
    • Primal simplex algorithm
    • Revised dual simplex algorithm
    • Primal Affine Scaling
  • High-level interfaces
    • problem = LP(c, A, b) for dense and sparse LP.
    • solution = solve(problem, alg=some_alg(options...)) for dense and sparse LP.
  • Preprocessing
    • Conversion of any LP into standard equality form
    • Idiot Crash Algorithm
  • Utilities
    • (Compressed) MPS file reader
  • Examples
    • Max Flow / Min Cut
  • Documentation
    • In-code documentation
    • Online documentation using FORD
  • Continuous integration and documentation
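
As a concrete illustration of the "conversion of any LP into standard equality form" preprocessing item above, inequality constraints are turned into equalities by appending one non-negative slack variable per row. A minimal NumPy sketch (the function name and example data are mine, not LightConvex's actual interface; free variables and ≥ rows would need extra handling):

```python
import numpy as np

def to_standard_equality_form(c, A_ub, b_ub):
    """Convert   max c^T x   s.t.  A_ub x <= b_ub,  x >= 0
    into        max c_eq^T y s.t.  A_eq y  = b_ub,  y >= 0
    by appending one non-negative slack variable per inequality.
    (Illustrative sketch only, not the LightConvex implementation.)"""
    m, n = A_ub.shape
    A_eq = np.hstack([A_ub, np.eye(m)])      # [A | I]: slack columns
    c_eq = np.concatenate([c, np.zeros(m)])  # slacks do not enter the objective
    return c_eq, A_eq, b_ub

c = np.array([1.0, 2.0])
A_ub = np.array([[1.0, 1.0], [2.0, 0.5]])
b_ub = np.array([4.0, 3.0])
c_eq, A_eq, b_eq = to_standard_equality_form(c, A_ub, b_ub)
print(A_eq)  # 2 x 4 matrix [A | I]
```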

For the sake of simplicity, I plan to focus first on linear programs (because this is what I currently need) and to support only double precision arithmetic for now (this can easily be extended later on using fypp). The current choice of algorithms is semi-arbitrary and mostly comprises those I am somewhat familiar with. Anyone more knowledgeable with LP is more than welcome to give their two cents (or even better, contribute :slight_smile: )!


Hi, out of curiosity, are you planning on a parallel (MPI) implementation? There is a lack of distributed convex programming codes.


In an ideal world, I would say yes, for sure. Open-source, easy-to-use parallelized convex solvers are indeed missing. LightConvex is however at a very, very early stage, besides being only a side project for me at the moment. My two most important short-term goals are:

  1. Focus on dense linear programs and make sure the implementations of the algorithms are both very clean (i.e. easy to read for someone with a minimal background in LP) and perform fairly well.
  2. Hone the design of the high-level interfaces to ensure two things:
    a. Make it easy to use for most cases yet sufficiently flexible to support as many algorithms as possible, e.g. a solution = solve(problem, alg=some_alg(options...)) API has my preference at the moment.
    b. Make sure the design is good enough and well thought out enough to limit the risk of a bad design decision biting us in the ass later on.
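
As a language-agnostic illustration of that solve(problem, alg=some_alg(options...)) shape, here is a Python sketch of the dispatch pattern (all names are hypothetical, not the actual LightConvex interface; the solver body simply delegates to scipy's HiGHS backend as a stand-in):

```python
import numpy as np
from dataclasses import dataclass
from scipy.optimize import linprog

@dataclass
class LP:
    """Dense LP in standard equality form: maximize c^T x s.t. Ax = b, x >= 0."""
    c: np.ndarray
    A: np.ndarray
    b: np.ndarray

@dataclass
class simplex:
    """Plays the role of some_alg(options...): the algorithm object carries its own options."""
    max_iter: int = 1000
    tol: float = 1e-9

def solve(problem, alg):
    # Dispatch on the algorithm object; each solver only sees its own options.
    if isinstance(alg, simplex):
        # Stand-in backend for illustration, not an actual simplex implementation.
        return linprog(-problem.c, A_eq=problem.A, b_eq=problem.b,
                       bounds=[(0, None)] * problem.c.size, method="highs")
    raise NotImplementedError(f"no solver registered for {type(alg).__name__}")

problem = LP(c=np.array([3.0, 2.0, 0.0]),
             A=np.array([[1.0, 1.0, 1.0]]),
             b=np.array([4.0]))
solution = solve(problem, alg=simplex(tol=1e-10))
print(solution.x, -solution.fun)
```

The appeal of this shape is that adding a new algorithm only requires a new options type plus a dispatch branch, without touching the problem definition or the user-facing call.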

Medium-term goals are the support for sparse LP (serial implementations only). I can’t really tell what short-term and medium-term imply in real life, as I have relatively little time to work on this project (in particular during the Fall semester, when I have most of my teaching load). Additionally, parallelization really makes sense only for problems relying on sparse linear algebra. I plan to rely as much as possible on stdlib, so even the development of the serial sparse solvers will actually depend on how far the support for sparse linear algebra has come in stdlib.

In any case, the short answer as of today is thus “Sure! But not in the near future”. For the moment, my development plan is pretty linear, focusing only on the serial implementations, and goes roughly as follows:

Dense LP solvers --> Sparse LP solvers --> Dense QP solvers --> Sparse QP solvers --> ?

How long it’ll take and what would come next, I have no idea (even more so while working alone on this). In the meantime, if you really need a parallelized solver, I would probably go for an ADMM approach, which (if my understanding is correct) is naturally well-suited for distributed optimization and shouldn’t be too complicated to implement in parallel in practice. Here’s a C implementation specialized for the LASSO problem.
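
To make the ADMM suggestion concrete, here is a minimal serial NumPy sketch for the LASSO problem min_x 0.5‖Ax − b‖² + λ‖x‖₁, using the standard splitting (the x-update is a linear solve, the z-update a soft-threshold); ρ and the iteration count are arbitrary choices of mine, and the test data is synthetic:

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of k*||.||_1, applied elementwise."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def lasso_admm(A, b, lam, rho=1.0, n_iter=300):
    """ADMM sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1 (serial)."""
    m, n = A.shape
    AtA, Atb = A.T @ A, A.T @ b
    # Factor once; each iteration's x-update is then two triangular solves.
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    z = np.zeros(n)
    u = np.zeros(n)
    for _ in range(n_iter):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))  # x-update
        z = soft_threshold(x + u, lam / rho)               # z-update (l1 prox)
        u = u + x - z                                      # dual (scaled) update
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = lasso_admm(A, b, lam=0.1)
print(np.round(x_hat, 2))  # sparse estimate close to x_true
```

In a distributed setting, the data (rows of A) would be split across ranks, each rank would carry its own local x-update, and only the z/u consensus variables would need to be communicated, which is why ADMM maps so naturally onto MPI.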
