Can anyone show me examples of or reviews for constrained nonlinear optimization in Microsoft Solver Foundation 3.0? How does it compare to Matlab's fmincon? Or is there anything better?
I have recently ported Michael Powell's derivative-free codes COBYLA2 (non-linear objective function, non-linear constraints) and BOBYQA (non-linear objective function, variable bounds) to C#. When the optimization problem only contains variable bounds, the BOBYQA algorithm is substantially faster.
I have open-sourced both codes; you can find them on Github: cscobyla and csbobyqa.
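To give a flavor of how the COBYLA-style interface works, here is a rough C# sketch: the callback receives x and must return the objective value and fill the constraint array, with a point being feasible when every constraint entry is non-negative. The namespace, the `Cobyla.FindMinimum` entry point and its parameter order below may not match the current code exactly, so please check the repo README for the actual signature.

```csharp
using System;
using Cureos.Numerics;   // namespace assumed; see the cscobyla repo

class CobylaSketch
{
    // COBYLA-style callback: given x, compute the objective value f and fill
    // the constraint array con; a point is feasible when every con[i] >= 0.
    static void Calcfc(int n, int m, double[] x, out double f, double[] con)
    {
        // Objective: minimize (x0 - 1)^2 + (x1 - 2)^2.
        f = Math.Pow(x[0] - 1.0, 2) + Math.Pow(x[1] - 2.0, 2);
        // Nonlinear constraint x0^2 + x1^2 <= 4, written as 4 - x0^2 - x1^2 >= 0.
        con[0] = 4.0 - x[0] * x[0] - x[1] * x[1];
    }

    static void Main()
    {
        var x = new[] { 0.0, 0.0 };   // starting point, updated in place
        // Assumed parameter order: n, m, x, rhobeg, rhoend, iprint, maxfun.
        Cobyla.FindMinimum(Calcfc, 2, 1, x, 0.5, 1.0e-6, 0, 3500);
        Console.WriteLine("x = ({0}, {1})", x[0], x[1]);
    }
}
```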
If you prefer a derivative-based algorithm, I have also implemented an adapter to IPOPT. It is called csipopt and can be obtained from Github as well.
There is no Solver Foundation interface developed for any of these algorithms, and I cannot say how well they compare with fmincon (I am not a Matlab user myself) but hopefully the codes can be of some help in your optimization work.
I don't have much experience with Microsoft Solver Foundation myself, but there is a nice article that demonstrates how to use it from F#:
For F#, there is also an embedded modeling language (the ODSL): you write your constraints as ordinary F# expressions (wrapped in quotations), and the interpreter for this language calls Microsoft Solver Foundation with the corresponding constraints (I think this is totally awesome!):
IMPORTANT UPDATE on Feb 25, 2012:
MSF 3.1 now supports nonlinear optimization with bounded variables via its NelderMeadSolver: http://msdn.microsoft.com/en-us/library/hh404037(v=vs.93).aspx
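Driving the NelderMeadSolver directly looks roughly like the sketch below (loosely modeled on the MSDN example linked above; member names and signatures should be verified against the actual API):

```csharp
using System;
using Microsoft.SolverFoundation.Solvers;

class NelderMeadSketch
{
    static void Main()
    {
        var solver = new NelderMeadSolver();

        // One row for the goal, marked as a minimization goal.
        int goalRow;
        solver.AddRow(null, out goalRow);
        solver.AddGoal(goalRow, 0, true);

        // Two bounded decision variables.
        int x1, x2;
        solver.AddVariable(null, out x1);
        solver.AddVariable(null, out x2);
        solver.SetBounds(x1, -10, 10);
        solver.SetBounds(x2, -10, 10);

        // The objective is supplied as a callback over the current variable values.
        solver.FunctionEvaluator = (model, rowVid, values, newValues) =>
            Math.Pow(values[x1] - 1.0, 2) + Math.Pow(values[x2] - 2.0, 2);

        var solution = solver.Solve(new NelderMeadSolverParams());
        Console.WriteLine(solution.Result);
        Console.WriteLine("x1 = {0}, x2 = {1}",
            solver.GetValue(x1).ToDouble(), solver.GetValue(x2).ToDouble());
    }
}
```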
For general linear constraints, Microsoft Solver Foundation only supports linear programming and quadratic programming, via its interior point solver. For this solver, please see the SVM post mentioned by Tomas.
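For example, a small QP with a linear constraint can be expressed through Solver Foundation Services and will be handled by that interior point solver. This is a minimal sketch of my own, not taken from the SVM post:

```csharp
using System;
using Microsoft.SolverFoundation.Services;

class QpSketch
{
    static void Main()
    {
        var context = SolverContext.GetContext();
        var model = context.CreateModel();

        // Two real-valued decision variables.
        var x = new Decision(Domain.Real, "x");
        var y = new Decision(Domain.Real, "y");
        model.AddDecisions(x, y);

        // One linear constraint and a convex quadratic goal -- this LP/QP
        // class of models is what the interior point solver handles.
        model.AddConstraint("budget", x + 2 * y >= 4);
        model.AddGoal("objective", GoalKind.Minimize, x * x + y * y);

        var solution = context.Solve();
        Console.WriteLine(solution.Quality);
        Console.WriteLine("x = {0}, y = {1}", x.ToDouble(), y.ToDouble());
    }
}
```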
MSF also has a general nonlinear programming solver, a limited-memory BFGS (L-BFGS) implementation; however, it does not support any constraints, and it requires an explicit gradient function. For this solver, please see:
Logistic regression in F# using MSF
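In C#, using that L-BFGS solver (CompactQuasiNewtonSolver) looks roughly like the following; note the explicit gradient callback. The member names are from memory and should be checked against the MSF documentation:

```csharp
using System;
using Microsoft.SolverFoundation.Solvers;

class LbfgsSketch
{
    static void Main()
    {
        var solver = new CompactQuasiNewtonSolver();

        int x, y, goalRow;
        solver.AddVariable(null, out x);
        solver.AddVariable(null, out y);
        solver.AddRow(null, out goalRow);
        solver.AddGoal(goalRow, 0, true);   // minimize the goal row

        // Objective value: the Rosenbrock function.
        solver.FunctionEvaluator = (model, rowVid, values, newValues) =>
        {
            double a = values[x], b = values[y];
            return Math.Pow(1 - a, 2) + 100 * Math.Pow(b - a * a, 2);
        };

        // Explicit gradient, written into the last argument.
        solver.GradientEvaluator = (model, rowVid, values, newValues, gradient) =>
        {
            double a = values[x], b = values[y];
            gradient[x] = -2 * (1 - a) - 400 * a * (b - a * a);
            gradient[y] = 200 * (b - a * a);
        };

        var solution = solver.Solve(new CompactQuasiNewtonSolverParams());
        Console.WriteLine(solution.Result);
        Console.WriteLine("x = {0}, y = {1}",
            solver.GetValue(x).ToDouble(), solver.GetValue(y).ToDouble());
    }
}
```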
The F# ODSL mentioned by Tomas only supports linear programming. I have a QP extension for it, available on CodePlex.
Back to your question - optimizing f(x) with linear constraints (similar to fmincon): I haven't seen any free library with this capability. NMath.NET (commercial) seems to have one; I tried it for a highly nonlinear optimization, but it did not work for me. In the end I resorted to the bounded L-BFGS (L-BFGS-B) implemented in DotNumerics.
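For reference, calling the bounded L-BFGS in DotNumerics looks roughly like this; the OptBoundVariable constructor arguments and the ComputeMin overload are from memory and may differ slightly from the current DotNumerics API:

```csharp
using System;
using DotNumerics.Optimization;

class DotNumericsSketch
{
    static void Main()
    {
        var optimizer = new L_BFGS_B();

        // Bounded variables: name, initial guess, lower bound, upper bound
        // (assumed constructor order).
        var variables = new[]
        {
            new OptBoundVariable("x1", 0.0, -5.0, 5.0),
            new OptBoundVariable("x2", 0.0, -5.0, 5.0)
        };

        // Objective and its analytic gradient.
        double[] best = optimizer.ComputeMin(
            p => Math.Pow(p[0] - 1, 2) + Math.Pow(p[1] - 2, 2),
            p => new[] { 2 * (p[0] - 1), 2 * (p[1] - 2) },
            variables);

        Console.WriteLine("x = ({0}, {1})", best[0], best[1]);
    }
}
```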
I think you will also be interested in the following SO question:
Open source alternative to MATLAB's fmincon function?
The answers point to SciPy.optimize.cobyla, which seems to be similar to fmincon. But the main message is that, for your specific problem, fmincon may be too general; you can use a more specific solver, e.g. L-BFGS or QP. Also, general solvers sometimes do not work if your initial value is not good.
I realize this is an old question, but the answers here are inaccurate and/or out of date. Here is the definitive tutorial on how to use the constrained nonlinear solver in MSSF:
This example uses the default nonlinear solver, which is called the HybridLocalSearchSolver.
(However, I am not familiar with fmincon, so I can't speak to that.)
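As a hedged sketch of what driving the HybridLocalSearchSolver directly can look like, the code below assumes it follows the same INonlinearSolver pattern as the NelderMeadSolver above; the constraint handling via row bounds and the parameter class name are assumptions, so check the tutorial for the exact calls:

```csharp
using System;
using Microsoft.SolverFoundation.Solvers;

class HybridLocalSearchSketch
{
    static void Main()
    {
        var solver = new HybridLocalSearchSolver();

        int x, y, goalRow, constraintRow;
        solver.AddVariable(null, out x);
        solver.AddVariable(null, out y);
        solver.SetBounds(x, -5, 5);
        solver.SetBounds(y, -5, 5);

        // Goal row: the objective to minimize.
        solver.AddRow(null, out goalRow);
        solver.AddGoal(goalRow, 0, true);

        // Constraint row: x^2 + y^2 <= 4, expressed as bounds on the row value.
        solver.AddRow(null, out constraintRow);
        solver.SetBounds(constraintRow, 0, 4);

        // A single evaluator serves every row; dispatch on the row id.
        solver.FunctionEvaluator = (model, rowVid, values, newValues) =>
        {
            double a = values[x], b = values[y];
            return rowVid == goalRow
                ? Math.Pow(a - 1, 2) + Math.Pow(b - 2, 2)   // objective
                : a * a + b * b;                            // constraint body
        };

        var solution = solver.Solve(new HybridLocalSearchParameters());   // parameter class name assumed
        Console.WriteLine(solution.Result);
        Console.WriteLine("x = {0}, y = {1}",
            solver.GetValue(x).ToDouble(), solver.GetValue(y).ToDouble());
    }
}
```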