Q: fmincon vs fminsearch

Those two functions are very different. fminsearch implements the Nelder-Mead simplex algorithm, which is derivative-free: it never needs the gradient of your error function. It's very flexible, can even handle discontinuous error functions, and tends to give good results without a lot of setup.

On the other hand, it can't handle explicit bound constraints, and it becomes inefficient as the number of parameters grows.
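
A minimal sketch of how little setup fminsearch needs, using the classic Rosenbrock function as a stand-in objective (the objective and starting point are illustrative, not from the question):

    rosen = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;  % stand-in error function
    x0 = [-1.2, 1];                            % starting guess
    opts = optimset('Display', 'final');       % fminsearch takes optimset options
    [xmin, fval] = fminsearch(rosen, x0, opts);
    % xmin should land near [1, 1], where fval = 0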

fmincon implements a variety of constrained optimization algorithms (interior-point, SQP, active-set, and trust-region-reflective), most of which are extensions of Newton's method. It takes more preparation, but generally speaking the final results are very accurate. If you can compute an analytical gradient for your error function, these methods are also very fast and efficient. If you can't, they will still work, but much less efficiently, because they must approximate the gradient with finite differences. See the sketch below.
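
Here is a minimal sketch of that setup: the same Rosenbrock objective with an analytic gradient supplied to fmincon, plus simple bound constraints (the bounds and the interior-point choice are illustrative assumptions):

    x0 = [-1.2, 1];
    lb = [-2, -2];  ub = [2, 2];               % illustrative bound constraints
    opts = optimoptions('fmincon', ...
        'Algorithm', 'interior-point', ...
        'SpecifyObjectiveGradient', true);     % we supply the gradient ourselves
    [xmin, fval] = fmincon(@rosenWithGrad, x0, ...
        [], [], [], [], lb, ub, [], opts);     % no linear or nonlinear constraints

    function [f, g] = rosenWithGrad(x)
        % Rosenbrock objective; returns the gradient only when the solver asks
        f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
        if nargout > 1
            g = [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
                  200*(x(2) - x(1)^2)];
        end
    end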

The real equivalent of fminsearch for gradient-aware optimization is fminunc, which handles unconstrained problems using quasi-Newton and trust-region algorithms, i.e. Newton's method and extensions of it.
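
A sketch of the unconstrained analogue, reusing the rosenWithGrad helper from the fmincon example above (the trust-region algorithm requires a gradient):

    opts = optimoptions('fminunc', ...
        'Algorithm', 'trust-region', ...       % Newton-type steps; needs a gradient
        'SpecifyObjectiveGradient', true);
    [xmin, fval] = fminunc(@rosenWithGrad, [-1.2, 1], opts);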

All nonlinear optimization requires a decent starting point (unless the problem is convex). Local minima can always be a problem, but some reasonable effort spent computing a starting guess, or trying several as sketched below, will usually fix that issue.
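
One common remedy is a simple multistart loop: run the solver from several starting points and keep the best result. A minimal sketch (the search box and the number of restarts are arbitrary assumptions):

    rosen = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
    rng(0);                                    % reproducible random starts
    bestF = Inf;  bestX = [];
    for k = 1:20
        x0 = 4*rand(1, 2) - 2;                 % random start in the box [-2, 2]^2
        [x, f] = fminsearch(rosen, x0);
        if f < bestF
            bestF = f;  bestX = x;             % keep the best local minimum so far
        end
    end
    disp(bestX)                                % best minimizer found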