I’m working with the following scipy code.
import numpy as np
from scipy.optimize import basinhopping
n_iter = 100 
@np.vectorize
def f(x):
    return (x * np.sin(x) + 2 * x) ** 2
x0 = -6
minimizer_kwargs = {"method": "BFGS"}
ret = basinhopping(f, x0, minimizer_kwargs=minimizer_kwargs, niter=n_iter)
print("global minimum: x = %.4f, f(x0) = %.4f" % (ret.x, ret.fun))
The global minimum of this function is at x = 0, but this isn't what basin hopping returns. Depending on the start position x0, it returns different local minima, not the global one at 0. If we set x0 = -6, it returns a minimum near -7.7; if we set x0 = 1, it returns the minimum at 0; and so on.
Why is it not returning the global minimum? Why does it return the local minimum closest to its start position?
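The landscape is easy to see by plotting f (a minimal sketch, assuming matplotlib is available):

import numpy as np
import matplotlib.pyplot as plt

def f(x):
    return (x * np.sin(x) + 2 * x) ** 2

# The curve has a deep valley at x = 0 and progressively higher
# local minima spaced roughly every pi on either side.
x = np.linspace(-12, 12, 1000)
plt.plot(x, f(x))
plt.xlabel("x")
plt.ylabel("f(x)")
plt.show()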
Answer
If you increase n_iter to 1000, it works!
The output is
global minimum: x = 0.0000, f(x) = 0.0000
Basin hopping is a stochastic algorithm, and in this case it needs more iterations to escape the starting basin. Indeed, running
import numpy as np
from scipy.optimize import basinhopping

n_iter = 850

@np.vectorize
def f(x):
    return (x * np.sin(x) + 2 * x) ** 2

x0 = -6
minimizer_kwargs = {"method": "BFGS"}

# Repeat the whole optimization to see how often it reaches the global minimum
while True:
    ret = basinhopping(f, x0, minimizer_kwargs=minimizer_kwargs, niter=n_iter)
    print("global minimum: x = %.4f, f(x) = %.4f" % (ret.x[0], ret.fun))
prints
""" global minimum: x = -7.7230, f(x0) = 60.6709 global minimum: x = -0.0000, f(x0) = 0.0000 global minimum: x = -0.0000, f(x0) = 0.0000 global minimum: x = 0.0000, f(x0) = 0.0000 global minimum: x = -0.0000, f(x0) = 0.0000 global minimum: x = 0.0000, f(x0) = 0.0000 global minimum: x = -0.0000, f(x0) = 0.0000 global minimum: x = -0.0000, f(x0) = 0.0000 global minimum: x = -0.0000, f(x0) = 0.0000 global minimum: x = -7.7230, f(x0) = 60.6709 ... """
Even with n_iter=850, the algorithm does not always find the global minimum.
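If you want reproducible runs, or fewer iterations, two basinhopping knobs worth trying are seed and stepsize: the local minima of this f are spaced roughly pi apart, so the default stepsize=0.5 means most random hops land back in the basin they started from. A sketch, assuming a scipy version where basinhopping still accepts the seed keyword (newer releases rename it to rng), with stepsize=2.0 and seed=42 as illustrative values:

import numpy as np
from scipy.optimize import basinhopping

@np.vectorize
def f(x):
    return (x * np.sin(x) + 2 * x) ** 2

x0 = -6
minimizer_kwargs = {"method": "BFGS"}

# stepsize=2.0 lets a single hop jump across a basin (the minima are
# roughly pi apart); seed fixes the random trajectory so reruns give
# identical results.
ret = basinhopping(
    f,
    x0,
    minimizer_kwargs=minimizer_kwargs,
    niter=100,
    stepsize=2.0,
    seed=42,
)
print("global minimum: x = %.4f, f(x) = %.4f" % (ret.x[0], ret.fun))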