MATH06, Nonlinear least squares problems
The Levenberg-Marquardt method for nonlinear least squares problems
In general, a least squares problem can be viewed as an optimization problem whose objective function is a sum of squared residuals. Nonlinear least squares problems have a specific structure, and several methods tailored to this structure have been developed. One example, the Levenberg-Marquardt method, is based on the idea of successively linearizing the problem at each iteration.
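To make that structure explicit, the problem and the Levenberg-Marquardt step can be written as follows (a standard textbook formulation; the damping parameter \(\lambda\) and its update rule vary between implementations):
\[
\min_{\beta} \; S(\beta) = \sum_{i=1}^{m} r_i(\beta)^2, \qquad r_i(\beta) = y_i - f(x_i, \beta),
\]
\[
\left( J^{\mathsf{T}} J + \lambda I \right) \delta = J^{\mathsf{T}} r, \qquad \beta \leftarrow \beta + \delta,
\]
where \(J\) is the Jacobian of the residual vector \(r\) with respect to \(\beta\). A small \(\lambda\) gives a Gauss-Newton step, while a large \(\lambda\) gives a short gradient-descent step. In the example below, the true model is \(f(x, \beta) = \beta_0 + \beta_1 e^{-\beta_2 x^2}\), and SciPy's optimize.leastsq, a wrapper around MINPACK's Levenberg-Marquardt implementation, recovers \(\beta\) from noisy samples.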
import numpy as np
from scipy import optimize
import matplotlib.pyplot as plt
beta = (0.25, 0.75, 0.5)
# true model : f(xdata, *beta)
def f(x, b0, b1, b2):
    return b0 + b1 * np.exp(-b2 * x**2)
xdata = np.linspace(0, 5, 50)
y = f(xdata, *beta)
# input data : ydata
ydata = y + 0.05 * np.random.randn(len(xdata))
# residual (deviation) : g(beta) = ydata - f(xdata, *beta)
def g(beta):
    return ydata - f(xdata, *beta)
# optimization for beta : beta_opt
beta_start = (1, 1, 1)
# leastsq returns the solution and an integer status flag, not a covariance matrix
beta_opt, ier = optimize.leastsq(g, beta_start)
# visualization
fig, ax = plt.subplots()
ax.scatter(xdata, ydata, label='samples')
ax.plot(xdata, y, 'r', lw=2, label='true model')
ax.plot(xdata, f(xdata, *beta_opt), 'b', lw=2, label='fitted model')
ax.set_xlim(0, 5)
ax.set_xlabel(r"$x$", fontsize=18)
ax.set_ylabel(r"$f(x, \beta)$", fontsize=18)
ax.legend()
plt.show()
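For comparison, the same fit can be expressed with the newer scipy.optimize.least_squares interface, which selects the Levenberg-Marquardt algorithm explicitly via method='lm' (a minimal sketch reusing g and beta_start from the code above):
# same problem via the newer interface; method='lm' selects Levenberg-Marquardt
result = optimize.least_squares(g, beta_start, method='lm')
print(result.x)       # fitted parameters, comparable to beta_opt
print(result.status)  # integer termination flag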
SUPPLEMENT
beta_opt
OUTPUT
: \([0.24852741, 0.77109938, 0.49358439]\)
ier
OUTPUT
: \(1\)
Note that the second value returned by optimize.leastsq is an integer status flag; values from 1 to 4 indicate that a solution was found.
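If an actual covariance estimate for the fitted parameters is wanted, scipy.optimize.curve_fit, which wraps the same MINPACK routine, returns one directly (a minimal sketch reusing f, xdata, ydata, and beta_start from the code above):
# curve_fit fits f to the data and also estimates the parameter covariance
beta_fit, beta_cov = optimize.curve_fit(f, xdata, ydata, p0=beta_start)
print(beta_fit)                    # fitted parameters
print(np.sqrt(np.diag(beta_cov)))  # one-standard-deviation parameter errors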