import numpy as np

def f(x):
    return x * x - 4 * np.sin(x)

# Initial interval; assume f(a) * f(b) < 0, so a root is bracketed.
a = 1
b = 3
tol = 1e-3

# Bisection search: halve the bracketing interval until it is shorter than tol.
while b - a > tol:
    print('a={:.5f} b={:.5f} f(a)={:.5f} f(b)={:.5f}'
          .format(a, b, f(a), f(b)))
    c = 0.5 * (b + a)
    if f(a) * f(c) < 0:  # sign change in [a, c]: root lies in the left half
        b = c
    else:                # otherwise the root lies in [c, b]
        a = c
a=1.00000 b=3.00000 f(a)=-2.36588 f(b)=8.43552
a=1.00000 b=2.00000 f(a)=-2.36588 f(b)=0.36281
a=1.50000 b=2.00000 f(a)=-1.73998 f(b)=0.36281
a=1.75000 b=2.00000 f(a)=-0.87344 f(b)=0.36281
a=1.87500 b=2.00000 f(a)=-0.30072 f(b)=0.36281
a=1.87500 b=1.93750 f(a)=-0.30072 f(b)=0.01985
a=1.90625 b=1.93750 f(a)=-0.14326 f(b)=0.01985
a=1.92188 b=1.93750 f(a)=-0.06241 f(b)=0.01985
a=1.92969 b=1.93750 f(a)=-0.02145 f(b)=0.01985
a=1.93359 b=1.93750 f(a)=-0.00085 f(b)=0.01985
a=1.93359 b=1.93555 f(a)=-0.00085 f(b)=0.00949
Given an objective function $f : \mathbb{R}^n \to \mathbb{R}$ and a set $S \subset \mathbb{R}^n$,
find $x^\ast \in S$ such that $f(x^\ast) \leq f(x)$ $\forall x \in S$.
\[ \min_x f(x_1,x_2) = 2\pi x_1(x_1 + x_2) \] \[ \text{ subject to } g(x_1,x_2) = \pi x_1^2 x_2 - V = 0 \]
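To make the constrained example concrete, here is a minimal sketch using scipy.optimize.minimize with an equality constraint; the target volume V = 10 and the initial guess are assumptions chosen for illustration, not values from the original problem.

import numpy as np
from scipy.optimize import minimize

V = 10.0  # assumed volume; any positive value works for illustration

# Surface area of a closed cylinder with radius x[0] and height x[1].
f = lambda x: 2 * np.pi * x[0] * (x[0] + x[1])

# Volume constraint g(x) = pi * x1^2 * x2 - V = 0.
g = lambda x: np.pi * x[0]**2 * x[1] - V

res = minimize(f, x0=[1.0, 1.0], constraints={'type': 'eq', 'fun': g})
print(res.x)  # at the optimum the height equals the diameter: x[1] ≈ 2 * x[0]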
scipy.optimize.fsolve()
that implements Powell's hybrid method (a combination of Newton's method and gradient descent) by calling HYBRD or HYBRJ from the Fortran library MINPACK.
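As a usage sketch, fsolve applied to the same scalar function from the bisection example above; the starting point x0 = 2 is an assumption.

import numpy as np
from scipy.optimize import fsolve

def f(x):
    return x * x - 4 * np.sin(x)

root = fsolve(f, x0=2.0)  # Powell's hybrid method via MINPACK
print(root)               # ≈ [1.93375], consistent with the bisection result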
Steepest descent (line search along the negative gradient):
1:$\hspace{0em}$choose initial guess $x_0$
2:$\hspace{0em}$for $k = 0,1,2,\ldots$ do
3:$\hspace{1.2em}$$s_k = -\nabla f(x_k)$
4:$\hspace{1.2em}$choose $\eta_k$ to minimize $f(x_k + \eta_k s_k)$
5:$\hspace{1.2em}$$x_{k+1} = x_k + \eta_k s_k$
6:$\hspace{0em}$end for
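A minimal Python sketch of this loop, assuming callables f and grad_f for the objective and its gradient (illustrative names) and using scipy.optimize.minimize_scalar for the one-dimensional line search:

import numpy as np
from scipy.optimize import minimize_scalar

def steepest_descent(f, grad_f, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        s = -grad_f(x)               # search direction: negative gradient
        if np.linalg.norm(s) < tol:  # gradient small enough: done
            break
        # 1-D line search: choose eta to minimize f(x + eta * s).
        eta = minimize_scalar(lambda eta: f(x + eta * s)).x
        x = x + eta * s
    return x

# Example: quadratic bowl with minimum at (1, 2) (illustrative test function).
f = lambda x: (x[0] - 1)**2 + 4 * (x[1] - 2)**2
grad_f = lambda x: np.array([2 * (x[0] - 1), 8 * (x[1] - 2)])
print(steepest_descent(f, grad_f, [0.0, 0.0]))  # ≈ [1., 2.]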
Newton's method:
1:$\hspace{0em}$choose initial guess $x_0$
2:$\hspace{0em}$for $k = 0,1,2,\ldots$ do
3:$\hspace{1.2em}$solve $H_f(x_k)s_k = -\nabla f(x_k)$
4:$\hspace{1.2em}$$x_{k+1} = x_k + s_k$
5:$\hspace{0em}$end for
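A minimal sketch of the Newton iteration, assuming callables grad_f and hess_f returning the gradient and Hessian (both names illustrative); each step solves the linear system rather than inverting the Hessian:

import numpy as np

def newton_method(grad_f, hess_f, x0, tol=1e-8, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:         # stationary point reached
            break
        s = np.linalg.solve(hess_f(x), -g)  # solve H_f(x_k) s_k = -grad f(x_k)
        x = x + s
    return x

# On a quadratic the Hessian is exact, so one step suffices (illustrative).
hess_f = lambda x: np.array([[2.0, 0.0], [0.0, 8.0]])
grad_f = lambda x: np.array([2 * (x[0] - 1), 8 * (x[1] - 2)])
print(newton_method(grad_f, hess_f, [0.0, 0.0]))  # [1. 2.]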
BFGS quasi-Newton method (maintains a Hessian approximation $B_k$):
1:$\hspace{0em}$choose initial guess $x_0$
2:$\hspace{0em}$choose $B_0$, initial guess for Hessian, e.g. $B_0 = {\rm I}$
3:$\hspace{0em}$for $k = 0,1,2,\ldots$ do
4:$\hspace{1.2em}$solve $B_k s_k = -\nabla f(x_k)$
5:$\hspace{1.2em}$$x_{k+1} = x_k + s_k$
6:$\hspace{1.2em}$$y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$
7:$\hspace{1.2em}$$B_{k+1} = B_k + \Delta B_k$
8:$\hspace{0em}$end for
where $\Delta B_k = \frac{y_ky_k^T}{y_k^Ts_k} - \frac{B_k s_k s_k^T B_k}{s_k^T B_ks_k}$
scipy.optimize.fmin_bfgs()
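A minimal usage sketch on an assumed test quadratic; the gradient argument fprime is optional and is approximated by finite differences when omitted:

import numpy as np
from scipy.optimize import fmin_bfgs

f = lambda x: (x[0] - 1)**2 + 4 * (x[1] - 2)**2
grad_f = lambda x: np.array([2 * (x[0] - 1), 8 * (x[1] - 2)])

x_min = fmin_bfgs(f, x0=[0.0, 0.0], fprime=grad_f)
print(x_min)  # ≈ [1., 2.]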
BFGS with inverse-Hessian updates (maintains $H_k \approx H_f^{-1}$, so no linear solve is needed):
1:$\hspace{0em}$choose initial guess $x_0$
2:$\hspace{0em}$choose $H_0$, initial guess for inverse Hessian, e.g. $H_0 = {\rm I}$
3:$\hspace{0em}$for $k = 0,1,2,\ldots$ do
4:$\hspace{1.2em}$$s_k = - H_k \nabla f(x_k)$
5:$\hspace{1.2em}$$x_{k+1} = x_k + s_k$
6:$\hspace{1.2em}$$y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$
7:$\hspace{1.2em}$$H_{k+1} = (I-\rho_k s_k y_k^T)H_k(I-\rho_k y_k s_k^T) + \rho_k s_k s_k^T$
8:$\hspace{0em}$end for
where $\rho_k=\frac{1}{y_k^T s_k}$
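A minimal NumPy sketch of this variant, assuming a callable grad_f (illustrative name); note it omits the line search that production implementations such as fmin_bfgs add for robustness:

import numpy as np

def bfgs_inverse(grad_f, x0, tol=1e-8, max_iter=100):
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                  # initial inverse-Hessian guess H_0 = I
    g = grad_f(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        s = -H @ g                 # step: s_k = -H_k grad f(x_k)
        x = x + s
        g_new = grad_f(x)
        y = g_new - g              # gradient change y_k
        rho = 1.0 / (y @ s)        # rho_k = 1 / (y_k^T s_k)
        I = np.eye(n)
        # BFGS update of the inverse Hessian, as in line 7 above.
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
        g = g_new
    return x

grad_f = lambda x: np.array([2 * (x[0] - 1), 8 * (x[1] - 2)])
print(bfgs_inverse(grad_f, [0.0, 0.0]))  # ≈ [1., 2.] on the quadratic above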
scipy.optimize.linprog
uses the HiGHS library, which provides high-performance simplex and interior-point solvers for linear programs.
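A minimal usage sketch on a small assumed LP, minimizing $c^T x$ subject to $A_{ub} x \leq b_{ub}$ with the default bounds $x \geq 0$:

from scipy.optimize import linprog

# Maximize x1 + 2*x2 by minimizing its negative, subject to
# x1 + x2 <= 4 and x1 - x2 <= 2 (example data chosen for illustration).
c = [-1, -2]
A_ub = [[1, 1], [1, -1]]
b_ub = [4, 2]

res = linprog(c, A_ub=A_ub, b_ub=b_ub)
print(res.x, res.fun)  # optimum [0., 4.] with objective -8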