\begin{equation}
f(x)=\begin{bmatrix} f_1=x_1^2+2x_2\\\\ f_2=3x_1+4x_2^2\end{bmatrix},\quad
J=\begin{bmatrix} \frac{\partial f_1}{\partial x_1}&\frac{\partial f_1}{\partial x_2}\\\\ \frac{\partial f_2}{\partial x_1}&\frac{\partial f_2}{\partial x_2}\end{bmatrix}
=\begin{bmatrix} 2x_1&2\\\\ 3&8x_2\end{bmatrix}
\end{equation}
- Assume $x_1=1,\ x_2=2$. Substituting into the Jacobian matrix gives:
\begin{equation}
J=\begin{bmatrix} 2x_1&2\\\\ 3&8x_2\end{bmatrix}=\begin{bmatrix} 2&2\\\\ 3&16\end{bmatrix}
\end{equation}
- Verify with PyTorch:
```python
import torch

# define f(x)
def f(x):
    return torch.stack([x[0] ** 2 + 2 * x[1], 3 * x[0] + 4 * x[1] ** 2])

x = torch.tensor([1.0, 2.0], dtype=torch.float, requires_grad=True)
# compute the full Jacobian of f at x in one call
jacobian_matrix = torch.autograd.functional.jacobian(f, x)
print(f"jacobian_matrix=\n{jacobian_matrix}")
```
- Output:
```
jacobian_matrix=
tensor([[ 2.,  2.],
        [ 3., 16.]])
```
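The same Jacobian can also be assembled by hand, one row at a time, with `torch.autograd.grad`: differentiating each output component $f_i$ with respect to $x$ yields row $i$ of $J$. This is a minimal sketch (not part of the original post) that cross-checks the result above:

```python
import torch

def f(x):
    return torch.stack([x[0] ** 2 + 2 * x[1], 3 * x[0] + 4 * x[1] ** 2])

x = torch.tensor([1.0, 2.0], requires_grad=True)
y = f(x)

# Row i of the Jacobian is the gradient of output f_i w.r.t. x.
# retain_graph=True keeps the graph alive for the next grad call.
rows = [torch.autograd.grad(y[i], x, retain_graph=True)[0] for i in range(len(y))]
J = torch.stack(rows)
print(J)  # tensor([[ 2.,  2.], [ 3., 16.]])
```

This row-by-row approach makes the definition explicit, but `torch.autograd.functional.jacobian` is the more convenient choice for larger outputs.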