`jacobian` takes the function and the function's input:
import torch
from torch.autograd.functional import jacobian
def func(x):
    return x.exp().sum(dim=1)
x = torch.randn(2, 3)
y = func(x)
print(x)
'''
tensor([[-0.2497, -0.8842, 0.6314],
[-0.0687, -1.5360, 1.4695]])
'''
print(y) # tensor([3.0724, 5.4959])
# exp(-0.2497)+exp(-0.8842)+exp(0.6314)=3.0724
# exp(-0.0687)+exp(-1.5360)+exp(1.4695)=5.4959
print(jacobian(func, x))
The output is:
tensor([[-0.2497, -0.8842, 0.6314],
[-0.0687, -1.5360, 1.4695]])
tensor([3.0724, 5.4959])
tensor([[[0.7791, 0.4130, 1.8803],
[0.0000, 0.0000, 0.0000]],
[[0.0000, 0.0000, 0.0000],
[0.9336, 0.2152, 4.3470]]])
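To make the example reproducible, the snippet below fixes the input to the printed values instead of sampling it randomly (a minimal sketch; the function and values are from the example above):

```python
import torch
from torch.autograd.functional import jacobian

def func(x):
    # Sum of exp over each row: input (2, 3) -> output (2,)
    return x.exp().sum(dim=1)

# Fixed input, taken from the printed tensor above.
x = torch.tensor([[-0.2497, -0.8842, 0.6314],
                  [-0.0687, -1.5360, 1.4695]])
J = jacobian(func, x)
print(J.shape)  # torch.Size([2, 2, 3]): output shape (2,) followed by input shape (2, 3)
```

Note the Jacobian's shape is the output shape concatenated with the input shape, which is why the result is a 2×2×3 tensor rather than a flat matrix.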
Denote:
y1 = exp(-0.2497) + exp(-0.8842) + exp(0.6314) = 3.0724
y2 = exp(-0.0687) + exp(-1.5360) + exp(1.4695) = 5.4959
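As a quick check of this arithmetic, summing exp over each row reproduces y (using the printed values, which are rounded to four decimals):

```python
import torch

# Rounded input values from the printed tensor above.
x = torch.tensor([[-0.2497, -0.8842, 0.6314],
                  [-0.0687, -1.5360, 1.4695]])
y = x.exp().sum(dim=1)
print(y)  # approximately tensor([3.0724, 5.4959])
```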
In the Jacobian matrix of the output, the first row is:
∂y1/∂x11, ∂y1/∂x12, ∂y1/∂x13
i.e. exp(-0.2497), exp(-0.8842), exp(0.6314) respectively.
The second row is:
∂y1/∂x21, ∂y1/∂x22, ∂y1/∂x23
These are all 0, because y1 depends only on the first row of x. By the same pattern, the remaining 2×3 block holds the partial derivatives of y2 with respect to x.
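The block structure described above can be verified directly: since each output y_i depends only on row i of the input, the off-diagonal blocks are exactly zero and the diagonal blocks are exp(x[i]) (a sketch using the same func as above, with a random input):

```python
import torch
from torch.autograd.functional import jacobian

def func(x):
    return x.exp().sum(dim=1)

x = torch.randn(2, 3)
J = jacobian(func, x)  # shape (2, 2, 3)

# J[i, j] = d y_i / d x[j]: exp(x[i]) when i == j, zero otherwise.
assert torch.allclose(J[0, 0], x[0].exp())
assert torch.allclose(J[1, 1], x[1].exp())
assert torch.equal(J[0, 1], torch.zeros(3))
assert torch.equal(J[1, 0], torch.zeros(3))
```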