As shown below:
with tf.GradientTape(persistent=True) as tape:  # persistent=True allows gradient() to be called more than once
    z1 = f(w1, w2 + 2.)
    z2 = f(w1, w2 + 5.)
    z3 = f(w1, w2 + 7.)

# Differentiate each output separately: three [dz/dw1, dz/dw2] pairs
[tape.gradient(z, [w1, w2]) for z in (z1, z2, z3)]
Output:
[[<tf.Tensor: id=56906, shape=(), dtype=float32, numpy=40.0>,
  <tf.Tensor: id=56898, shape=(), dtype=float32, numpy=10.0>],
 [<tf.Tensor: id=56919, shape=(), dtype=float32, numpy=46.0>,
  <tf.Tensor: id=56911, shape=(), dtype=float32, numpy=10.0>],
 [<tf.Tensor: id=56932, shape=(), dtype=float32, numpy=50.0>,
  <tf.Tensor: id=56924, shape=(), dtype=float32, numpy=10.0>]]
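For reference, here is a minimal self-contained sketch that reproduces these numbers. The definitions of f, w1 and w2 are not shown in this excerpt, so the ones below (f(w1, w2) = 3*w1**2 + 2*w1*w2 with w1 = 5, w2 = 3) are an assumption; they happen to match the printed gradients:

import tensorflow as tf

# Assumed definitions (not shown in this excerpt); they reproduce the values above.
w1, w2 = tf.Variable(5.), tf.Variable(3.)

def f(w1, w2):
    return 3 * w1 ** 2 + 2 * w1 * w2

with tf.GradientTape(persistent=True) as tape:  # persistent tape: gradient() may be called repeatedly
    z1 = f(w1, w2 + 2.)
    z2 = f(w1, w2 + 5.)
    z3 = f(w1, w2 + 7.)

# One gradient call per output -> three separate [dz/dw1, dz/dw2] pairs
print([tape.gradient(z, [w1, w2]) for z in (z1, z2, z3)])
del tape  # release the persistent tape's resources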
If instead the gradient is taken of the whole list z = [z1, z2, z3] in a single call:

with tf.GradientTape(persistent=True) as tape:
    z1 = f(w1, w2 + 2.)
    z2 = f(w1, w2 + 5.)
    z3 = f(w1, w2 + 7.)
    z = [z1, z2, z3]

# A list of targets yields one summed gradient, not one gradient per target
tape.gradient(z, [w1, w2])
Output:
[<tf.Tensor: id=57075, shape=(), dtype=float32, numpy=136.0>,
<tf.Tensor: id=57076, shape=(), dtype=float32, numpy=30.0>]
Summary: when you differentiate a list z = [z1, z2, z3], the result is automatically summed over the list; it does not return the separate gradients of z1, z2 and z3 with respect to [w1, w2].
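A quick way to check this (using the same assumed f, w1 and w2 as in the sketch above): differentiating the explicit sum z1 + z2 + z3 gives the same [136.0, 30.0], and indeed 136 = 40 + 46 + 50 while 30 = 10 + 10 + 10.

with tf.GradientTape() as tape:
    z1 = f(w1, w2 + 2.)
    z2 = f(w1, w2 + 5.)
    z3 = f(w1, w2 + 7.)
    z_sum = z1 + z2 + z3

# Same result as tape.gradient([z1, z2, z3], [w1, w2]):
# gradients of a list of targets are summed, not returned per target.
print(tape.gradient(z_sum, [w1, w2]))  # [136.0, 30.0]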
Bonus: the @ matrix-multiplication operator in Python/NumPy
As shown below:
import numpy as np

A = np.matrix('3 1; 8 2')
B = np.matrix('6 1; 7 9')
A @ B
matrix([[25, 12],
        [62, 26]])
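As a side note, @ is not tied to np.matrix (which NumPy now discourages in favor of plain arrays): it calls __matmul__, so on np.array objects it is equivalent to np.matmul. A small sketch:

import numpy as np

A = np.array([[3, 1], [8, 2]])
B = np.array([[6, 1], [7, 9]])

print(A @ B)                                 # [[25 12]
                                             #  [62 26]]
print(np.allclose(A @ B, np.matmul(A, B)))   # True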
That is all of this walkthrough of TensorFlow Autodiff automatic differentiation; hopefully it gives you a useful reference, and please continue to support 服务器之家.
Original article: https://www.cnblogs.com/yaos/p/12753268.html