What is the most efficient way to flatten a 2D tensor which is actually a horizontal or vertical vector into a 1D tensor?
Is there a difference in performance between tf.reshape(w, [-1]) and tf.squeeze(w)?
1 Answer
#1
40
Both tf.reshape(w, [-1]) and tf.squeeze(w) are "cheap" in that they operate only on the metadata (i.e. the shape) of the given tensor and don't modify the data itself. Of the two, tf.reshape() has slightly simpler logic internally, but the performance of the two should be indistinguishable.
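A minimal sketch, assuming TensorFlow 2.x with eager execution, showing that both calls produce the same 1-D result for a row vector. Note that tf.squeeze() only removes size-1 dimensions, so it flattens a (1, 4) or (4, 1) tensor but not a genuinely 2-D shape such as (2, 2), whereas tf.reshape(w, [-1]) flattens any shape:

import tensorflow as tf

# A "horizontal vector": shape (1, 4)
w = tf.constant([[1.0, 2.0, 3.0, 4.0]])

flat_reshape = tf.reshape(w, [-1])  # flattens any shape to 1-D
flat_squeeze = tf.squeeze(w)        # removes all size-1 dimensions

print(flat_reshape.shape)  # (4,)
print(flat_squeeze.shape)  # (4,)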