What is the difference between variable_scope and name_scope? The variable scope tutorial mentions that variable_scope implicitly opens a name_scope. I also noticed that creating a variable in a name_scope automatically expands its name with the scope name as well. So what is the difference?
2 Answers
#1
39
When you create a variable with tf.get_variable instead of tf.Variable, TensorFlow will start checking the names of the variables created with the same method to see if they collide. If they do, an exception will be raised. If you created a variable with tf.get_variable and you try to change the prefix of its name by using the tf.name_scope context manager, this won't prevent TensorFlow from raising an exception. Only the tf.variable_scope context manager will effectively change the name of your variable in this case. Alternatively, if you want to reuse the variable, you should call scope.reuse_variables() before creating it the second time.
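The bookkeeping described above can be illustrated with a toy model. The following sketch is not TensorFlow — VariableStore and its fields are hypothetical — but it mimics the rule that a second tf.get_variable call with the same fully-qualified name raises unless reuse is enabled, and that only a variable_scope-style prefix (never a name_scope) participates in that name:

```python
# Toy sketch (NOT TensorFlow) of tf.get_variable's collision/reuse rule.
# `prefix` stands in for the current variable_scope; a name_scope would
# never touch it, which is why name_scope cannot avoid the collision.

class VariableStore:
    def __init__(self):
        self.vars = {}      # fully-qualified name -> value
        self.prefix = ""    # set by variable_scope only
        self.reuse = False  # toggled by scope.reuse_variables()

    def get_variable(self, name, value=None):
        full = self.prefix + name
        if full in self.vars:
            if not self.reuse:
                raise ValueError("Variable %s already exists" % full)
            return self.vars[full]   # reuse the existing variable
        self.vars[full] = value      # first creation: register it
        return value

store = VariableStore()
store.prefix = "model/"              # like `with tf.variable_scope('model'):`
store.get_variable("w", 1.0)         # creates "model/w"

try:
    store.get_variable("w", 2.0)     # second creation without reuse
except ValueError as e:
    print(e)                         # collision is reported

store.reuse = True                   # like scope.reuse_variables()
print(store.get_variable("w"))       # returns the original value
```

Under this toy rule, entering a name_scope between the two get_variable calls would leave `prefix` (and therefore `full`) unchanged, so the exception still fires — matching the behavior described above.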
In summary, tf.name_scope just adds a prefix to all tensors created in that scope (except the variables created with tf.get_variable), and tf.variable_scope adds a prefix to the variables created with tf.get_variable as well.
#2
37
I had problems understanding the difference between variable_scope and name_scope (they looked almost the same) until I tried to visualize everything with a simple example:
import tensorflow as tf

def scoping(fn, scope1, scope2, vals):
    with fn(scope1):
        a = tf.Variable(vals[0], name='a')
        b = tf.get_variable('b', initializer=vals[1])
        c = tf.constant(vals[2], name='c')
        with fn(scope2):
            d = tf.add(a * b, c, name='res')
        print('\n  '.join([scope1, a.name, b.name, c.name, d.name]), '\n')
    return d

d1 = scoping(tf.variable_scope, 'scope_vars', 'res', [1, 2, 3])
d2 = scoping(tf.name_scope,     'scope_name', 'res', [1, 2, 3])

with tf.Session() as sess:
    writer = tf.summary.FileWriter('logs', sess.graph)
    sess.run(tf.global_variables_initializer())
    print(sess.run([d1, d2]))
    writer.close()
Here I create a function that creates some variables and constants and groups them in scopes (depending on the scope type I provided). In this function I also print the names of all the variables. After that I execute the graph to get the resulting values and save event files to investigate them in TensorBoard. If you run this, you will get the following:
scope_vars
scope_vars/a:0
scope_vars/b:0
scope_vars/c:0
scope_vars/res/res:0
scope_name
scope_name/a:0
b:0
scope_name/c:0
scope_name/res/res:0
You see a similar pattern if you open TensorBoard (as you can see, b is outside the scope_name rectangle):
This gives you the answer:
Now you see that tf.variable_scope() adds a prefix to the names of all variables (no matter how you create them), ops, and constants. On the other hand, tf.name_scope() ignores variables created with tf.get_variable() because it assumes that you know which variable, and in which scope, you want to use.
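The naming rule observed in the output above can be condensed into one hypothetical helper. This is a sketch of the rule for a single enclosing scope, not TensorFlow's real implementation; the function name and signature are made up for illustration:

```python
# Hypothetical helper summarizing the naming rule: name_scope prefixes
# everything EXCEPT variables from tf.get_variable, while variable_scope
# prefixes everything (it implicitly opens a matching name_scope too).

def full_name(name, scope, scope_kind, via_get_variable):
    """Fully-qualified name for `name` created inside one scope of the
    given kind ('name_scope' or 'variable_scope')."""
    if scope_kind == "name_scope" and via_get_variable:
        return name                 # name_scope is ignored here
    return scope + "/" + name       # every other case gets the prefix

# Reproduces the printed pattern above:
print(full_name("a", "scope_name", "name_scope", False))      # scope_name/a
print(full_name("b", "scope_name", "name_scope", True))       # b
print(full_name("b", "scope_vars", "variable_scope", True))   # scope_vars/b
```

The one-branch rule is exactly why b:0 is the only name in the output that escaped its scope prefix.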
The documentation on sharing variables tells you that:
tf.variable_scope(): Manages namespaces for names passed to tf.get_variable().
The same documentation provides more details on how variable scope works and when it is useful.