TensorFlow's control_dependencies function

tf.control_dependencies(tasklist) makes the operations in tasklist prerequisites (control dependencies) of the operations created inside its scope. It is typically used like this:

with tf.control_dependencies(tasklist):
    trainOp = tf.no_op()

Then, when this trainOp is executed (i.e. tf.Session().run(trainOp)), the operations contained in tasklist are executed first.
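
As a minimal, self-contained illustration of this pattern (my own TensorFlow 1.x sketch, not from the original post; the variable counter and its increment op are invented for the example):

import tensorflow as tf

counter = tf.Variable(0)
increment = tf.assign_add(counter, 1)   # here tasklist is just [increment]

with tf.control_dependencies([increment]):
    trainOp = tf.no_op()                # does nothing itself, only carries the dependency

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(trainOp)                   # running trainOp first runs increment
    print(sess.run(counter))            # expected: 1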

However, you cannot simply assign some other, pre-existing operation inside the with block; if you do, the operations in tasklist are no longer executed. For example, suppose there is an operation task1 that is not in tasklist and has nothing to do with it. If task1 is assigned directly to trainOp inside the with block, the operations in tasklist will not be executed:

with tf.control_dependencies(tasklist):
    trainOp = task1

This can be fixed by putting the newly added operation together with the no_op into a list, so that the operations in tasklist are still executed:

with tf.control_dependencies(tasklist):
    trainOp = tf.no_op()
    trainOp = [trainOp, task1]

Here is a piece of verification code:

import tensorflow as tf

a = tf.Variable(2)
selfAdd = tf.Variable(0)
selfAddition = tf.assign_add(selfAdd, 3)
selfSub = tf.Variable(0)
selfSubtraction = tf.assign_sub(selfSub, 2)
b = tf.multiply(a, selfAdd)

with tf.control_dependencies([selfAddition]):
    train_op = tf.no_op()                   # statement 1, explained below
    print(train_op)
    train_op = [train_op, selfSubtraction]  # statement 2, explained below
    print(train_op)

with tf.Session() as sess:
    init = tf.global_variables_initializer()
    sess.run(init)
    for i in range(20):
        sess.run(train_op)                  # statement 3, explained below
        print("selfAdd:", sess.run(selfAdd))
    ra = sess.run(selfAdd)
    rb = sess.run(b)
    print('@end selfAdd:', ra)
    rs = sess.run(selfSub)
    print('@end selfSub:', rs)
    print('b:', rb)

selfAddition is an operation that adds to the variable selfAdd, and selfSubtraction is an operation that subtracts from the variable selfSub. "Statement 1" creates a train_op that does nothing by itself, but it attaches the prerequisite operation, namely the selfAddition passed as the argument to control_dependencies. If you want to run further operations in addition to that prerequisite, you can follow "statement 2" and append them to a list. As a result, "statement 3" runs both selfAddition and selfSubtraction on every iteration.
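
For what it's worth, the same effect can also be obtained without the no_op-plus-list trick by creating a tf.group op inside the with block (a sketch of my own, not part of the original experiment; because the group op is created inside the block, it picks up the control dependency as well):

import tensorflow as tf

selfAdd = tf.Variable(0)
selfAddition = tf.assign_add(selfAdd, 3)
selfSub = tf.Variable(0)
selfSubtraction = tf.assign_sub(selfSub, 2)

with tf.control_dependencies([selfAddition]):
    # tf.group is created inside the block, so running it runs
    # selfSubtraction and, through the dependency, selfAddition too.
    train_op = tf.group(selfSubtraction)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(train_op)
    print(sess.run(selfAdd), sess.run(selfSub))  # expected: 3 -2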

Having to add the extra operations only after the no_op seemed counterintuitive to me, and I was not sure whether I had something wrong, so below are my test code and its output:

import tensorflow as tf

a = tf.Variable(2)
selfAdd = tf.Variable(0)
selfAddition = tf.assign_add(selfAdd, 3)
selfSub = tf.Variable(0)
selfSubtraction = tf.assign_sub(selfSub, 2)

b = tf.multiply(a, selfAdd)

with tf.control_dependencies([selfAddition]):
    train_op = selfSubtraction
    print(train_op)

with tf.Session() as sess:
    init = tf.global_variables_initializer()
    sess.run(init)
    for i in range(20):
        sess.run(train_op)
        print("selfAdd:", sess.run(selfAdd))
    ra = sess.run(selfAdd)
    rb = sess.run(b)
    print('@end selfAdd:', ra)
    rs = sess.run(selfSub)
    print('@end selfSub:', rs)
    print('b:', rb)

Output:

Tensor("AssignSub:0", shape=(), dtype=int32_ref)
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
@end selfAdd: 0
@end selfSub: -40
b: 0

After some analysis, I found that the operation has to be defined (created) directly inside the with tf.control_dependencies block; only then does running that operation also trigger the operations in the dependencies list. Defining the operation outside and merely re-assigning it to another name inside the with block has no effect. Code 1 below works; Code 2 does not.

Code 1:

import tensorflow as tf

a = tf.Variable(2)
selfAdd = tf.Variable(0)
selfAddition = tf.assign_add(selfAdd, 3)
selfSub = tf.Variable(0)
# selfSubtraction1 = tf.assign_sub(selfSub, 2)
# print('op 1:', selfSubtraction1)

with tf.control_dependencies([selfAddition]):
    selfSubtraction = tf.assign_sub(selfSub, 2)
    print('op:', selfSubtraction)

with tf.Session() as sess:
    init = tf.global_variables_initializer()
    sess.run(init)
    for i in range(20):
        sess.run(selfSubtraction)
        print("selfAdd:", sess.run(selfAdd))
    ra = sess.run(selfAdd)
    print('@end selfAdd:', ra)
    rs = sess.run(selfSub)
    print('@end selfSub:', rs)
Code 2:

import tensorflow as tf

a = tf.Variable(2)
selfAdd = tf.Variable(0)
selfAddition = tf.assign_add(selfAdd, 3)
selfSub = tf.Variable(0)
selfSubtraction1 = tf.assign_sub(selfSub, 2)
print('op 1:', selfSubtraction1)

with tf.control_dependencies([selfAddition]):
    selfSubtraction = selfSubtraction1  # tf.assign_sub(selfSub, 2)
    print('op:', selfSubtraction)

with tf.Session() as sess:
    init = tf.global_variables_initializer()
    sess.run(init)
    for i in range(20):
        sess.run(selfSubtraction)
        print("selfAdd:", sess.run(selfAdd))
    ra = sess.run(selfAdd)
    print('@end selfAdd:', ra)
    rs = sess.run(selfSub)
    print('@end selfSub:', rs)
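
A quick way to see the difference between Code 1 and Code 2 programmatically (my own check, not from the original post) is to inspect the control_inputs list of the underlying op: the assign op created inside the with block carries the AssignAdd op as a control input, while the one created outside does not.

import tensorflow as tf

selfAdd = tf.Variable(0)
selfAddition = tf.assign_add(selfAdd, 3)
selfSub = tf.Variable(0)

outside_op = tf.assign_sub(selfSub, 2)       # created outside: no dependency attached

with tf.control_dependencies([selfAddition]):
    inside_op = tf.assign_sub(selfSub, 2)    # created inside: dependency attached

print(outside_op.op.control_inputs)          # expected: []
print(inside_op.op.control_inputs)           # expected: [<tf.Operation 'AssignAdd' ...>]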
 

Reference:

https://www.cnblogs.com/qjoanven/p/7736025.html