I've recently been reading the Keras implementation of RetinaNet and ran into this usage in the focal loss code, so I'm writing it down here.
First, its general usage.
1. tf.where(tensor)
Here tensor is a boolean tensor, and tf.where returns the indices of its True elements.
import tensorflow as tf
import numpy as np

sess = tf.Session()  # TF 1.x graph mode
a = np.array([[1, 0, 0], [0, 1, 1]])
# Returns the indices of the elements of a that equal 1: a[0,0], a[1,1], a[1,2]
print(sess.run(tf.where(tf.equal(a, 1))))
# [[0 0]
#  [1 1]
#  [1 2]]
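A natural follow-up (not in the original note, just an illustration): the index matrix returned by this form can be passed to tf.gather_nd to pull out the matching elements themselves. A minimal sketch in the same TF 1.x style:

import tensorflow as tf
import numpy as np

sess = tf.Session()
a = np.array([[1, 0, 0], [0, 1, 1]])
idx = tf.where(tf.equal(a, 1))   # shape [num_true, 2]: one (row, col) pair per True element
vals = tf.gather_nd(a, idx)      # look up the elements at those indices
print(sess.run(vals))
# [1 1 1]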
2. tf.where(tensor, a, b)
Here a and b are tensors with the same shape as tensor. The result takes the element from a wherever tensor is True and the element from b wherever it is False. An example:
import tensorflow as tf
import numpy as np

sess = tf.Session()
a = np.array([[1, 0, 0], [0, 1, 1]])
a1 = np.array([[3, 2, 3], [4, 5, 6]])
print(sess.run(tf.equal(a, 1)))
# [[ True False False]
#  [False True True]]
print(sess.run(tf.where(tf.equal(a, 1), a1, 1 - a1)))
# [[ 3 -1 -2]
#  [-3  5  6]]
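As an aside (assuming TensorFlow 2.x, which the snippets above do not use), both forms work the same way in eager mode, just without a Session:

import tensorflow as tf  # assumed TF 2.x, eager execution
import numpy as np

a = np.array([[1, 0, 0], [0, 1, 1]])
a1 = np.array([[3, 2, 3], [4, 5, 6]])
print(tf.where(tf.equal(a, 1)).numpy())               # index form
print(tf.where(tf.equal(a, 1), a1, 1 - a1).numpy())   # three-argument form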
Where this shows up in practice: the focal loss in RetinaNet
alpha_factor = keras.backend.ones_like(labels) * alpha
alpha_factor = backend.where(keras.backend.equal(labels, 1), alpha_factor, 1 - alpha_factor)
What do these two lines do? (backend.where here is keras-retinanet's own wrapper, which behaves like tf.where.) A small reproduction:
import tensorflow as tf
import keras

sess = tf.Session()
labels = [[1], [0]]
alpha = 0.25
alpha_factor = [[alpha], [alpha]]   # value used where labels == 1
b = [[1 - alpha], [1 - alpha]]      # value used where labels != 1
alpha_factor = tf.where(keras.backend.equal(labels, 1), alpha_factor, b)
print(sess.run(alpha_factor))
# [[ 0.25]
#  [ 0.75]]
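So anchors with labels == 1 get weight alpha = 0.25 and the rest get 1 - alpha = 0.75. For context, here is a rough sketch of how this factor is usually combined with the modulating term in a RetinaNet-style focal loss. This is my own simplified illustration based on my reading of keras-retinanet, not its verbatim code; classification (the predicted probabilities) and gamma are assumed names that do not appear in the quoted snippet above:

import tensorflow as tf
import keras.backend as K

def focal_loss_sketch(labels, classification, alpha=0.25, gamma=2.0):
    # alpha weighting: alpha for positive anchors, 1 - alpha for negative anchors
    alpha_factor = K.ones_like(labels) * alpha
    alpha_factor = tf.where(K.equal(labels, 1), alpha_factor, 1 - alpha_factor)

    # modulating factor: (1 - p)^gamma for positives, p^gamma for negatives,
    # again selected with tf.where
    focal_weight = tf.where(K.equal(labels, 1), 1 - classification, classification)
    focal_weight = alpha_factor * focal_weight ** gamma

    # per-element weighted binary cross-entropy
    return focal_weight * K.binary_crossentropy(labels, classification)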