1. Function prototype
tf.reduce_mean(
    input_tensor,
    axis=None,
    keepdims=None,
    name=None,
    reduction_indices=None,
    keep_dims=None
)
Parameters:
input_tensor: the tensor whose mean is to be computed.
axis: the dimensions to reduce. If None, all dimensions are reduced and the mean over all elements is returned. If given, each value must lie in the range [-rank(input_tensor), rank(input_tensor)).
keepdims: unless keepdims is True, the rank of the tensor is reduced by 1 for each entry in axis; if keepdims is True, the reduced dimensions are retained with length 1.
name: optional name for the operation.
reduction_indices: deprecated alias for axis.
keep_dims: deprecated alias for keepdims.
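For a quick feel of the default behaviour (a minimal sketch of my own, not part of the original write-up): with axis left as None, every dimension is reduced and a single scalar comes back.
import tensorflow as tf
import numpy as np

a = np.arange(24, dtype=np.float32).reshape([2, 2, 2, 3])
m_all = tf.reduce_mean(a)        # axis=None: mean over all 24 elements
with tf.Session() as sess:
    print(sess.run(m_all))       # 11.5, i.e. (0 + 1 + ... + 23) / 24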
2. Examples
(1) Averaging over a single dimension
Let's work with 4-D data, since in computer vision the data shape is typically [batch_size, height, width, channels]:
import tensorflow as tf
import numpy as np

# a 4-D tensor of shape [batch_size, height, width, channels] = [2, 2, 2, 3]
a = np.arange(24, dtype=np.float32).reshape([2, 2, 2, 3])
a_0 = tf.reduce_mean(a, [0])
a_1 = tf.reduce_mean(a, [1])
a_2 = tf.reduce_mean(a, [2])
a_3 = tf.reduce_mean(a, [3])
with tf.Session() as sess:
    print("original")
    print(a)
    print("reduce mean at axis=0:")
    print(sess.run(a_0))
    print("reduce mean at axis=1:")
    print(sess.run(a_1))
    print("reduce mean at axis=2:")
    print(sess.run(a_2))
    print("reduce mean at axis=3:")
    print(sess.run(a_3))
Results: from the output below you can see that with axis=0 the mean is taken over batch_size, i.e., the two samples are averaged element-wise. With axis=1 the mean is taken over height: taking the first sample's first channel as a matrix, [[0, 3], [6, 9]], averaging along the height gives (0+6)/2 = 3 and (3+9)/2 = 6. The other axes follow the same pattern.
original
[[[[ 0. 1. 2.]
[ 3. 4. 5.]]
[[ 6. 7. 8.]
[ 9. 10. 11.]]]
[[[12. 13. 14.]
[15. 16. 17.]]
[[18. 19. 20.]
[21. 22. 23.]]]]
reduce mean at axis=0:
[[[ 6. 7. 8.]
[ 9. 10. 11.]]
[[12. 13. 14.]
[15. 16. 17.]]]
reduce mean at axis=1:
[[[ 3. 4. 5.]
[ 6. 7. 8.]]
[[15. 16. 17.]
[18. 19. 20.]]]
reduce mean at axis=2:
[[[ 1.5 2.5 3.5]
[ 7.5 8.5 9.5]]
[[13.5 14.5 15.5]
[19.5 20.5 21.5]]]
reduce mean at axis=3:
[[[ 1. 4.]
[ 7. 10.]]
[[13. 16.]
[19. 22.]]]
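As a quick sanity check (my own addition, not part of the original example), tf.reduce_mean behaves like numpy's mean, so the arrays above can be reproduced directly with np.mean:
import numpy as np

a = np.arange(24, dtype=np.float32).reshape([2, 2, 2, 3])
print(np.mean(a, axis=0))   # same values as "reduce mean at axis=0" above
print(np.mean(a, axis=3))   # same values as "reduce mean at axis=3" above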
You can try keepdims=True yourself: it simply keeps the reduced dimension with size 1. Say the original shape is [2, 2, 2, 3] and we take the mean over axis=0: with keepdims=False the result has shape [2, 2, 3], while with keepdims=True it has shape [1, 2, 2, 3].
import tensorflow as tf
import numpy as np

a = np.arange(24, dtype=np.float32).reshape([2, 2, 2, 3])
a_0 = tf.reduce_mean(a, [0])
a_00 = tf.reduce_mean(a, [0], keepdims=True)
with tf.Session() as sess:
    print("reduce mean at axis=0, keepdims=True:")
    print(sess.run(tf.shape(a_00)))
    print("reduce mean at axis=0, keepdims=False:")
    print(sess.run(tf.shape(a_0)))
Results:
reduce mean at axis=0, keepdims=True:
[1 2 2 3]
reduce mean at axis=0, keepdims=False:
[2 2 3]
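One place where keepdims matters in practice (a hedged sketch of my own, not from the original post): broadcasting. After reducing over the channel axis, the [2, 2, 2] result cannot be broadcast against the original [2, 2, 2, 3] tensor, but the keepdims version of shape [2, 2, 2, 1] can, e.g. when centering each pixel by its channel mean.
import tensorflow as tf
import numpy as np

a = np.arange(24, dtype=np.float32).reshape([2, 2, 2, 3])
x = tf.constant(a)
m = tf.reduce_mean(x, [3], keepdims=True)   # per-pixel channel mean, shape [2, 2, 2, 1]
centered = x - m                            # broadcasts back to [2, 2, 2, 3]
with tf.Session() as sess:
    print(sess.run(centered).shape)         # (2, 2, 2, 3)
# with keepdims=False the mean has shape [2, 2, 2], which does not broadcast against [2, 2, 2, 3]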
(2) In the image domain: averaging over the second and third dimensions
Code: the three approaches below produce the same values; note that the keepdims route keeps an extra size-1 dimension in its shape.
import tensorflow as tf
import numpy as np

a = np.arange(24, dtype=np.float32).reshape([2, 2, 2, 3])
a_1 = tf.reduce_mean(a, [1])                     # mean over height         -> [2, 2, 3]
a_1_1 = tf.reduce_mean(a_1, [1])                 # then mean over width     -> [2, 3]
a_12 = tf.reduce_mean(a, [1, 2])                 # height and width at once -> [2, 3]
a_1t = tf.reduce_mean(a, [1], keepdims=True)     # keep the reduced axis    -> [2, 1, 2, 3]
a_1_2 = tf.reduce_mean(a_1t, [2])                # then mean over width     -> [2, 1, 3]
with tf.Session() as sess:
    print("original:")
    print(a)
    print("reduce_mean at axis=1:")
    print(sess.run(a_1))
    print("reduce mean at axis=1 first,then at axis=1:which is equal to [1,2]")
    print(sess.run(a_1_1))
    print("or reduce mean at axis=1 with keepdims=True,then at axis=2")
    print(sess.run(a_1_2))
    print("reduce_mean at axis=1 and 2")
    print(sess.run(a_12))
Results:
original:
[[[[ 0. 1. 2.]
[ 3. 4. 5.]]
[[ 6. 7. 8.]
[ 9. 10. 11.]]]
[[[12. 13. 14.]
[15. 16. 17.]]
[[18. 19. 20.]
[21. 22. 23.]]]]
reduce_mean at axis=1:
[[[ 3. 4. 5.]
[ 6. 7. 8.]]
[[15. 16. 17.]
[18. 19. 20.]]]
reduce mean at axis=1 first,then at axis=1:which is equal to [1,2]
[[ 4.5 5.5 6.5]
[16.5 17.5 18.5]]
or reduce mean at axis=1 with keepdims=True,then at axis=2
[[[ 4.5 5.5 6.5]]
[[16.5 17.5 18.5]]]
reduce_mean at axis=1 and 2
[[ 4.5 5.5 6.5]
[16.5 17.5 18.5]]
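Averaging over axes [1, 2] like this is essentially global average pooling in a CNN: each channel of each image is collapsed to its spatial mean. A small sketch of my own (not from the original code) using the same tensor:
import tensorflow as tf
import numpy as np

a = np.arange(24, dtype=np.float32).reshape([2, 2, 2, 3])       # [batch, height, width, channels]
gap = tf.reduce_mean(a, [1, 2])                      # -> [batch, channels] = [2, 3]
gap_keep = tf.reduce_mean(a, [1, 2], keepdims=True)  # -> [2, 1, 1, 3], handy if later ops expect 4-D
with tf.Session() as sess:
    print(sess.run(gap))                 # same values as a_12 above
    print(sess.run(tf.shape(gap_keep)))  # [2 1 1 3]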