TensorFlow (5): The Tensor broadcast_to Operation
1 Introduction to broadcast_to
1.1 key idea
- Insert new dimensions of size 1 at the front if needed
- Expand each dimension of size 1 to the target size
- In other words: first add dimensions, then expand them to the target shape (see the sketch after this list)
- Broadcasting does not copy any data, yet the computation behaves as if the data had been replicated
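The two steps can be made explicit with tf.broadcast_to. A minimal sketch, with purely illustrative values and target shape:

```python
import tensorflow as tf

x = tf.constant([1., 2., 3.])        # shape [3]

# Conceptually: [3] -> insert a size-1 dim in front -> [1, 3] -> expand to [2, 3]
y = tf.broadcast_to(x, [2, 3])

print(y.shape)    # (2, 3)
print(y.numpy())  # [[1. 2. 3.]
                  #  [1. 2. 3.]]
```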
1.2 How to understand it
- When a tensor is missing an axis, a new axis (of size 1) is conceptually created
- When an axis has size 1, its single entry is conceptually repeated along that axis to match the other tensor (see the example after this list)
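A small sketch of both cases, using arbitrary example shapes:

```python
import tensorflow as tf

a = tf.ones([4, 3])
b = tf.constant([10., 20., 30.])  # shape [3]: has no axis 0 at all
c = tf.ones([4, 1])               # shape [4, 1]: axis 1 has size 1

# b is treated as [1, 3], then repeated along axis 0 -> [4, 3]
print((a + b).shape)  # (4, 3)

# c's size-1 axis is repeated 3 times along axis 1 -> [4, 3]
print((a + c).shape)  # (4, 3)
```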
1.3 why broadcasting
- It matches real demand: e.g. adding a per-channel bias to a whole batch without reshaping it by hand
- It saves memory, because no data is physically copied (see the comparison below)
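To make the memory point concrete, the sketch below compares an implicitly broadcast per-channel bias with an explicit tf.tile copy (the shapes are chosen only for illustration):

```python
import tensorflow as tf

a = tf.random.uniform([4, 28, 28, 32])
bias = tf.ones([32])                        # one value per channel

# Broadcasting: bias stays a 32-element tensor; the op expands it on the fly
out1 = a + bias

# Explicit copy: tf.tile materializes 4*28*28 copies of the bias in memory
tiled = tf.tile(tf.reshape(bias, [1, 1, 1, 32]), [4, 28, 28, 1])
out2 = a + tiled

print(bias.shape, tiled.shape)              # (32,) vs (4, 28, 28, 32)
print(bool(tf.reduce_all(out1 == out2)))    # True: same result, far less memory
```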
1.4 broadcastable
- Match from the last (rightmost) dimension: aligned dims must be equal, or one of them must be 1
- If the current dim has size 1, expand it to the matching size
- If either tensor is missing a dim, insert a dim of size 1 and expand it to the matching size
- Otherwise, the shapes are not broadcastable
For example, [4,32,14,14] and [2,32,14,14] are not broadcastable: the leading dims are 4 vs 2 and neither is 1 (the sketch below checks this in code).
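In eager mode this can be verified directly, since an op with incompatible shapes raises an error. A rough sketch with illustrative shapes:

```python
import tensorflow as tf

ok_a = tf.random.uniform([4, 32, 14, 14])
ok_b = tf.random.uniform([1, 32, 1, 1])      # size-1 dims can be expanded
print((ok_a + ok_b).shape)                   # (4, 32, 14, 14)

bad_a = tf.random.uniform([4, 32, 14, 14])
bad_b = tf.random.uniform([2, 32, 14, 14])   # 4 vs 2, neither is 1
try:
    bad_a + bad_b
except tf.errors.InvalidArgumentError:
    print("not broadcastable")
```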
2 Implicit broadcasting
Ordinary arithmetic operators such as + broadcast automatically, without an explicit broadcast_to call:
"""
a = tf.random.uniform([4,28,28,3])
b = tf.constant(5.)
print("a+b:",(a+<