TensorFlow 1.2 function prototype
https://www.tensorflow.org/versions/r1.2/api_docs/python/tf/one_hot
tf.one_hot
one_hot(
    indices,
    depth,          # scalar; the size of each one-hot dimension
    on_value=None,  # defaults to 1
    off_value=None, # defaults to 0
    axis=None,      # defaults to -1
    dtype=None,     # defaults to tf.float32 when on_value and off_value are unspecified
    name=None
)

One-hot encoding can be used when there is no ordinal relationship among the data.
The positions named by indices are filled with on_value; every other position is filled with off_value.
Output dimensions:
1. When indices is a scalar, the output is a vector of length depth.
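This substitution rule can be sketched in plain NumPy (an illustrative stand-in for tf.one_hot, not TensorFlow's actual implementation; the helper name one_hot_np is made up here):

```python
import numpy as np

def one_hot_np(indices, depth, on_value=1.0, off_value=0.0, axis=-1):
    """Sketch of the tf.one_hot rule: positions named by `indices`
    get on_value, every other position gets off_value."""
    indices = np.asarray(indices)
    # Build with the depth dimension last: shape indices.shape + (depth,)
    out = np.full(indices.shape + (depth,), off_value, dtype=np.float32)
    # Write on_value at each position selected by indices.
    np.put_along_axis(out, indices[..., None], on_value, axis=-1)
    if axis not in (-1, None):
        # Move the depth dimension to the requested axis.
        out = np.moveaxis(out, -1, axis)
    return out

print(one_hot_np(1, 6))                          # [0. 1. 0. 0. 0. 0.]
print(one_hot_np(1, 3, on_value=5., off_value=-1.))  # [-1. 5. -1.]
```

Note that on_value and off_value need not be 0 and 1; tf.one_hot likewise lets you choose both fill values.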
tf.reset_default_graph()
indices = 1
depth = 6
one_hot = tf.one_hot(indices=indices,
                     depth=depth,
                     axis=None)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(one_hot))

The result is:
[ 0.  1.  0.  0.  0.  0.]  # a single one-hot vector

2. When indices is a vector of length features, the output shape is:
[features, depth] if axis == -1
[depth, features] if axis == 0

Here, axis determines where the depth dimension is placed.
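A quick sanity check of these two shapes (a NumPy sketch standing in for tf.one_hot; indexing rows of np.eye builds the one-hot vectors, and a transpose plays the role of axis == 0):

```python
import numpy as np

indices = np.array([1, 4, 3])
depth = 6

# axis == -1: one row of the identity matrix per index -> [features, depth]
axis_last = np.eye(depth, dtype=np.float32)[indices]
print(axis_last.shape)   # (3, 6)

# axis == 0: transposing moves depth to the front -> [depth, features]
axis_first = axis_last.T
print(axis_first.shape)  # (6, 3)
```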
2.1 axis = -1 (the default)
indices = [1, 4, 3]
depth = 6
axis = -1

The result has shape [features, depth]; each on_value lies along a row:
[[ 0.  1.  0.  0.  0.  0.]
 [ 0.  0.  0.  0.  1.  0.]
 [ 0.  0.  0.  1.  0.  0.]]

2.2 axis = 0
indices = [1, 4, 3]
depth = 6
axis = 0

The result has shape [depth, features]; each on_value lies along a column:
[[ 0.  0.  0.]
 [ 1.  0.  0.]
 [ 0.  0.  0.]
 [ 0.  0.  1.]
 [ 0.  1.  0.]
 [ 0.  0.  0.]]

3. When indices is a matrix of shape [batch, features], the output shape is:
batch x features x depth if axis == -1
batch x depth x features if axis == 1
depth x batch x features if axis == 0

For example:
indices = [[1, 2],
           [3, 4]]  # shape [2, 2]
depth = 6

3.1 axis = -1
The result has shape [2, 2, 6]:
[[[ 0.  1.  0.  0.  0.  0.]
  [ 0.  0.  1.  0.  0.  0.]]
 [[ 0.  0.  0.  1.  0.  0.]
  [ 0.  0.  0.  0.  1.  0.]]]

3.2 axis = 1
The result has shape [2, 6, 2]:
[[[ 0.  0.]
  [ 1.  0.]
  [ 0.  1.]
  [ 0.  0.]
  [ 0.  0.]
  [ 0.  0.]]
 [[ 0.  0.]
  [ 0.  0.]
  [ 0.  0.]
  [ 1.  0.]
  [ 0.  1.]
  [ 0.  0.]]]

3.3 axis = 0
The result has shape [6, 2, 2]:
[[[ 0.  0.]
  [ 0.  0.]]
 [[ 1.  0.]
  [ 0.  0.]]
 [[ 0.  1.]
  [ 0.  0.]]
 [[ 0.  0.]
  [ 1.  0.]]
 [[ 0.  0.]
  [ 0.  1.]]
 [[ 0.  0.]
  [ 0.  0.]]]

Summary:
With axis = -1 (or None), the output shape is [indices.shape, depth]; in general, axis determines where the depth dimension is inserted.
Each element of indices selects the position of a one-hot vector, and the vectors are then arranged according to axis.
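The summary can be checked end to end with a NumPy sketch (NumPy here stands in for tf.one_hot: np.eye row-indexing builds the one-hot vectors with depth last, and np.moveaxis plays the role of the axis argument):

```python
import numpy as np

indices = np.array([[1, 2],
                    [3, 4]])
depth = 6

# Default layout: depth last -> [batch, features, depth] = indices.shape + (depth,)
one_hot = np.eye(depth, dtype=np.float32)[indices]
print(one_hot.shape)                      # (2, 2, 6)

# axis == 1 moves depth to the middle -> [batch, depth, features]
print(np.moveaxis(one_hot, -1, 1).shape)  # (2, 6, 2)

# axis == 0 moves depth to the front -> [depth, batch, features]
print(np.moveaxis(one_hot, -1, 0).shape)  # (6, 2, 2)
```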