sequence_length: (optional) An int32/int64 vector, size `[batch_size]`,
containing the actual lengths for each of the sequences in the batch. If
not provided, all batch entries are assumed to be full sequences; and time
reversal is applied from time `0` to `max_time` for each sequence.
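The time-reversal behaviour described above (as performed by `tf.reverse_sequence` inside the backward pass of a bidirectional RNN) can be sketched in plain NumPy; the batch values and lengths below are made up for illustration:

```python
import numpy as np

def reverse_sequence(batch, seq_lengths):
    # Mimics tf.reverse_sequence along the time axis: only the first
    # seq_lengths[i] steps of sample i are reversed; padding stays in place.
    out = batch.copy()
    for i, n in enumerate(seq_lengths):
        out[i, :n] = batch[i, :n][::-1]
    return out

batch = np.array([[1, 2, 3, 0, 0],
                  [4, 5, 6, 7, 8]])
print(reverse_sequence(batch, [3, 5]))
# [[3 2 1 0 0]
#  [8 7 6 5 4]]
```

Note that the trailing zeros of the first sample are untouched: without `sequence_length`, the whole row from time `0` to `max_time` would be reversed, moving the padding to the front.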
Intuitively, `len(sequence_length) == batch_size`, and the vector looks like `[30, 30, 30, 30, …]`: one entry per sample, giving the number of valid time steps in that sample. That is what `sequence_length` means.
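For example, when zero is the padding value, this vector can be derived directly from a padded batch; the toy batch below is an assumption for illustration:

```python
import numpy as np

# Hypothetical zero-padded batch: batch_size=2, max_time=5
batch = np.array([
    [1, 2, 3, 0, 0],   # real length 3
    [4, 5, 6, 7, 8],   # real length 5
])

# One length per sample -> len(sequence_length) == batch_size
sequence_length = (batch != 0).sum(axis=1)
print(sequence_length)  # [3 5]
```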
Important caveat: in the inference function,
`sentence_length_stacked = [max_sentence_length]*batch_size*max_document_length`
returned a scalar rather than the expected list.
Dynamically obtaining the length of each sequence:

def length(sequences):
    # 1 where any feature at a time step is non-zero, 0 where the step is all-zero padding
    used = tf.sign(tf.reduce_max(tf.abs(sequences), axis=2))
    # count the non-padding steps along the time axis
    seq_len = tf.reduce_sum(used, axis=1)
    return tf.cast(seq_len, tf.int32)
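The same sign/reduce_max/reduce_sum logic can be checked without a TensorFlow session using NumPy; the padding positions below are chosen arbitrarily for the check:

```python
import numpy as np

x = np.random.randn(3, 10, 8)
x[0, 7:, :] = 0.0  # zero-pad sample 0 after step 7
x[1, 4:, :] = 0.0  # zero-pad sample 1 after step 4

# Same idea as the TF graph: a step counts if any feature is non-zero
used = np.sign(np.abs(x).max(axis=2))
seq_len = used.sum(axis=1).astype(np.int32)
print(seq_len)  # [ 7  4 10]
```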
import tensorflow as tf
import numpy as np

tf.reset_default_graph()
t = np.random.randn(3, 10, 8)
X = tf.placeholder(shape=[None, None, None], dtype=tf.float32)
used = tf.sign(tf.reduce_max(tf.abs(X), axis=2))
seq_len = tf.reduce_sum(used, axis=1)
test = tf.cast(seq_len, tf.int32)
with tf.Session() as sess:
    print(sess.run(used, feed_dict={X: t}))
    print(sess.run(seq_len, feed_dict={X: t}))
    print(sess.run(test, feed_dict={X: t}))
Output:

[[1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
 [1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]
 [1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]]
[10. 10. 10.]
[10 10 10]