TensorFlow function: tf.nn.dynamic_rnn

tf.nn.dynamic_rnn()

It unrolls the given RNN cell along the time dimension,
i.e. it implements the structure shown in the figure below:
[Figure: the RNN cell unrolled over time steps]
Let's look at the code:

def dynamic_rnn(cell, 
                inputs, 
                sequence_length=None, 
                initial_state=None,
                dtype=None, 
                parallel_iterations=None, 
                swap_memory=False,
                time_major=False, scope=None):
# Returns:
# A pair (outputs, state)
# outputs: the RNN output tensor; with the default time_major=False it is shaped [batch_size, max_time, cell.output_size]
# state: the final state; note that if you use `LSTMCells`, state is a tuple.

Purpose: creates a recurrent neural network specified by the given RNNCell.
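
Two arguments deserve a quick note before the examples: with the default time_major=False the inputs tensor is expected to be [batch_size, max_time, ...], and the optional sequence_length holds the true length of each (padded) sequence, so time steps beyond that length emit zero outputs and simply copy the state through. Below is a minimal sketch under assumed sizes; batch_size, max_time, input_size and hidden_size are hypothetical values chosen only for illustration:

import tensorflow as tf

batch_size, max_time, input_size, hidden_size = 3, 6, 5, 8  # hypothetical sizes
inputs = tf.placeholder(tf.float32, [batch_size, max_time, input_size])
seq_len = tf.placeholder(tf.int32, [batch_size])  # e.g. actual lengths [6, 4, 2]

cell = tf.nn.rnn_cell.BasicRNNCell(hidden_size)
outputs, state = tf.nn.dynamic_rnn(cell, inputs,
                                   sequence_length=seq_len,
                                   dtype=tf.float32)
# outputs[i, t, :] is all zeros for t >= seq_len[i];
# state[i] is the cell state after step seq_len[i] - 1.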

Example 1

# create a BasicRNNCell
rnn_cell = tf.nn.rnn_cell.BasicRNNCell(hidden_size)

# 'outputs' is a tensor of shape [batch_size, max_time, cell_state_size]

# define the initial state
initial_state = rnn_cell.zero_state(batch_size, dtype=tf.float32)

# 'state' is a tensor of shape [batch_size, cell_state_size]
outputs, state = tf.nn.dynamic_rnn(rnn_cell, input_data,
                                   initial_state=initial_state,
                                   dtype=tf.float32)
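
To make Example 1 runnable end to end, here is a sketch that feeds random data through the graph and prints the resulting shapes; the sizes and the placeholder input_data are hypothetical and only serve to confirm the shapes described in the comments above:

import tensorflow as tf
import numpy as np

batch_size, max_time, input_size, hidden_size = 2, 5, 3, 4  # hypothetical sizes
input_data = tf.placeholder(tf.float32, [batch_size, max_time, input_size])

rnn_cell = tf.nn.rnn_cell.BasicRNNCell(hidden_size)
initial_state = rnn_cell.zero_state(batch_size, dtype=tf.float32)
outputs, state = tf.nn.dynamic_rnn(rnn_cell, input_data,
                                   initial_state=initial_state,
                                   dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    feed = np.random.randn(batch_size, max_time, input_size).astype(np.float32)
    o, s = sess.run([outputs, state], feed_dict={input_data: feed})
    print(o.shape)  # (2, 5, 4) -> [batch_size, max_time, cell_state_size]
    print(s.shape)  # (2, 4)    -> [batch_size, cell_state_size]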

Example 2:

# create two LSTMCells
rnn_layers = [tf.nn.rnn_cell.LSTMCell(size) for size in [128, 256]]

# create a MultiRNNCell from the list rnn_layers defined above
multi_rnn_cell = tf.nn.rnn_cell.MultiRNNCell(rnn_layers)

# 'outputs' is a tensor of shape [batch_size, max_time, 256]
# 'state' is an N-tuple, where N is the number of LSTMCells, containing a tf.contrib.rnn.LSTMStateTuple for each cell
outputs, state = tf.nn.dynamic_rnn(cell=multi_rnn_cell,
                                   inputs=data,
                                   dtype=tf.float32)
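
Finally, a sketch of how the state returned by the MultiRNNCell example can be unpacked; the sizes and the placeholder named data are hypothetical. state is a tuple with one LSTMStateTuple per layer, each carrying a cell state .c and a hidden state .h:

import tensorflow as tf

batch_size, max_time, input_size = 2, 7, 32  # hypothetical sizes
data = tf.placeholder(tf.float32, [batch_size, max_time, input_size])

rnn_layers = [tf.nn.rnn_cell.LSTMCell(size) for size in [128, 256]]
multi_rnn_cell = tf.nn.rnn_cell.MultiRNNCell(rnn_layers)
outputs, state = tf.nn.dynamic_rnn(cell=multi_rnn_cell, inputs=data, dtype=tf.float32)

first_layer_state = state[0]  # LSTMStateTuple: .c and .h are [batch_size, 128]
top_layer_state = state[1]    # LSTMStateTuple: .c and .h are [batch_size, 256]
sequence_summary = top_layer_state.h  # often used as a fixed-size summary of the sequence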
