PyTorch torch.nn.TransformerEncoderLayer

API

CLASS torch.nn.TransformerEncoderLayer(d_model, nhead, dim_feedforward=2048, dropout=0.1, activation='relu')

TransformerEncoderLayer is made up of a multi-head self-attention sublayer followed by a feedforward network, each wrapped with a residual connection and layer normalization.

Parameter	Description
d_model the number of expected features in the input (required).
nhead the number of heads in the multiheadattention models (required).
dim_feedforward the dimension of the feedforward network model (default=2048).
dropout the dropout value (default=0.1).
activation	the activation function of the intermediate layer, relu or gelu (default=relu).
Example

import torch
import torch.nn as nn

encoder_layer = nn.TransformerEncoderLayer(d_model=512, nhead=8)
src = torch.rand(10, 32, 512)  # (seq_len, batch, d_model)
out = encoder_layer(src)       # output keeps the input shape: (10, 32, 512)
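In practice a single layer is usually stacked into a full encoder with nn.TransformerEncoder, and padded positions are hidden from attention via src_key_padding_mask. The sketch below shows this; the layer hyperparameters (dim_feedforward=1024, activation='gelu', num_layers=6) and the padding pattern are illustrative choices, not values from the original example.

```python
import torch
import torch.nn as nn

# One layer configured with the non-default options documented above,
# then stacked 6 times into a full encoder (num_layers is illustrative).
encoder_layer = nn.TransformerEncoderLayer(
    d_model=512, nhead=8, dim_feedforward=1024, dropout=0.1, activation='gelu'
)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=6)

src = torch.rand(10, 32, 512)  # (seq_len, batch, d_model)

# src_key_padding_mask is (batch, seq_len); True marks positions to ignore.
# Here we pretend the last two time steps of every sequence are padding.
padding_mask = torch.zeros(32, 10, dtype=torch.bool)
padding_mask[:, -2:] = True

out = encoder(src, src_key_padding_mask=padding_mask)
print(out.shape)  # the encoder preserves the input shape
```

Masked positions still produce output vectors; the mask only prevents other positions from attending to them, so downstream code should ignore those slots itself.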

Reference:
https://pytorch.org/docs/master/generated/torch.nn.TransformerEncoderLayer.html#torch.nn.TransformerEncoderLayer
