Self-Attention
Self-attention code (PyTorch). The original snippet was cut off mid-definition; it matches the widely used SAGAN-style self-attention layer (1×1 convolution projections for query/key/value plus a learned residual weight gamma), so the truncated part is completed in that form below:

```python
import torch
from torch import nn


class SelfAttention(nn.Module):
    """Self attention module"""

    def __init__(self, in_dim):
        super(SelfAttention, self).__init__()
        self.chanel_in = in_dim
        # 1x1 convolutions project the feature map into query/key/value spaces
        self.query = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.key = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.value = nn.Conv2d(in_dim, in_dim, kernel_size=1)
        # Learned scale for the attention branch, initialized to 0 so the
        # module starts as an identity mapping
        self.gamma = nn.Parameter(torch.zeros(1))
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, x):
        B, C, H, W = x.size()
        N = H * W
        proj_query = self.query(x).view(B, -1, N).permute(0, 2, 1)  # B x N x C'
        proj_key = self.key(x).view(B, -1, N)                       # B x C' x N
        energy = torch.bmm(proj_query, proj_key)                    # B x N x N
        attention = self.softmax(energy)                            # rows sum to 1
        proj_value = self.value(x).view(B, -1, N)                   # B x C x N
        out = torch.bmm(proj_value, attention.permute(0, 2, 1))     # B x C x N
        out = out.view(B, C, H, W)
        # Residual connection weighted by the learned gamma
        return self.gamma * out + x
```
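For reference, the core self-attention computation can also be sketched independently of any module class. This is a minimal, hedged illustration (not the blog's original code) of scaled dot-product self-attention, where the input sequence serves as query, key, and value and scores are scaled by the square root of the feature dimension:

```python
import torch


def scaled_dot_product_self_attention(x):
    """Minimal self-attention over x of shape (batch, seq_len, d)."""
    d = x.size(-1)
    # Pairwise similarity between positions, scaled by sqrt(d)
    scores = torch.bmm(x, x.transpose(1, 2)) / d ** 0.5   # (batch, n, n)
    # Each row becomes a probability distribution over positions
    weights = torch.softmax(scores, dim=-1)
    # Output is an attention-weighted mixture of the input positions
    return torch.bmm(weights, x)                          # (batch, n, d)


out = scaled_dot_product_self_attention(torch.randn(2, 5, 16))
print(out.shape)  # torch.Size([2, 5, 16])
```

The SAGAN-style module above does the same thing on 2D feature maps, with learned 1×1 convolutions standing in for the identity projections used in this sketch.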
2022-01-30 21:00:02