Advanced NLP: BERT + BiLSTM Sentiment Analysis in Practice

The core of the model is a custom nn.Module that embeds the pretrained BERT encoder and stacks a (bidirectional) LSTM plus a linear classifier on top of it:

import torch
import torch.nn as nn
from transformers import BertModel

class bert_lstm(nn.Module):

    def __init__(self, bertpath, hidden_dim, output_size, n_layers, bidirectional=True, drop_prob=0.5):
        super(bert_lstm, self).__init__()

        self.output_size = output_size
        self.n_layers = n_layers
        self.hidden_dim = hidden_dim
        self.bidirectional = bidirectional

        # BERT ---------------- key point: the BERT model is embedded inside the custom model
        self.bert = BertModel.from_pretrained(bertpath)
        for param in self.bert.parameters():
            param.requires_grad = True  # fine-tune BERT together with the rest of the model

        # LSTM layers (input size 768 = BERT hidden size)
        self.lstm = nn.LSTM(768, hidden_dim, n_layers, batch_first=True, bidirectional=bidirectional)

        # dropout layer
        self.dropout = nn.Dropout(drop_prob)

        # linear and sigmoid layers
        if bidirectional:
            self.fc = nn.Linear(hidden_dim * 2, output_size)
        else:
            self.fc = nn.Linear(hidden_dim, output_size)

        # self.sig = nn.Sigmoid()

    def forward(self, x, hidden):
        batch_size = x.size(0)

        # generate BERT token embeddings
        x = self.bert(x)[0]  # BERT last hidden state, shape (batch_size, seq_len, 768)

        # lstm_out
        # x = x.float()
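The forward pass above stops right after the BERT embedding step. A minimal sketch of how it could continue is shown below, assuming the usual pattern for this architecture: run the embeddings through the LSTM, take the final hidden state (concatenating both directions when bidirectional), then apply dropout and the linear layer. The hidden-state handling and the init_hidden helper are illustrative assumptions, not necessarily the original code.

        # --- sketch continuation (assumption), completing the forward pass ---
        lstm_out, (hidden_last, cn_last) = self.lstm(x, hidden)

        if self.bidirectional:
            # last layer: forward direction sits at index -2, backward direction at index -1
            hidden_last_out = torch.cat([hidden_last[-2], hidden_last[-1]], dim=-1)
        else:
            hidden_last_out = hidden_last[-1]

        # dropout + fully connected layer -> class scores of shape (batch_size, output_size)
        out = self.dropout(hidden_last_out)
        out = self.fc(out)
        return out

    def init_hidden(self, batch_size, device="cpu"):
        # zero-initialized (h_0, c_0); first dimension is num_layers * num_directions
        num_directions = 2 if self.bidirectional else 1
        h0 = torch.zeros(self.n_layers * num_directions, batch_size, self.hidden_dim, device=device)
        c0 = torch.zeros(self.n_layers * num_directions, batch_size, self.hidden_dim, device=device)
        return (h0, c0)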

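To sanity-check the shapes end to end, here is a small usage sketch; the checkpoint path, class count, and hyperparameters are illustrative assumptions, and init_hidden refers to the helper sketched above.

import torch
from transformers import BertTokenizer

# illustrative assumptions: a local Chinese BERT checkpoint and a 2-class sentiment task
bertpath = "./bert-base-chinese"
tokenizer = BertTokenizer.from_pretrained(bertpath)
model = bert_lstm(bertpath, hidden_dim=384, output_size=2, n_layers=2, bidirectional=True)

texts = ["这部电影真好看", "剧情太差了"]  # "This movie is great" / "The plot is terrible"
encoded = tokenizer(texts, padding=True, truncation=True, max_length=64, return_tensors="pt")

# note: as in forward() above, only input_ids are fed to BERT (no attention mask)
hidden = model.init_hidden(batch_size=len(texts))
logits = model(encoded["input_ids"], hidden)
print(logits.shape)  # torch.Size([2, 2]) -> (batch_size, output_size)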