Notes on a Pitfall When Using SimAlign

I have recently been working on unsupervised word alignment and used the open-source tool SimAlign.
SimAlign performs word alignment without any supervision; the method itself is described in detail in its paper.

After setting up the environment recommended on the project page (Python 3.7, Transformers 3.1.0, Torch 1.5.0, Networkx 2.4), install SimAlign:

pip install --upgrade git+https://github.com/cisnlp/simalign.git#egg=simalign
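For reference, one way to reproduce that environment is with a fresh conda env (the env name simalign matches the one visible in the traceback below; the package pins simply follow the versions listed above, so treat this as a sketch rather than the official setup):

conda create -n simalign python=3.7
conda activate simalign
pip install torch==1.5.0 transformers==3.1.0 networkx==2.4
pip install --upgrade git+https://github.com/cisnlp/simalign.git#egg=simalign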

Then run the example code from the README:

from simalign import SentenceAligner

# making an instance of our model.
# You can specify the embedding model and all alignment settings in the constructor.
myaligner = SentenceAligner(model="bert", token_type="bpe", matching_methods="mai")

# The source and target sentences should be tokenized to words.
src_sentence = ["This", "is", "a", "test", "."]
trg_sentence = ["Das", "ist", "ein", "Test", "."]

# The output is a dictionary with different matching methods.
# Each method has a list of pairs indicating the indexes of aligned words (The alignments are zero-indexed).
alignments = myaligner.get_word_aligns(src_sentence, trg_sentence)

for matching_method in alignments:
    print(matching_method, ":", alignments[matching_method])

# Expected output:
# mwmf (Match): [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
# inter (ArgMax): [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
# itermax (IterMax): [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
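
The returned index pairs can be mapped back to word pairs for easier inspection. A minimal sketch (the helper name word_pairs is just for illustration, not part of SimAlign's API):

# Hypothetical helper: turn (src_idx, trg_idx) index pairs into (src_word, trg_word) pairs.
def word_pairs(pairs, src, trg):
    return [(src[i], trg[j]) for i, j in pairs]

print(word_pairs(alignments["itermax"], src_sentence, trg_sentence))
# With the expected output above, this would print:
# [('This', 'Das'), ('is', 'ist'), ('a', 'ein'), ('test', 'Test'), ('.', '.')]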

This should produce the output shown above, but instead it raised an error:

Traceback (most recent call last):
  File "align.py", line 13, in <module>
    alignments = myaligner.get_word_aligns(src_sentence, trg_sentence)
  File "/home/users/miniconda3/envs/simalign/lib/python3.7/site-packages/simalign/simalign.py", line 209, in get_word_aligns
    vectors = self.embed_loader.get_embed_list([src_sent, trg_sent]).cpu().detach().numpy()
  File "/home/users/miniconda3/envs/simalign/lib/python3.7/site-packages/simalign/simalign.py", line 62, in get_embed_list
    inputs = self.tokenizer(sent_batch, is_split_into_words=True, padding=True, truncation=True, return_tensors="pt")
  File "/home/users/miniconda3/envs/simalign/lib/python3.7/site-packages/transformers/tokenization_utils_base.py", line 1954, in __call__
    **kwargs,
  File "/home/users/miniconda3/envs/simalign/lib/python3.7/site-packages/transformers/tokenization_utils_base.py", line 2139, in batch_encode_plus
    **kwargs,
  File "/home/users/miniconda3/envs/simalign/lib/python3.7/site-packages/transformers/tokenization_utils.py", line 531, in _batch_encode_plus
    ids, pair_ids = ids_or_pair_ids
ValueError: too many values to unpack (expected 2)

The problem comes from the transformers version: judging from the last frame of the traceback, the pinned Transformers 3.1.0 tokenizer does not handle the pre-tokenized input the way simalign.py passes it, and tries to unpack each token list as a (text, text_pair) tuple. Uninstalling transformers and installing the latest version (I installed 4.16.2) solved it:

pip uninstall transformers
pip install transformers
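To confirm which version actually ended up in the environment, or to pin the version that worked for me explicitly (useful for reproducibility), something along these lines does the job:

python -c "import transformers; print(transformers.__version__)"
# or pin the known-good version directly:
pip install transformers==4.16.2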

With the newer transformers installed, the script runs and prints the three alignment results shown in the expected output above.
