Traceback (most recent call last):
  File "/home/yaotang.cq/CogVLM2/finetune_demo/peft_infer.py", line 118, in <module>
    outputs = model.generate(**inputs, **gen_kwargs)
  File "/home/yaotang.cq/.conda/envs/py10vlm/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/yaotang.cq/.conda/envs/py10vlm/lib/python3.10/site-packages/transformers/generation/utils.py", line 2024, in generate
    result = self._sample(
  File "/home/yaotang.cq/.conda/envs/py10vlm/lib/python3.10/site-packages/transformers/generation/utils.py", line 3032, in _sample
    model_kwargs = self._update_model_kwargs_for_generation(
  File "/home/yaotang.cq/.cache/huggingface/modules/transformers_modules/cogvlm2-llama3-chinese-chat-19B/modeling_cogvlm.py", line 739, in _update_model_kwargs_for_generation
    model_kwargs["past_key_values"] = self._extract_past_from_model_output(
TypeError: GenerationMixin._extract_past_from_model_output() got an unexpected keyword argument 'standardize_cache_format'
Fix:
This is a transformers version mismatch. CogVLM2's bundled modeling_cogvlm.py (loaded via trust_remote_code) calls GenerationMixin._extract_past_from_model_output() with the standardize_cache_format keyword, which newer transformers releases (4.45 and later) no longer accept. Pin transformers to 4.44.2, whose generation API still matches the model's remote code:
pip install transformers==4.44.2 -i https://mirrors.aliyun.com/pypi/simple/
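
After reinstalling, restart the Python process and confirm the pinned version is the one actually being imported before retrying generation (a minimal sanity check; the expected version string simply mirrors the pip command above):

import transformers

# 4.44.x still accepts standardize_cache_format, so CogVLM2's
# remote _update_model_kwargs_for_generation works again.
print(transformers.__version__)
assert transformers.__version__ == "4.44.2"

With the version confirmed, re-running outputs = model.generate(**inputs, **gen_kwargs) from peft_infer.py should no longer raise the TypeError.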