The vllm version pinned in requirements.txt under the basic_demo directory is incorrect #626
-
The latest vllm release is 0.6.3.post, but the installed version must not exceed 0.6.2; otherwise, running openai_api_server.py raises `TypeError: Unexpected keyword argument 'use_beam_search'` when answering questions. I am using version 0.6.2 and have had no problems so far.
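As a workaround consistent with the report above, the requirements file can pin vllm to the last known-good version (assuming no other dependency forces a newer release):

```
vllm==0.6.2
```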
Answered by
sixsixcoder
Nov 1, 2024
Replies: 2 comments
-
The latest version of vllm removed this field. Deleting the `use_beam_search` field where the model file is invoked in openai_api_server.py resolves the error. We will fix this bug as soon as possible.
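A minimal sketch of the fix described above: drop the removed `use_beam_search` argument before building sampling parameters when the installed vllm is 0.6.3 or newer. The function and kwarg-dict shape here are illustrative, not the actual openai_api_server.py code; a real fix could also simply delete the argument unconditionally.

```python
def parse_version(v: str) -> tuple:
    """Crude version parse: keep leading numeric components only
    ("0.6.3.post1" -> (0, 6, 3))."""
    parts = []
    for p in v.split("."):
        if p.isdigit():
            parts.append(int(p))
        else:
            break
    return tuple(parts)


def adapt_sampling_kwargs(kwargs: dict, vllm_version: str) -> dict:
    """Return kwargs safe to pass to vllm's SamplingParams for this version.

    vllm >= 0.6.3 removed the 'use_beam_search' keyword, so strip it
    before constructing SamplingParams on newer releases.
    """
    kwargs = dict(kwargs)  # avoid mutating the caller's dict
    if parse_version(vllm_version) >= (0, 6, 3):
        kwargs.pop("use_beam_search", None)
    return kwargs
```

With this guard, the same request-building code runs on both 0.6.2 (where the field is still accepted) and newer releases (where passing it raises `TypeError`).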
Answer selected by
zRzRzRzRzRzRzR
-
This issue has now been fixed; please check the latest code.