When streaming output is enabled, a tool call returned by the LLM arrives with empty arguments
```csharp
static async Task Main(string[] args)
{
    var chatModel = new OpenAiLatestFastChatModel(new OpenAiProvider(openaiApiKey, openAiBaseUrl));
    var calls = new Dictionary<string, Func<string, CancellationToken, Task<string>>>
    {
        { "query_document_tool", QueryDocument },
        { "query_document_tool_pro", QueryDocument }
    };
    var tools = new List<Tool>();
    var tool = new Tool
    {
        // "When the user's question cannot be answered from the given context,
        //  call this tool to query the documentation; usually when one question
        //  cannot be answered, call both tools."
        Name = "query_document_tool",
        Description = "当从给定的上下文中,无法回答用户的问题的时候,调用该工具来查询文档,通常一个问题如何回答不了就两个工具都调用",
        Parameters = new OpenApiSchema
        {
            Type = "object",
            Properties = new Dictionary<string, OpenApiSchema>
            {
                ["query_content"] = new OpenApiSchema
                {
                    Type = "string",
                    Description = "需要查询的内容", // "the content to look up"
                }
            },
            Required = ["query_content"],
        },
    };
    tools.Add(tool);
    ChatRequest chatRequest = new ChatRequest()
    {
        Messages = new List<Message>()
        {
            // "You are a knowledge-base Q&A assistant; you may only answer using
            //  information obtained through tools and must not make things up."
            "你是一个知识库问答助手,你只能通过使用工具获取到的信息进行问题的回答,不能编造内容".AsSystemMessage(),
            // "Can I convert HTML to PDF through the pdf server?"
            "我可以通过pdf服务器将html转为pdf吗?".AsHumanMessage()
        }
    };
    chatModel.AddGlobalTools(tools, calls);
    await foreach (var item in chatModel.GenerateAsync(chatRequest, new ChatSettings() { UseStreaming = true }))
    {
    }
}

private static async Task<string> QueryDocument(string arg1, CancellationToken token)
{
    return "没有找到相关资料"; // "No relevant material found"
}
```

With the code above, the `QueryDocument` method receives an empty string as `arg1`; with `UseStreaming = false` everything works correctly. I stepped through the code: the streaming-output path in OpenAiChatModel.cs does not handle the tool-call content at all, which causes this.
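For context on why only streaming is affected: an OpenAI-compatible server streams a tool call in pieces. The first delta carries the call id and function name with empty arguments, and the argument JSON then arrives fragment by fragment in later deltas, so the client has to concatenate them. A non-streaming response delivers the complete arguments in one object. Illustrative chunks (ids abbreviated, unrelated fields omitted):

```json
{"choices":[{"delta":{"tool_calls":[{"index":0,"id":"call_abc","type":"function",
  "function":{"name":"query_document_tool","arguments":""}}]}}]}
{"choices":[{"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\"query"}}]}}]}
{"choices":[{"delta":{"tool_calls":[{"index":0,"function":{"arguments":"_content\": \"html 转 pdf\"}"}}]}}]}
{"choices":[{"delta":{},"finish_reason":"tool_calls"}]}
```

If only the first delta is inspected, the arguments are the empty string, which matches the symptom above.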
After changing line 206 of OpenAiChatModel.cs as follows, it works correctly, though I'm not sure whether this matches your coding conventions:
```csharp
if (usedSettings.UseStreaming == true)
{
    var enumerable = provider.Api.Chat.CreateChatCompletionAsStreamAsync(
        chatRequest,
        cancellationToken).ConfigureAwait(false);

    var stringBuilder = new StringBuilder(capacity: 1024);
    await foreach (CreateChatCompletionStreamResponse streamResponse in enumerable)
    {
        var choice = streamResponse.Choices.ElementAtOrDefault(0);
        var streamDelta = choice?.Delta;
        var delta = new ChatResponseDelta
        {
            Content = streamDelta?.Content ?? string.Empty,
        };
        if (streamDelta?.ToolCalls != null)
        {
            if (toolCalls == null)
            {
                // First chunk: carries the tool-call id and function name.
                toolCalls = streamDelta.ToolCalls?.Select(x => new ChatToolCall
                {
                    Id = x.Id ?? string.Empty,
                    ToolName = x.Function?.Name ?? string.Empty,
                    ToolArguments = x.Function?.Arguments ?? string.Empty,
                }).ToList();
            }
            else
            {
                // Subsequent chunks: append the streamed argument fragments.
                toolCalls[0].ToolArguments += streamDelta?.ToolCalls[0]?.Function?.Arguments;
            }
        }
        usage ??= GetUsage(streamResponse);
        finishReason ??= choice?.FinishReason switch
        {
            CreateChatCompletionStreamResponseChoiceFinishReason.Length => ChatResponseFinishReason.Length,
            CreateChatCompletionStreamResponseChoiceFinishReason.Stop => ChatResponseFinishReason.Stop,
            CreateChatCompletionStreamResponseChoiceFinishReason.ContentFilter => ChatResponseFinishReason.ContentFilter,
            CreateChatCompletionStreamResponseChoiceFinishReason.FunctionCall => ChatResponseFinishReason.ToolCalls,
            CreateChatCompletionStreamResponseChoiceFinishReason.ToolCalls => ChatResponseFinishReason.ToolCalls,
            _ => null,
        };
        OnDeltaReceived(delta);
        stringBuilder.Append(delta.Content);
        yield return new ChatResponse
        {
            Messages = messages,
            UsedSettings = usedSettings,
            Delta = delta,
            Usage = Usage.Empty,
        };
    }
    OnDeltaReceived(new ChatResponseDelta
    {
        Content = Environment.NewLine,
    });
    stringBuilder.Append(Environment.NewLine);

    Message newMessage;
    if (toolCalls != null)
    {
        newMessage = toolCalls[0].ToolArguments.AsToolCallMessage(toolCalls[0].ToolName + ":" + toolCalls[0].Id);
    }
    else
    {
        newMessage = stringBuilder.ToString().AsAiMessage();
    }
    messages.Add(newMessage);
}
```
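One caveat with the patch above: it appends every argument fragment to `toolCalls[0]`, so it only handles a single tool call per response. OpenAI-compatible streams tag each tool-call delta with an `index` so that parallel tool calls can be reassembled independently. A sketch of a per-index accumulator, assuming an `Index` property on the delta tool call and a default-constructible `ChatToolCall` (both hypothetical here, mirroring the types used above):

```csharp
// Sketch only, not a tested implementation: accumulate streamed tool-call
// deltas per index so that parallel tool calls are reassembled correctly.
if (streamDelta?.ToolCalls != null)
{
    toolCalls ??= new List<ChatToolCall>();
    foreach (var call in streamDelta.ToolCalls)
    {
        // Assumption: the delta exposes the tool call's index; default to 0.
        var index = call.Index ?? 0;
        while (toolCalls.Count <= index)
        {
            toolCalls.Add(new ChatToolCall());
        }
        // Id and name arrive once, in the first chunk for this index.
        if (!string.IsNullOrEmpty(call.Id))
        {
            toolCalls[index].Id = call.Id;
        }
        if (!string.IsNullOrEmpty(call.Function?.Name))
        {
            toolCalls[index].ToolName = call.Function.Name;
        }
        // The argument JSON arrives in fragments; concatenate them in order.
        toolCalls[index].ToolArguments += call.Function?.Arguments ?? string.Empty;
    }
}
```

The final message assembly would then need to emit one tool-call message per accumulated entry rather than only `toolCalls[0]`.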
Describe the bug
When streaming output is enabled, a tool call returned by the LLM arrives with empty arguments.
Steps to reproduce the bug
Expected behavior
Screenshots
No response
NuGet package version
No response
Additional context
No response