Change OpenAI default mode to streaming #7

Open
HavenDV opened this issue Jan 25, 2024 · 0 comments
HavenDV commented Jan 25, 2024

Anti — 01/22/2024 1:05 PM
@HavenDV Is it possible to add a TokenGenerated event to the OpenAI provider? Maybe we should make those events part of the common interface?
HavenDV — 01/23/2024 2:48 AM
It's possible, but it will only work in streaming mode - https://platform.openai.com/docs/api-reference/chat/create#chat-create-stream
And I'm not sure if we should use this as the default.
Anti — 01/23/2024 11:45 AM
Streaming mode is not slower than the regular one, and it looks better when you see the output of the LLM right away instead of waiting 10 seconds for the response.
If you want, you can actually check whether there are any subscribers to the event and pick the mode based on that.
Oh, and also a PromptSent event. That also helps quite a lot with debugging.
HavenDV — 01/23/2024 10:50 PM
Yes, I think you are right, and we should make this the default behavior.
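
Below is a minimal C# sketch of the idea discussed above: expose TokenGenerated and PromptSent events on the model and enable streaming based on whether anyone has subscribed to TokenGenerated. All names here (OpenAiChatModel, GenerateAsync, the fake token source) are illustrative assumptions, not the library's actual API.

```csharp
using System;
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;

public class OpenAiChatModel
{
    // Raised for every token when the request runs in streaming mode.
    public event EventHandler<string>? TokenGenerated;

    // Raised with the full prompt just before the request is sent; useful for debugging.
    public event EventHandler<string>? PromptSent;

    public async Task<string> GenerateAsync(string prompt)
    {
        PromptSent?.Invoke(this, prompt);

        // Proposed behavior: stream whenever someone listens for tokens
        // (or simply always, if streaming becomes the default).
        var useStreaming = TokenGenerated != null;

        var builder = new StringBuilder();
        await foreach (var token in FakeCompletionAsync(prompt, useStreaming))
        {
            TokenGenerated?.Invoke(this, token);
            builder.Append(token);
        }
        return builder.ToString();
    }

    // Stand-in for the real chat-completion call: yields tokens one by one
    // in streaming mode and the whole answer at once otherwise.
    private static async IAsyncEnumerable<string> FakeCompletionAsync(string prompt, bool stream)
    {
        var answer = $"Echo: {prompt}";
        if (!stream)
        {
            yield return answer;
            yield break;
        }
        foreach (var token in answer.Split(' '))
        {
            await Task.Delay(50); // simulate network latency between chunks
            yield return token + " ";
        }
    }
}
```

Hypothetical usage, showing that subscribing to TokenGenerated is what turns on streaming under this sketch:

```csharp
var model = new OpenAiChatModel();
model.PromptSent += (_, p) => Console.WriteLine($"[prompt] {p}");
model.TokenGenerated += (_, t) => Console.Write(t); // subscribing enables streaming
var reply = await model.GenerateAsync("Hello, world");
```

If streaming instead becomes the unconditional default, the subscriber check can be dropped and the events simply fire when present, with no behavioral difference for callers who ignore them.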

HavenDV added the enhancement (New feature or request) label on Jan 25, 2024
HavenDV self-assigned this on Jan 25, 2024
HavenDV transferred this issue from tryAGI/LangChain on Aug 7, 2024