
Streaming text responses #27

Open
sixlive opened this issue Oct 16, 2024 · 4 comments
Assignees
Labels
enhancement New feature or request planned Item planned on the roadmap

Comments

@sixlive
Contributor

sixlive commented Oct 16, 2024

Description

Enabling streaming of generated text, which will be particularly useful for real-time applications and improved user experiences.

@sixlive sixlive added the enhancement New feature or request label Oct 16, 2024
@sixlive sixlive self-assigned this Oct 16, 2024
@sixlive sixlive moved this to Todo in Prism Development Oct 16, 2024
@sixlive sixlive added the planned Item planned on the roadmap label Oct 16, 2024
@petervandijck

I'd like to second this: it's a must-have before I can even start using the package 👍

@sixlive sixlive changed the title Streaming Text Response Streaming text responses Oct 25, 2024
@tognee

tognee commented Nov 18, 2024

Hi, is there a timeframe for this feature?
I could help develop it if there's a clear definition of how it should be implemented.

The first thing to discuss is whether streaming should be an extension of the Prism::text static function or a new Prism::streamingText function.

After that, the implementation seems simple: a generate function calls the API endpoint with the streaming parameter enabled and returns a generator. The generator collects the data from the response, parses the SSE events coming from the provider, maps each chunk to a new ChunkResponse object, and yields the objects one at a time.

Should this ChunkResponse follow the OpenAI chunk type, or should it be something custom?
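For discussion's sake, the generator flow described above could be sketched roughly like this (in Python for brevity, since Prism itself is PHP and no interface has been agreed on yet; the ChunkResponse fields here simply mirror an OpenAI-style streaming chunk and are assumptions, not a proposed final API):

```python
import json
from dataclasses import dataclass
from typing import Iterable, Iterator, Optional


@dataclass
class ChunkResponse:
    """Hypothetical chunk object; fields mirror the OpenAI streaming chunk shape."""
    text: str
    finish_reason: Optional[str] = None


def stream_chunks(sse_lines: Iterable[str]) -> Iterator[ChunkResponse]:
    """Parse provider SSE lines and yield one ChunkResponse per event."""
    for line in sse_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines, comments, event-name fields
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # OpenAI-style end-of-stream sentinel
            return
        event = json.loads(payload)
        choice = event["choices"][0]
        yield ChunkResponse(
            text=choice.get("delta", {}).get("content") or "",
            finish_reason=choice.get("finish_reason"),
        )


# Example: consuming a (mocked) OpenAI-style SSE stream chunk by chunk.
lines = [
    'data: {"choices":[{"delta":{"content":"Hel"},"finish_reason":null}]}',
    'data: {"choices":[{"delta":{"content":"lo"},"finish_reason":null}]}',
    'data: {"choices":[{"delta":{},"finish_reason":"stop"}]}',
    'data: [DONE]',
]
for chunk in stream_chunks(lines):
    print(chunk.text, end="")
```

In PHP the same shape falls out naturally from a function that `yield`s objects while reading the HTTP response body, which is presumably why a generator was suggested.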

If we can agree on a specification together, I could start working on this feature.

@tomtev

tomtev commented Nov 23, 2024

Really need this also :)

@petervandijck

petervandijck commented Nov 23, 2024 via email
