Support async iteration #38
Conversation
Just a drive-by comment: this is cool - I would love to use this in a Starlette app. I will build some prototypes off this branch to get a feel for it, but I think asyncio support would be very useful for me.
@raisjn cool, thanks for dropping the comment!
I'm still playing with htpy (cool!) and async iteration. The code looks fine to me, but I will report back after more days (weeks?) of testing and trying to build the functionality below.

Comments on async delivery: from what I can tell, it resolves the work in sequential order. What I am aiming for is pipelined rendering (also known as partial pre-rendering in Next.js). I believe pipelined delivery can be built on top of (or alongside) htpy's async rendering by using some JS + an async queue. It would look something like this:
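A minimal sketch of what that could look like, assuming Starlette's `StreamingResponse`, an `asyncio.Queue`, and a small client-side `swap()` helper; the slot names, delays, and the swap script are illustrative and not part of htpy or this PR:

```python
import asyncio

from htpy import div
from starlette.applications import Starlette
from starlette.responses import StreamingResponse
from starlette.routing import Route

# Tiny client-side helper that moves a streamed <template> into its placeholder slot.
SWAP_JS = """<script>
function swap(slotId, tplId) {
  const tpl = document.getElementById(tplId);
  document.getElementById(slotId).replaceWith(tpl.content);
}
</script>"""


async def slow_fragment(name: str, delay: float) -> str:
    # Stand-in for a slow data fetch followed by a (sync) htpy render.
    await asyncio.sleep(delay)
    return str(div[f"content for {name}"])


async def pipelined(request):
    async def stream():
        # 1. Flush the page shell immediately, with placeholder slots.
        yield SWAP_JS
        yield str(div("#slot-a")["loading a..."])
        yield str(div("#slot-b")["loading b..."])

        # 2. Render fragments concurrently; each pushes its HTML onto a queue when done.
        queue: asyncio.Queue = asyncio.Queue()

        async def worker(slot: str, delay: float) -> None:
            await queue.put((slot, await slow_fragment(slot, delay)))

        tasks = [
            asyncio.create_task(worker("slot-a", 0.5)),
            asyncio.create_task(worker("slot-b", 0.1)),
        ]

        # 3. Stream each fragment in completion order (not document order),
        #    wrapped in a <template> plus a swap() call that fills its slot.
        for _ in tasks:
            slot, html = await queue.get()
            yield f'<template id="tpl-{slot}">{html}</template>'
            yield f'<script>swap("{slot}", "tpl-{slot}")</script>'

    return StreamingResponse(stream(), media_type="text/html")


app = Starlette(routes=[Route("/", pipelined)])
```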
Interesting! I think just sending CSS/JS before the full page is rendered gives anyone a good boost for free. Streaming multiple parts of the page and then recombining them with JavaScript sounds interesting! It feels like there should be a small lib for that. It feels simple in theory! 😆
I agree - this is a great boost! I'm not 100% sure about the API for an explicit flush, but the exact API doesn't matter as long as it is explicit; it could also be a special tag.
Why would an explicit flush function/tag be useful? Currently, with this PR, everything will be "flushed" as soon as it is ready anyway.
Coming back to this: it turns out I don't need async iterators in htpy. I have architected an app with async delivery that allows parallel pipelining instead of sequential delivery, and it uses htpy without async. In my first reading of this PR, I had assumed async iteration ran in parallel (collect all async work in one pass, then await it in parallel, then do the next set, etc.).

Re: why use an explicit flush: I think it would help people who aren't aware of the mechanics of rendering and of using an async generator vs. a sync generator.
This PR adds the possibility to use async awaitables/iterators/generators to generate a response. A Starlette example is added too. Thoughts? Ideas?
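A rough usage sketch of what the feature could look like; the idea that awaitables and async generators can appear as element children and that the element tree can be iterated with `async for` is inferred from the PR description, not verified against the branch, and the names below are illustrative:

```python
import asyncio

from htpy import div, h1, li, ul
from starlette.applications import Starlette
from starlette.responses import StreamingResponse
from starlette.routing import Route


async def fetch_title() -> str:
    # Stand-in for an awaitable data source (database call, HTTP request, ...).
    await asyncio.sleep(0.1)
    return "hello from an awaitable"


async def rows():
    # Stand-in for an async generator child: items are produced as they become available.
    for i in range(3):
        await asyncio.sleep(0.1)
        yield li[f"row {i}"]


async def index(request):
    page = div[
        h1[fetch_title()],  # awaitable child (assumed to be awaited during rendering)
        ul[rows()],         # async generator child (assumed to be consumed lazily)
    ]

    async def stream():
        # Assumed: the element tree can be iterated asynchronously, yielding HTML chunks.
        async for chunk in page:
            yield chunk

    return StreamingResponse(stream(), media_type="text/html")


app = Starlette(routes=[Route("/", index)])
```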