Support async iteration #38

Draft: pelme wants to merge 2 commits into main from aiter-node

Conversation

@pelme (Owner) commented Jul 31, 2024

This PR adds the possibility to use async awaitables/iterators/generators to generate a response. A Starlette example is included as well. Thoughts? Ideas?
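
A minimal sketch of what this could look like in a Starlette app. Note that this is illustrative only: the streaming helper name (aiter_node) and the data-loading generator are assumptions, not necessarily this draft's final API.

import asyncio
from collections.abc import AsyncIterator

from htpy import div, h1, li, ul
from starlette.applications import Starlette
from starlette.responses import StreamingResponse
from starlette.routing import Route

# Assumed helper name: something that async-iterates an element and yields
# HTML chunks as soon as they are ready. The draft may expose this differently.
from htpy import aiter_node


async def load_rows() -> AsyncIterator:
    # Stand-in for rows arriving from a database or an external API.
    for name in ["first", "second", "third"]:
        await asyncio.sleep(0.1)
        yield li[name]


async def index(request) -> StreamingResponse:
    page = div[h1["Async streaming"], ul[load_rows()]]
    return StreamingResponse(aiter_node(page), media_type="text/html")


app = Starlette(routes=[Route("/", index)])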

@pelme force-pushed the aiter-node branch 2 times, most recently from b64a9c4 to 030451c on August 10, 2024 19:38
Review threads on tests/test_async.py and htpy/__init__.py were resolved (outdated).
@raisjn commented Nov 9, 2024

Just a drive-by comment: this is cool - I would love to use this in a Starlette app. I will build some prototypes off this branch to get a feel for it, but I think asyncio support would be very useful for me.

@pelme (Owner, Author) commented Nov 9, 2024

@raisjn cool, thanks for dropping the comment!

  • I feel like everyone should use streaming (sync or async) for improved user experience and loading times with minimal effort. It is truly an underused technique in current web development that could give a nice boost in web performance. I am also thinking of adding Starlette/Django/Flask HtpyResponse classes based on their respective StreamingResponse classes, just to lower the friction and make it as straightforward as possible to use streaming (see the sketch after this list). Right now streaming is kind of "opt-in" if you use the StreamingResponse classes etc.; I would like to make it easier and make streaming the default path when using htpy.

  • I think this PR is pretty much in good shape to be merged. I have rewritten the implementation/tests multiple times and think it is pretty solid/well tested at this point. Any review/feedback on the implementation would be very welcome though! I have not merged it yet because a) I am not using async myself in my day-to-day projects and b) there has not been any visible feedback/interest in it yet.

  • @raisjn if you build some prototypes, play with it, and get back with feedback, that would be very valuable and we could get this going! 🙂
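
To make the HtpyResponse idea in the first bullet concrete, here is a rough Starlette-flavoured sketch. The HtpyResponse class itself is hypothetical (it is not part of htpy or this PR); it only assumes htpy's sync iter_node streaming helper and Starlette's StreamingResponse.

from htpy import Element, iter_node
from starlette.responses import StreamingResponse


class HtpyResponse(StreamingResponse):
    """Hypothetical convenience response: stream an htpy element as text/html."""

    def __init__(self, element: Element, **kwargs) -> None:
        kwargs.setdefault("media_type", "text/html")
        # iter_node renders the element chunk by chunk (sync streaming).
        super().__init__(iter_node(element), **kwargs)


# Usage in an endpoint:
#   return HtpyResponse(html[body[h1["Streamed!"]]])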

@raisjn commented Nov 10, 2024

I'm still playing with htpy (cool!) and async iteration. The code looks fine to me, but I will report back after more days (weeks?) of testing and trying to build the functionality described below:

comments on async delivery:

From what I can tell, it resolves the work in sequential order. What I am aiming for is pipelined rendering (also known as partial pre-rendering in Next.js). I believe pipelined delivery can be built on top of (or alongside) htpy's async rendering by using some JS + an async queue.

It would look something like this:

div[
  Placeholder(do_some_work()),
  Placeholder(do_some_work())
]

Placeholder would emit a placeholder div that is later filled in with the result of the async work when it finishes. This seems straightforward for me to implement with the streaming functionality you've built (so thank you!)
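
Roughly, a sketch of how this could work: the placeholder renders an empty div immediately, the finished content is pushed onto an asyncio.Queue, and a tail generator streams it out in completion order together with a tiny script that swaps it into place. Everything here (Pipeline, placeholder, drain, the slot-id scheme) is hypothetical glue code, not htpy or this PR.

import asyncio
import itertools
from collections.abc import Awaitable

from htpy import Element, body, div, h1, template
from starlette.responses import StreamingResponse

_ids = itertools.count()


class Pipeline:
    def __init__(self) -> None:
        self.queue: asyncio.Queue = asyncio.Queue()
        self.pending = 0

    def placeholder(self, work: Awaitable[Element]) -> Element:
        # Render an empty slot right away; finish the real work in the background.
        slot_id = f"ph-{next(_ids)}"
        self.pending += 1
        asyncio.ensure_future(self._finish(slot_id, work))
        return div(id=slot_id)["Loading…"]

    async def _finish(self, slot_id: str, work: Awaitable[Element]) -> None:
        await self.queue.put((slot_id, await work))

    async def drain(self):
        # Emit finished chunks in completion order: inert <template> content
        # plus a small script that moves it into the matching placeholder.
        while self.pending:
            slot_id, content = await self.queue.get()
            self.pending -= 1
            yield str(template(data_slot=slot_id)[content])
            yield (
                f"<script>document.getElementById('{slot_id}').replaceChildren("
                f"document.querySelector('template[data-slot=\"{slot_id}\"]').content)"
                f"</script>"
            )


async def load_chart() -> Element:
    await asyncio.sleep(0.2)  # stand-in for slow async work
    return div["chart goes here"]


async def dashboard(request) -> StreamingResponse:
    pipe = Pipeline()
    page = body[
        h1["Dashboard"],
        pipe.placeholder(load_chart()),
        pipe.placeholder(load_chart()),
    ]

    async def stream():
        yield str(page)  # the shell with empty placeholders goes out first
        async for chunk in pipe.drain():
            yield chunk

    return StreamingResponse(stream(), media_type="text/html")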

@pelme (Owner, Author) commented Nov 10, 2024

Interesting! I think just sending CSS/JS before the full page is rendered gives a good boost for free for anyone. Streaming multiple parts of the page and then recombining them with JavaScript sounds interesting! It feels like there should be a small lib for that. It feels simple in theory! 😆

@raisjn commented Nov 10, 2024

I think just sending CSS/JS before the full page is rendered gives a good boost for free for anyone

I agree - this is a great boost! I'm not 100% sure using async iteration is a good answer for this functionality, since it would end up being a hidden implementation detail instead of an explicit consideration by devs. Consider adding a special function like flush():

html[
  head[
    link[...],
    script[...]
  ],
  flush(),
  body[
    div[foobar]
  ],
  flush(),
  some_extra_work()
]

The exact API doesn't matter as long as it is explicit; it could also be a special tag (early_flush[some_html_code]).
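
For example, a hypothetical sketch of an explicit flush marker. Nothing here is htpy API: Flush and stream_with_flushes are made up, and each fragment is simply rendered with str() and yielded at the flush points so a streaming response sends it out immediately.

from starlette.responses import StreamingResponse

from htpy import body, div, head, link, script


class Flush:
    """Marker: everything rendered so far should be sent to the client now."""


def stream_with_flushes(*fragments):
    buffer = []
    for fragment in fragments:
        if isinstance(fragment, Flush):
            if buffer:
                yield "".join(buffer)  # explicit flush point
                buffer.clear()
        else:
            buffer.append(str(fragment))  # htpy elements render via str()
    if buffer:
        yield "".join(buffer)


# Usage, mirroring the example above:
#   StreamingResponse(
#       stream_with_flushes(
#           head[link(rel="stylesheet", href="/app.css"), script(src="/app.js")],
#           Flush(),
#           body[div["foobar"]],
#       ),
#       media_type="text/html",
#   )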

@pelme (Owner, Author) commented Nov 12, 2024

Why would an explicit flush function/tag be useful? Currently with this PR, everything is "flushed" as soon as it is ready anyway.

@raisjn commented Nov 21, 2024

Coming back to this: it turns out I don't need async iterators in htpy. I have architected an app with async delivery that allows parallel pipelining instead of sequential resolution, and it uses htpy without async. In my first reading of this PR, I had assumed async iteration ran in parallel (collect all async work in one pass, await it in parallel, then do the next set, etc.).
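
For reference, a sketch of that parallel-then-render approach, using only asyncio plus sync htpy. fetch_user and fetch_stats are hypothetical stand-ins for the real async work.

import asyncio

from htpy import div, h1, p


async def fetch_user() -> str:
    await asyncio.sleep(0.1)  # stand-in for a database call
    return "Alice"


async def fetch_stats() -> str:
    await asyncio.sleep(0.1)  # stand-in for an API call
    return "42 visits"


async def render_page() -> str:
    # Kick off all async work at once, await it in parallel, then render
    # synchronously with htpy once everything is in hand.
    user, stats = await asyncio.gather(fetch_user(), fetch_stats())
    return str(div[h1[user], p[stats]])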

Re: why use an explicit flush: I think it would help people who aren't aware of the mechanics of rendering and of the difference between using an async generator vs a sync generator.
