Use non-blocking JSON parser #132
I'm not against it at all :) Typically everyone uses the default limit value in the parsers here, and JSON.parse for a body at that limit takes well under 1ms, so it's typically not an issue. If you want to put together a PR to implement this (and perhaps make the urlencoded parser streaming too in that PR), that would be awesome, and the best way to move this forward :)
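For reference, the default limit mentioned here is body-parser's `limit` option on `json()`, which defaults to `'100kb'`; bodies past the limit are rejected before `JSON.parse` ever runs:

```js
const express = require('express');
const bodyParser = require('body-parser');

const app = express();

// The default limit is '100kb'; larger bodies are rejected with a 413
// before any parsing happens, which keeps JSON.parse times small.
app.use(bodyParser.json({ limit: '100kb' }));
```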
I've kind of ditched the idea of parsing big JSON from an API and went with a different approach. In the meantime, maybe we can discuss it a little here for the sake of having some reference.
I've never used any, so I have no preference. If you can find one that is well-supported, that would be best :)
Yeah, it would probably be slower; what we need to do would need to be determined by benchmarks. Also, be careful, as just keying a switch off …
I have done some research on this topic, and it seems it is already discussed on Node: nodejs/node#2031 (comment). To my understanding, what we are suggesting here is not feasible. Did I miss something?
Good find @ghassanmas, that comment gets to the point well, and IMO it's a good indication that this module should not take an opinion on this directly. Offering an interface to override the JSON parse implementation (and then supporting async versions of the parse) is a good idea, but directly implementing a specific one (even if it is the one recommended in that comment) is probably not a good long-term solution.
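To make the idea concrete, here is a minimal sketch of what a pluggable-parse middleware could look like. This is not body-parser's API (it only exposes a `reviver` today); `jsonBody` and its `parse` option are hypothetical, and `raw-body` is the buffering module body-parser itself uses:

```js
const getRawBody = require('raw-body');  // buffers the request body
const yj = require('yieldable-json');    // example async parser

// Hypothetical middleware: the caller supplies the (possibly async)
// parse function instead of this module hard-coding JSON.parse.
function jsonBody({ parse, limit = '100kb' } = {}) {
  return async (req, res, next) => {
    try {
      const text = await getRawBody(req, { limit, encoding: 'utf-8' });
      req.body = await parse(text);
      next();
    } catch (err) {
      next(err);
    }
  };
}

// Usage: plug in any implementation that returns a promise.
const middleware = jsonBody({
  parse: (text) => new Promise((resolve, reject) =>
    yj.parseAsync(text, (err, obj) => (err ? reject(err) : resolve(obj)))),
});
```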
I would like to see what is the recommended direction here:
so the second point necessitates the first, and the first is already discarded as an option. So, should we conclude and close this as …
Just a bit of a left-field solution: I've stumbled upon https://github.com/bjouhier/i-json, which can handle JSON chunks directly from a stream, instead of concatenating the text and then deciding what to do with it. It's not an easy lift, but it could be pretty interesting. I think I'll do a small POC to check feasibility and performance overhead.
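If I'm reading i-json's README correctly, its incremental parser is fed raw chunks and queried for the result at the end; a rough sketch of what that could look like against a request stream (the wiring here is illustrative, not from the thread):

```js
const ijson = require('i-json');

// Feed each chunk to the incremental parser as it arrives,
// instead of buffering the whole body and calling JSON.parse once.
function parseJsonStream(req, callback) {
  const parser = ijson.createParser();
  req.on('data', (chunk) => {
    try {
      parser.update(chunk);
    } catch (err) {
      req.destroy();
      callback(err);
    }
  });
  req.on('end', () => callback(null, parser.result()));
  req.on('error', callback);
}
```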
Just while we're here: if you are going to do a POC, can I request that you test with yieldable-json as well?
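For anyone unfamiliar, yieldable-json exposes a callback-based `parseAsync` that periodically yields back to the event loop during parsing; basic usage looks roughly like this:

```js
const yj = require('yieldable-json');

const body = '{"name":"joe","items":[1,2,3]}';

// parseAsync yields to the event loop between chunks of parse work,
// so a large body does not block other requests for its full duration.
yj.parseAsync(body, (err, obj) => {
  if (err) throw err;
  console.log(obj.name); // "joe"
});
```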
Update on my POC:
Tests were performed using ApacheBench against a basic hello-world API on Node 8 (due to the i-json package), creating 100k requests with JSON bodies of 10kb and 100kb at a concurrency of 100. I'm not sure if that's an indication that problems only happen on extremely large JSON bodies, which to me is a rare edge case. I did notice a possible area for improvement, though: body parsing as middleware typically happens before routing, which means that POST requests which would result in a 404 still go through body parsing (see the sketch below). @wesleytodd I'm not sure if that's significant.
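A minimal illustration of that observation: mounting the parser per route rather than app-wide means a POST to an unknown path 404s without its body ever being parsed.

```js
const express = require('express');
const bodyParser = require('body-parser');

const app = express();

// Attach the JSON parser only to routes that expect a body,
// instead of app.use(bodyParser.json()) before all routing.
app.post('/items', bodyParser.json(), (req, res) => {
  res.status(201).json(req.body);
});

// POST /nonexistent now falls through to Express's 404 handling
// without the request body being read or parsed.
app.listen(3000);
```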
I was reading the Express docs and found this. I think this is at least a great discussion around an interesting topic, so let me try to add my small contribution. First of all, we need to decide which approach or strategy to use to implement a non-blocking parser. My ideas are:

**Limit the amount of time the parser can use.** Start the parser, stop it after a short period of time, and keep the state for the next cycle of parsing. Repeat this until the stream is completely parsed.

**Parse it in a different thread or process.** Start a different thread or process, parse the stream there, and then resolve the value back to the original caller. A single process with its own event loop can be used in this case, so no one needs to deal with several threads running at the same time (a sketch of this strategy follows below).

**The actual parser.** The parser could be built by extending the actual Node.js JSON parser; if that is possible, no one needs to build a parser from scratch.

I personally think the incremental option is more viable, since no one needs to deal with a second running thread or process and IPC. The strategy now should be to build PoCs of the different approaches and compare them against each other, both in performance and maintainability.
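For concreteness, a minimal sketch of the second strategy using Node's built-in `worker_threads` module. This is my own illustration, not code from the thread; a real implementation would want a worker pool rather than spawning a worker per request, and the structured clone of the parsed result back to the main thread has its own cost.

```js
const { Worker } = require('worker_threads');

// Offload JSON.parse to a worker so the main event loop stays free.
function parseInWorker(text) {
  return new Promise((resolve, reject) => {
    const worker = new Worker(
      `const { parentPort, workerData } = require('worker_threads');
       try { parentPort.postMessage({ value: JSON.parse(workerData) }); }
       catch (err) { parentPort.postMessage({ error: err.message }); }`,
      { eval: true, workerData: text }
    );
    worker.once('message', (msg) =>
      'error' in msg ? reject(new SyntaxError(msg.error)) : resolve(msg.value));
    worker.once('error', reject);
  });
}

// Usage:
parseInWorker('{"big": "payload"}').then(console.log).catch(console.error);
```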
I've published https://github.com/rawpixel-vincent/yieldable-json-body-parser, which can be used as a drop-in replacement for this module's JSON parser. It's fresh, so there are possibly issues that haven't been found yet; use with caution. Suggestions welcome.
Is there an option to use a non-blocking (streaming?) JSON parser?
The problem with JSON.parse is that it blocks the event loop, and for parsing big JSON payloads this can be a problem.
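A quick, self-contained way to see the issue (my own illustration, not from the thread): a timer due in 10ms cannot fire until a large synchronous `JSON.parse` finishes.

```js
// Build a large JSON string to parse synchronously.
const big = JSON.stringify(Array.from({ length: 1_000_000 }, (_, i) => ({ i, s: 'x' })));

const start = Date.now();

// Due in 10ms, but the event loop is blocked until the parse completes.
setTimeout(() => {
  console.log(`timer fired after ${Date.now() - start} ms`);
}, 10);

JSON.parse(big);
console.log(`JSON.parse returned after ${Date.now() - start} ms`);
```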