Reading in a large detector from json is slow #592
Comments
Something I do not entirely get, though, is that on lines 95 and 96 of
If I am not mistaken, @cg-laser was the one to introduce TinyDB here. Maybe it really is too tiny for what we are doing here?
@anelles, I do not think so. The RNO-G detector class uses a custom buffering "strategy"; I do not think that is transferable to this detector.
TinyDB is still used when giving a json object to the detector class. I second your suggestion, Sjoerd. It seems to solve the problem efficiently.
As an update on this issue: when loading a complete detector description, the initialisation is much faster. So I think this is specifically an issue when trying to load a GenericDetector with lots of references, which is where I would guess it spends most of its time.
Reading in a large detector description from a .json file is slower than expected: for the LOFAR detector description (O(1000) channels), reading in the json description directly takes O(1 minute). If we instead read in the json first, and then load the detector from the resulting dictionary, this only takes O(1 s):
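
A minimal sketch of the comparison, assuming the detector class accepts both a json filename and an already-parsed dictionary (the module path and parameter names below are illustrative and may not match the actual NuRadioReco signature):

```python
import json
import time

# Module path and keyword names are assumptions for illustration only;
# they may differ from the actual NuRadioReco API.
from NuRadioReco.detector import generic_detector

json_path = "lofar_detector.json"  # placeholder for the LOFAR detector description

# Case 1: hand the json file to the detector class and let it parse the file
# itself (the slow path that goes through the TinyDB deserialisation).
t0 = time.time()
det_from_file = generic_detector.GenericDetector(json_filename=json_path)
print(f"load from json file: {time.time() - t0:.1f} s")  # O(1 minute) for O(1000) channels

# Case 2: parse the json into a plain dict first and pass that to the detector.
t0 = time.time()
with open(json_path) as f:
    det_dict = json.load(f)
det_from_dict = generic_detector.GenericDetector(dictionary=det_dict)
print(f"load from dict: {time.time() - t0:.1f} s")  # O(1 s)
```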
I'm not sure how the first case is currently implemented - most of the time is spent in tinydb_serialization._decode_deep and json.decoder.raw_decode. However, seeing as the second case is (?) equivalent to the first, we should probably change the implementation to just internally read in the json as a dictionary first, and load the detector from that.
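
If the two cases really are equivalent, the change could look roughly like the sketch below; the function and parameter names are hypothetical and not taken from the actual NuRadioReco code:

```python
import json


def _init_detector(json_filename=None, dictionary=None):
    """Hypothetical helper: always convert a json file into a plain dict up
    front, then run the (fast) dictionary-based initialisation, bypassing the
    slow TinyDB json parsing."""
    if dictionary is None:
        if json_filename is None:
            raise ValueError("either json_filename or dictionary must be given")
        with open(json_filename) as f:
            dictionary = json.load(f)
    # ... continue with the existing dictionary-based initialisation ...
    return dictionary
```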