So, I know that having 10k objects in a directory isn't a good idea! :-) But I've got to deal with this server/application as it is for the time being.
The Cloud Files limitation is 10k per result set. My branch changes list_directory() to request the next set of objects if it gets exactly 10k results.
It does this by moving most of the functionality of list_directory() into list_directory_internal(), with a slightly changed interface: the return value is the number of objects in the result set, and dir_list isn't cleared, so its last entry is the last item returned so far. list_directory() then calls list_directory_internal(), looping if necessary.
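Roughly, the loop looks like this (a simplified sketch, not the actual code in the branch; the dir_entry type, the exact signatures, and the RESULT_SET_LIMIT constant name are illustrative):

```c
/* Cloud Files returns at most 10,000 objects per request (illustrative constant) */
#define RESULT_SET_LIMIT 10000

int list_directory(const char *path, dir_entry **dir_list)
{
    int count;

    do
    {
        /* Appends the next result set to *dir_list (without clearing it) and
           returns the number of objects received; the last entry already in
           *dir_list tells it where the previous set ended. */
        count = list_directory_internal(path, dir_list);
        if (count < 0)
            return 0; /* request failed */
    }
    while (count == RESULT_SET_LIMIT); /* a full set may mean more objects remain */

    return 1;
}
```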
It's been a long time since I've written any C code, so I hope I didn't do anything really stupid in there! In any case, go easy on me. ;-)
Best regards,
David.