Stokes convergence in 3D needs lots of oversampling, is generally wonky #45
Comments
I think this is probably what I was seeing in #11 too. I haven't seen any effect of the target-specific machinery since, so it seems it was just an instance of playing around with too many parameters at once. I would recommend closing that in favor of this as a tracking issue.

Thanks for keeping track of it. I've closed that for now.

Looking at the density picture (x4 refinement) from #32 (comment): I suspect the geometry derivatives (normal etc.) are insufficiently accurate.
I looked at the normal errors on the same mesh and it seems like they sort of match the density errors.
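As a point of reference, that check can be done against the exact sphere normal directly. A minimal sketch in plain numpy (not pytential API), assuming the test geometry is a sphere centered at the origin and that nodes and normals are available as `(3, nnodes)` arrays:

```python
import numpy as np

def normal_error(nodes, computed_normals):
    """Max pointwise error of the computed unit normals against the exact sphere normal.

    nodes:            (3, nnodes) node coordinates, assumed to lie on a sphere
                      centered at the origin
    computed_normals: (3, nnodes) unit normals produced by the discretization
    """
    # the exact outward unit normal on a sphere centered at the origin is x/|x|
    exact = nodes / np.linalg.norm(nodes, axis=0)
    return np.max(np.linalg.norm(computed_normals - exact, axis=0))
```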
When computing the oversampled grid, we could (at least for this test) provide it with better geometry information.
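For this particular test, "better geometry information" could be as blunt as projecting the oversampled nodes back onto the exact surface. A hedged sketch, again in plain numpy, with the radius and array layout being assumptions:

```python
import numpy as np

def snap_to_sphere(nodes, radius=1.0):
    """Radially project (3, nnodes) node coordinates onto the sphere of the given
    radius centered at the origin; the exact normals are then just nodes/radius."""
    return radius * nodes / np.linalg.norm(nodes, axis=0)
```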
I monkeypatched the normal to just return ….

It might actually be something silly: all those tests run with …. Bumping them to … gives:
Still not quite sure where the convergence order is coming from, but this looks more reasonable. Definitely looks like a geometry issue though.
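For completeness, the monkeypatch mentioned above could look roughly like the sketch below; `geometry.compute_normals` is a hypothetical stand-in for whatever routine produces the normals, not an actual pytential entry point:

```python
import numpy as np
from types import SimpleNamespace
from unittest import mock

# hypothetical stand-in for the code path that computes geometry quantities
geometry = SimpleNamespace(compute_normals=lambda nodes: None)

def exact_sphere_normal(nodes):
    # exact outward unit normal for nodes on a sphere centered at the origin
    return nodes / np.linalg.norm(nodes, axis=0)

with mock.patch.object(geometry, "compute_normals", exact_sphere_normal):
    # within this block, anything calling geometry.compute_normals gets the
    # exact normal instead of the discretized one
    nodes = np.random.randn(3, 8)
    print(geometry.compute_normals(nodes))
```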
Possibly related: pytential/test/test_layer_pot_identity.py, lines 245 to 247 in a09934a.

The test in main uses …. Second order is the expected order here for ….

EDIT: also tried 8x oversampling with the same parameters and got ….

So yeah, this also seems to be in the same boat of needing huge amounts of oversampling to get anywhere.
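As a sanity check on the observed order, the estimated order of convergence between two runs (e.g. the 4x and 8x oversampling runs above) can be computed directly; the numbers below are placeholders, not the actual test output:

```python
import numpy as np

def estimated_order(h_coarse, err_coarse, h_fine, err_fine):
    """Estimated order of convergence from two (mesh size, error) pairs."""
    return np.log(err_coarse / err_fine) / np.log(h_coarse / h_fine)

# placeholder numbers: halving h while the error drops by 4x indicates second order
print(estimated_order(h_coarse=0.2, err_coarse=1.0e-2, h_fine=0.1, err_fine=2.5e-3))
```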
Cf. some initial discussion in #32.
cc @alexfikl