
Dolfinx 0.9.0 #64 (Draft)
wants to merge 9 commits into base: main

Conversation

@srosenbu (Member) commented Nov 7, 2024

This PR will add support for dolfinx 0.9.0.

Resolves #50, #47

@srosenbu (Member, Author) commented Nov 8, 2024

@pdiercks, do you have any good ideas on how we can test the MPI functionality?
Ideally we should test whether the solution fields and quadrature fields are identical for a Problem that has been solved with different numbers of MPI processes.

Could we write a pytest case where we solve a distributed Problem with MPI.COMM_WORLD and, on one of the ranks, the same problem with MPI.COMM_SELF? That way we could compare the outputs within a single test case without writing any results to disk. But I don't know whether we can mix the communicators. For example, would the following work?

from mpi4py import MPI
import dolfinx as df

# distributed mesh/space across all ranks, plus a serial copy on each rank
mesh_distr = df.mesh.create_unit_cube(MPI.COMM_WORLD, 10, 10, 10)
mesh_local = df.mesh.create_unit_cube(MPI.COMM_SELF, 10, 10, 10)

space_distr = df.fem.functionspace(mesh_distr, ("Lagrange", 1))
space_local = df.fem.functionspace(mesh_local, ("Lagrange", 1))

u_distr = df.fem.Function(space_distr)
u_local = df.fem.Function(space_local)

# can a UFL expression mix functions living on different communicators?
diff = u_distr - u_local
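
If mixing communicators in one expression is not possible, an alternative sketch (not part of this PR; the helper name gather_owned_dofs and its details are illustrative) would be to gather the owned dof coordinates and values of the distributed solution on rank 0 and compare them against the serial solution after sorting both by coordinate:

from mpi4py import MPI
import numpy as np
import dolfinx as df

def gather_owned_dofs(u):
    """Gather owned dof coordinates and values on rank 0, sorted by coordinate."""
    space = u.function_space
    comm = space.mesh.comm
    n_owned = space.dofmap.index_map.size_local  # scalar Lagrange space: block size 1
    coords = space.tabulate_dof_coordinates()[:n_owned]
    values = u.x.array[:n_owned]
    coords = comm.gather(coords, root=0)
    values = comm.gather(values, root=0)
    if comm.rank != 0:
        return None, None
    coords = np.vstack(coords)
    values = np.concatenate(values)
    order = np.lexsort(coords.T)  # sort dofs by (x, y, z) so runs are comparable
    return coords[order], values[order]

On rank 0 the same problem could then be solved with MPI.COMM_SELF and the two value arrays compared with np.allclose. This assumes the dof coordinates of the structured unit-cube mesh are reproduced exactly on every partition, so that sorting by coordinate aligns the two arrays.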

@srosenbu (Member, Author) commented Nov 8, 2024

I tried it; it does not work with different communicators. I will try https://gitlab.com/dglaeser/fieldcompare instead.
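
For reference, a rough sketch of how a fieldcompare-based regression test might look. The output file names are hypothetical (assumed to be written to disk by earlier runs with different MPI process counts), and the "fieldcompare file" command is taken from the fieldcompare documentation, so the exact invocation should be double-checked for the installed version:

import subprocess

def test_results_independent_of_process_count():
    # files assumed to be produced beforehand, e.g. by
    #   mpirun -n 1 python demo.py  ->  displacement_np1.vtu
    #   mpirun -n 4 python demo.py  ->  displacement_np4.vtu
    reference = "displacement_np1.vtu"
    candidate = "displacement_np4.vtu"
    # exit code 0 means all fields compared equal within fieldcompare's tolerances
    result = subprocess.run(["fieldcompare", "file", candidate, reference])
    assert result.returncode == 0

Comparing files written to disk avoids mixing communicators entirely, at the cost of needing the runs with different process counts to be executed before the test.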


@srosenbu linked an issue on Nov 13, 2024 that may be closed by this pull request.
Successfully merging this pull request may close these issues: "Usage of fem.Expression compared to fem.Function.interpolate", "MPI functionality".