Conflict of MPI and well model #3519

Open

liushilongpku opened this issue Jan 20, 2025 · 1 comment
Labels
type: bug (Something isn't working)
type: new (A new issue has been created and requires attention)

Comments


liushilongpku commented Jan 20, 2025

Hi everyone! When running with mpirun, wells that cycle between open and shut cannot be opened a second time. There is no problem when using "mpirun -np 1" or running geos serially. The error message is as follows:

InjectionWellControls: well is shut
ProductionWellControls: well is shut
Time: 1.08e+05 s, dt: 1800 s, Cycle: 61

Received signal 15: Terminated

 StackTrace of 20 frames 
Frame 0: /lib/x86_64-linux-gnu/libc.so.6 
Frame 1: /usr/lib/x86_64-linux-gnu/openmpi/lib/openmpi3/mca_btl_vader.so 
Frame 2: opal_progress 
Frame 3: ompi_request_default_wait 
Frame 4: ompi_coll_base_sendrecv_actual 
Frame 5: ompi_coll_base_allreduce_intra_recursivedoubling 
Frame 6: PMPI_Allreduce 
Frame 7: geos::CompositionalMultiphaseReservoirAndWells<geos::CompositionalMultiphaseBase>::assembleCouplingTerms(double, double, geos::DomainPartition const&, geos::DofManager const&, LvArray::CRSMatrixView<double, long long const, int const, LvArray::ChaiBuffer> const&, LvArray::ArrayView<double, 1, 0, int, LvArray::ChaiBuffer> const&)::{lambda(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, geos::MeshLevel const&, LvArray::ArrayView<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, 1, 0, int, LvArray::ChaiBuffer> const&)#1}::operator()(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, geos::MeshLevel const&, LvArray::ArrayView<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, 1, 0, int, LvArray::ChaiBuffer> const&) const::{lambda(int, geos::WellElementSubRegion const&)#1}::operator()(int, geos::WellElementSubRegion const) const 
Frame 8: geos::CompositionalMultiphaseReservoirAndWells<geos::CompositionalMultiphaseBase>::assembleCouplingTerms(double, double, geos::DomainPartition const&, geos::DofManager const&, LvArray::CRSMatrixView<double, long long const, int const, LvArray::ChaiBuffer> const&, LvArray::ArrayView<double, 1, 0, int, LvArray::ChaiBuffer> const&)::{lambda(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, geos::MeshLevel const&, LvArray::ArrayView<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, 1, 0, int, LvArray::ChaiBuffer> const&)#1}::operator()(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, geos::MeshLevel const&, LvArray::ArrayView<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, 1, 0, int, LvArray::ChaiBuffer> const&) const 
Frame 9: geos::CompositionalMultiphaseReservoirAndWells<geos::CompositionalMultiphaseBase>::assembleCouplingTerms(double, double, geos::DomainPartition const&, geos::DofManager const&, LvArray::CRSMatrixView<double, long long const, int const, LvArray::ChaiBuffer> const&, LvArray::ArrayView<double, 1, 0, int, LvArray::ChaiBuffer> const&) 
Frame 10: geos::PhysicsSolverBase::solveNonlinearSystem(double const&, double const&, int, geos::DomainPartition&) 
Frame 11: geos::PhysicsSolverBase::nonlinearImplicitStep(double const&, double const&, int, geos::DomainPartition&) 
Frame 12: geos::PhysicsSolverBase::solverStep(double const&, double const&, int, geos::DomainPartition&) 
Frame 13: geos::CoupledSolver<geos::CompositionalMultiphaseBase, geos::CompositionalMultiphaseWell>::solverStep(double const&, double const&, int, geos::DomainPartition&) 
Frame 14: geos::PhysicsSolverBase::execute(double, double, int, int, double, geos::DomainPartition&) 
Frame 15: geos::EventBase::execute(double, double, int, int, double, geos::DomainPartition&) 
Frame 16: geos::EventManager::run(geos::DomainPartition&) 
Frame 17: geos::GeosxState::run() 
Frame 18: main 
Frame 19: __libc_start_main 
Frame 20: _start 


Here is my case:
mpi_error_case.tar.gz

liushilongpku added the type: bug and type: new labels on Jan 20, 2025

tjb-ltk (Contributor) commented Feb 7, 2025

Thanks for including the model. Another case showed similar behavior and was fixed in the following pull request:

#3541

I tested your model with 1 and 4 cores. The 4-core run crashed, but with this PR it ran to the end, and a diff of the rate files matched.

You can pull that PR and test, or wait for it to be merged. Once you verify that it works, we can close the issue. Thanks.
