
Warning appearing: A high-performance Open MPI point-to-point messaging module ... #29

flawmop opened this issue Jun 1, 2023 · 3 comments

flawmop commented Jun 1, 2023

I'm seeing the following (seemingly inconsequential, since the simulation appears to complete) warning on stderr when running ApPredict https://github.com/Chaste/ApPredict/releases/tag/v2021.1 in a container:

```
--------------------------------------------------------------------------
[[60025,1],0]: A high-performance Open MPI point-to-point messaging module
was unable to find any relevant network interfaces:

Module: OpenFabrics (openib)
  Host: 466e13a9963f

Another transport will be used instead, although this may result in
lower performance.

NOTE: You can disable this warning by setting the MCA parameter
btl_base_warn_component_unused to 0.
--------------------------------------------------------------------------
```

The simulation was run with:

* model = shannon_wang_puglisi_weber_bers_2004_model_updated
* na: no drug effect
* cal: no drug effect
* herg IC50s = 100 uM, Hills = 1, Saturation levels = 0 %.
* iks: no drug effect
* ik1: no drug effect
* ito: no drug effect
* nal: no drug effect
* max free plasma concentration = 100 uM
* min free plasma concentration = 0 uM
* number of plasma concentrations = 13

It would be an improvement if this didn't appear in the logs. Are there any suggestions for adjusting https://github.com/CardiacModelling/appredict-docker/blob/master/appredict-no-emulators/Dockerfile#L40-L47 to set that btl_base_warn_component_unused parameter? (One possible approach is sketched below.)
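A minimal sketch of one way to do this, assuming the linked Dockerfile builds the image that runs ApPredict: Open MPI reads any MCA parameter from an environment variable named `OMPI_MCA_<param>`, so a single `ENV` line should silence the warning without touching the ApPredict invocation. The exact placement within the existing Dockerfile is an assumption.

```dockerfile
# Open MPI picks up MCA parameters from OMPI_MCA_-prefixed environment
# variables, so this is equivalent to passing
# "--mca btl_base_warn_component_unused 0" to mpirun.
ENV OMPI_MCA_btl_base_warn_component_unused=0
```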


mirams commented Jun 20, 2023

I think this is unavoidable and harmless; does that sound right to you, @jmpf?


jmpf commented Jun 20, 2023

This is always run sequentially, and never in parallel, right?


mirams commented Jun 20, 2023

Yep
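Given that ApPredict only ever runs sequentially, a stronger option (an assumption on my part, not confirmed in this thread) would be to exclude the InfiniBand transport entirely rather than just silencing the warning; Open MPI's caret syntax deselects a component:

```dockerfile
# Since the code never runs in parallel across hosts, the openib
# byte-transfer layer is never needed; "^openib" tells Open MPI to
# exclude that component, avoiding the probe that triggers the warning.
ENV OMPI_MCA_btl=^openib
```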
