
VTK 9 API changes are causing compilation errors #118

Open
ktbolt opened this issue Mar 7, 2023 · 3 comments

ktbolt commented Mar 7, 2023

There were a lot of API changes from VTK 8 to VTK 9, and they are causing compilation errors in some VTK calls:

svSolver/Code/FlowSolvers/ThreeDSolver/svPre/helpers.cxx: In function ‘int VtkUtils_GetAllPolys(vtkPolyData*, int*, vtkIdType**)’:
svSolver/Code/FlowSolvers/ThreeDSolver/svPre/helpers.cxx:1927:40: error: binding reference of type ‘const vtkIdType*&’ {aka ‘const long long int*&’} to ‘vtkIdType*’ {aka ‘long long int*’} discards qualifiers
 1927 |     while ( pdPgns->GetNextCell( npts, pts ) ) {
      |                                        ^~~

The fix is to add a const qualifier; for example, change

    vtkIdType *pts;

to

    vtkIdType const *pts;

We need to think about how we want to make these changes, maybe add a test for VTK versions.
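
For reference, here is a minimal sketch of what the corrected traversal looks like against the VTK 9 API; the function and variable names are illustrative, not the actual helpers.cxx code:

    #include <vtkCellArray.h>
    #include <vtkPolyData.h>

    // Iterate over the polygons of a vtkPolyData with the VTK 9 API.
    // GetNextCell() now takes 'vtkIdType const*&', so 'pts' must be
    // declared as a pointer to const.
    void TraversePolys(vtkPolyData* pd)
    {
      vtkCellArray* polys = pd->GetPolys();
      polys->InitTraversal();

      vtkIdType npts;
      vtkIdType const* pts;  // const is required for VTK >= 9

      while (polys->GetNextCell(npts, pts)) {
        for (vtkIdType i = 0; i < npts; i++) {
          // pts[i] is a point id; it is read-only through the const pointer.
        }
      }
    }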

ktbolt self-assigned this Mar 7, 2023

ktbolt commented Jan 23, 2024

I've added a directive that tests for VTK version >= 9 and selectively declares pts as const or non-const, depending on the VTK version:

    #ifdef VTK_USE_NEW_ID_TYPE
    vtkIdType const *pts;
    #else
    vtkIdType *pts;
    #endif

Tested on macOS and Ubuntu.
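
For context, a minimal sketch of how the guarded declaration might sit in a full traversal function; this assumes VTK_USE_NEW_ID_TYPE is defined by the build system when compiling against VTK >= 9 (testing VTK_MAJOR_VERSION from vtkVersionMacros.h would be an alternative):

    #include <vtkCellArray.h>
    #include <vtkPolyData.h>

    // Count the polygons of a vtkPolyData; compiles against VTK 8 and VTK 9.
    int CountPolys(vtkPolyData* pd)
    {
      vtkCellArray* polys = pd->GetPolys();
      polys->InitTraversal();

      vtkIdType npts;
    #ifdef VTK_USE_NEW_ID_TYPE
      vtkIdType const *pts;  // VTK >= 9: GetNextCell() takes 'vtkIdType const*&'
    #else
      vtkIdType *pts;        // VTK 8: GetNextCell() takes 'vtkIdType*&'
    #endif

      int count = 0;
      while (polys->GetNextCell(npts, pts)) {
        count++;
      }
      return count;
    }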

lennartvi commented

Hi Dave,

Thanks for working on this! Our HPC environment recently dropped support for VTK 8.x, so I had to recompile svSolver against VTK 9.x. The build got stuck on one error: VTK's exported targets link against MPI::MPI_C, which was never defined in the CMake files. I resolved it by defining the MPI::MPI_C interface link in SimVascularExternals.cmake.

Suggested solution:

In https://github.com/SimVascular/svSolver/blob/master/Code/CMake/SimVascularExternals.cmake#36, before VTK is found, add an explicit MPI::MPI_C interface target:

# Define the MPI::MPI_C imported target that VTK 9's exported targets
# expect, using the include path and libraries found for the C bindings.
add_library(MPI::MPI_C INTERFACE IMPORTED)
set_target_properties(MPI::MPI_C PROPERTIES
  INTERFACE_INCLUDE_DIRECTORIES "${MPI_C_INCLUDE_PATH}"
  INTERFACE_LINK_LIBRARIES "${MPI_C_LIBRARIES}"
)

Steps to reproduce and resolve:

Environment: RHEL 8, GCC 12.3.0, OpenMPI 4.1.5, CMake 3.26.5, mpicc

During the CMake configure step, the find_package(VTK) call produced the following error:

CMake Error at /sw/arch/RHEL8/EB_production/2023/software/VTK/9.3.0-foss-2023a/lib64/cmake/vtk-9.3/VTK-targets.cmake:551 (set_target_properties):
  The link interface of target "VTK::mpi" contains:

    MPI::MPI_C

  but the target was not found.  Possible reasons include:

    * There is a typo in the target name.
    * A find_package call is missing for an IMPORTED target.
    * An ALIAS target is missing.

Call Stack (most recent call first):
  /sw/arch/RHEL8/EB_production/2023/software/VTK/9.3.0-foss-2023a/lib64/cmake/vtk-9.3/vtk-config.cmake:145 (include)
  CMake/SimVascularExternals.cmake:40 (find_package)
  CMakeLists.txt:106 (include)

The lines in VTK's exported targets file (VTK-targets.cmake, included by vtk-config.cmake) that produce the error are:

# Create imported target VTK::mpi
add_library(VTK::mpi INTERFACE IMPORTED)

set_target_properties(VTK::mpi PROPERTIES
  INTERFACE_COMPILE_DEFINITIONS "MPICH_SKIP_MPICXX;OMPI_SKIP_MPICXX;MPI_NO_CPPBIND;_MPICC_H"
  INTERFACE_LINK_LIBRARIES "MPI::MPI_C"
)

Perhaps this link usage is new in VTK 9? Either way, after some GPT consulting, I resolved the error with the MPI::MPI_C interface link in SimVascularExternals.cmake described above. I suggest updating that file, though the target definition could perhaps live in another CMake file instead; I'm not sure what good practice is here.


ktbolt commented Dec 5, 2024

@lennartvi Thanks for looking into this!

I don't know why VTK::mpi is needed, but if that fixes the build then so be it.

Note that we are planning to terminate svSolver development and will soon archive this repository.

Our primary solver will be svMultiPhysics; see the documentation.

The svMultiPhysics solver can be run from a Docker container, so you won't need to build it on your HPC system.
