"requirements" should document that libh5bshuf.so is required for certain HDF5 files #27
@KayDiederichs thanks for flagging this - it should not be needed; bitshuffle should be compiled in. I now have an M1 Mac, so I will push this back up the to-do list. Also, does this extra line make it work for you? I am also aware that @ndevenish has a cmake build coming - #26 (this repo could do with some attention).
Thanks, Graeme. I tried the extra -noshlib option on x86_64 Linux, but it makes no difference: /usr/local/hdf5/lib/plugin/libh5bshuf.so is still needed.
I'll try to have a look at this today. Are there public [XDS] M1 Mac builds yet? @graeme-winter, I believe you have a test copy you could forward me otherwise?
It appears to already be online at https://xds.mr.mpg.de/html_doc/downloading.html
Ahh, I looked there, but I was being dumb: I read "(emulated on Apple silicon)" and just stopped.
@KayDiederichs, I can't reproduce this with our datasets. The reason we don't believe the plugin should normally be required appears to be that, when it can, durin reads the data chunks directly and uses its internal bitshuffle to decompress them manually. That appears to be controlled here: Line 878 in 5d0b7bd
which appears to be gated behind a check for the file structure layout:

```c
if (H5Lexists(visit_result->nxdetector, "data_000001", H5P_DEFAULT) > 0) {
    ds_prop_func = &get_dectris_eiger_dataset_dims;
} else if (H5Lexists(visit_result->nxdetector, "data", H5P_DEFAULT) > 0) {
    ds_prop_func = &get_nxs_dataset_dims;
} else if (H5Lexists(visit_result->nxdata, "data_000001", H5P_DEFAULT) > 0) {
    ds_prop_func = &get_dectris_eiger_dataset_dims;
} else if (H5Lexists(visit_result->nxdata, "data", H5P_DEFAULT) > 0) {
    ds_prop_func = &get_nxs_dataset_dims;
} else {
    ERROR_JUMP(-1, done, "Could not locate detector dataset");
}
```

so - I guess if you have an h5 file that only has a @graeme-winter, does this sound about right?
Thanks for the explanation! Wolfgang and I have been looking at a dataset consisting of xxx_master.h5 and xxx_data_000005.h5. Durin worked well on my computers (which happen to have a long-forgotten /usr/local/hdf5/lib/plugin/libh5bshuf.so from 2015) but not on his (which don't). Since it took us a few days to realize that this difference was responsible for the failure, it would be good to document it.
Quick suggestion - to avoid providing a separate bshuf filter:
Ah, I think I was both unclear and slightly misunderstood. I was referring to the internal structure of the HDF5 file:
I think I thought that
Hmm, this sounds like a lot less work than I had anticipated, and it would be nice to just resolve the problem without having to have the filters set up. (Alternatively, I think that hdf5plugin (github, anaconda) does a lot of the work to get the plugin set compiling on most regular platforms, so pulling their binaries might work.)
Apple M1 is not a regular platform, and I could not find a libh5bshuf.so for it (and it must then also cooperate with the gcc-12-built durin, rather than with something compiled with Apple's Clang compiler). See also issue #24.
Well, it's a regular platform in conda-forge terms, and https://anaconda.org/conda-forge/hdf5plugin/files has osx-arm64 builds. (Well, it's almost a regular platform: it's cross-compiled, but it works well enough for DIALS.) All the conda-forge stuff is, however, compiled with (non-Apple) clang, if that is an issue. FWIW, on my M1, durin built with the current durin-main-branch. I'm not suggesting any of this as a good solution, but if the options are struggling to build manually or using a prebuilt binary, then the latter seems better than not being able to analyse data. We should definitely try to handle this (common) case better here, though.
Thanks for pointing to that URL! My Google-fu didn't find it. I downloaded libh5bshuf.dylib and it also works for me.
@fleon-psi - yes, we were discussing this earlier - though, as @ndevenish pointed out, there are also some "routing" questions about how we deal with Diamond data versus more "native" (e.g. DECTRIS file writer) data.
It took me some time to realize that for some datasets, libh5bshuf.so must be present in /usr/local/hdf5/lib/plugin, or in the directory pointed to by the environment variable HDF5_PLUGIN_PATH.
Could this please be documented?
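For the documentation, the workaround being asked for amounts to one environment variable. A minimal sketch, assuming the conventional plugin directory (adjust the path to wherever libh5bshuf.so actually lives on your system):

```shell
# Tell HDF5's dynamic-filter loader where to find libh5bshuf.so,
# so bitshuffle-compressed datasets can be decompressed.
export HDF5_PLUGIN_PATH=/usr/local/hdf5/lib/plugin
echo "$HDF5_PLUGIN_PATH"
```

HDF5 consults HDF5_PLUGIN_PATH before its built-in default plugin directory when a dataset needs a dynamically loaded filter.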
I am trying to compile bitshuffle-0.4.2 on M1 Apple, but I am having a hard time; it seems to want to compile for x86_64 Apple... If somebody could prepare such a library, durin could be used natively on M1 Apple.