Lightweight polymorphic Fortran HDF5 interface: h5write(), h5read()
Use HDF5's built-in casting for real32 <=> real64 and int32 <=> int64. We no longer manually cast int <=> real, reducing code complexity and memory usage. This also helps avoid inefficient user code where int <=> real casting was accidental.
The efficiency gain and reduced code complexity outweigh the small, easily fixed set of use cases that relied on int <=> real casting. Users must now declare variable types that match the on-disk datatypes; real32 <=> real64 and int32 <=> int64 remain fine and are handled inside the HDF5 library.
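A minimal Fortran sketch of the new casting rules (file and dataset names are hypothetical, and the open/read/close method names are illustrative of the object API, not a definitive signature):

```fortran
program casting_demo
!! Sketch: real32 <=> real64 casting is handled inside HDF5 itself;
!! int <=> real is no longer cast by this library.
use h5fortran, only : hdf5_file
use, intrinsic :: iso_fortran_env, only : real64, int32
implicit none

type(hdf5_file) :: h5f
real(real64) :: x
integer(int32) :: i

call h5f%open('data.h5', action='r')

!! OK: the disk dataset may be real32; HDF5 casts to real64 in memory.
call h5f%read('/x', x)

!! int <=> real: read into a matching integer type, then convert
!! explicitly in user code if a real value is needed.
call h5f%read('/i', i)

call h5f%close()
end program
```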
About 1,000 lines of code were deduplicated/eliminated, one of the largest reductions in this project's history.
Added CI and local code coverage testing with GCC + gcovr. This revealed that the pathlib module was unused, so pathlib was removed.
CI tests HDF5 builds, both static and shared, across Linux, macOS, and Windows.
Set RPATH in executables for shared-library builds.
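A hedged sketch of one way to do this in CMake (the library subdirectory is an assumption; details are project-dependent):

```cmake
# Embed the install-time library directory in executables' RPATH so shared
# HDF5 libraries are found at runtime without LD_LIBRARY_PATH tweaks.
set(CMAKE_INSTALL_RPATH "${CMAKE_INSTALL_PREFIX}/lib")
set(CMAKE_INSTALL_RPATH_USE_LINK_PATH true)
```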
Read character: automatically truncate at c_null_char if present, so the user need not do this themselves. Previously, if the user's string variable was longer than the on-disk string, the remainder of the user variable could be filled with garbage non-blank characters.
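A minimal standard-Fortran sketch of the truncation idea (the buffer contents here are simulated, not an actual HDF5 read):

```fortran
program trim_demo
!! Sketch: blank out everything from the C null terminator onward,
!! so trailing garbage after a short disk string disappears.
use, intrinsic :: iso_c_binding, only : c_null_char
implicit none

character(32) :: buf
integer :: i

!! Simulate a buffer longer than the disk string: bytes after the
!! null terminator may be garbage.
buf = 'hello' // c_null_char // 'GARBAGE'

i = index(buf, c_null_char)
if (i > 0) buf(i:) = ''   ! blank from the null onward

print '(a)', trim(buf)   ! prints "hello"
end program
```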
Default the install prefix to the top-level project binary directory. This makes the default install directory visible even when the project is consumed via FetchContent.
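A sketch of how such a default can be set in CMake, assuming the standard CMAKE_INSTALL_PREFIX_INITIALIZED_TO_DEFAULT check:

```cmake
# If the user did not set an install prefix, default to the top-level build
# directory so installed files are easy to find, even under FetchContent.
if(CMAKE_INSTALL_PREFIX_INITIALIZED_TO_DEFAULT)
  set(CMAKE_INSTALL_PREFIX "${CMAKE_BINARY_DIR}" CACHE PATH "install prefix" FORCE)
endif()
```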
Use CMake generator expressions to simplify and clarify logic.
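For illustration, a generator expression can replace nested if/else logic; this hypothetical snippet enables GNU Fortran runtime checks only in Debug builds:

```cmake
# One generator expression instead of nested if(CMAKE_BUILD_TYPE ...) blocks.
target_compile_options(h5fortran PRIVATE
  "$<$<AND:$<COMPILE_LANG_AND_ID:Fortran,GNU>,$<CONFIG:Debug>>:-fcheck=all>"
)
```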
Enhanced Intel CI tests to cover both Debug and Release builds.
Since the h5cc compiler wrappers are optional, search only HDF5_ROOT when it is specified; otherwise Anaconda wrappers may get picked up, causing links to fail.
The Fortran module search looks first under the HDF5 C include directory, to handle the case where the primary HDF5 library was not compiled with the Fortran interface, which would usually cause a link failure. If HDF5_ROOT is specified, hdf5.mod is searched for only under it.
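A simplified sketch of the constrained module search (variable names and path suffixes are illustrative, not the project's actual find logic):

```cmake
# When HDF5_ROOT is given, search only there (NO_DEFAULT_PATH) so stray
# Anaconda wrappers/libraries elsewhere on PATH are not picked up.
if(HDF5_ROOT)
  find_path(HDF5_MODULE_DIR NAMES hdf5.mod
    PATHS "${HDF5_ROOT}"
    PATH_SUFFIXES include include/shared
    NO_DEFAULT_PATH)
endif()
```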
Conda HDF5 typically conflicts with compilers, yet Conda is first on PATH by design. We ignore Conda only for FindHDF5 and FindZLIB, so the rest of the project can still use Python.
Previously on macOS, Anaconda Python could get in the way of finding an ABI-compatible HDF5. This update looks for Homebrew HDF5 first.
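One way to express this exclusion in CMake (a sketch; the exact subdirectories ignored are an assumption):

```cmake
# Exclude the active Conda environment from HDF5/ZLIB discovery only,
# leaving Conda's Python available to the rest of the project.
if(DEFINED ENV{CONDA_PREFIX})
  list(APPEND CMAKE_IGNORE_PATH
    "$ENV{CONDA_PREFIX}/bin" "$ENV{CONDA_PREFIX}/lib" "$ENV{CONDA_PREFIX}/include")
endif()
find_package(HDF5 COMPONENTS Fortran)
unset(CMAKE_IGNORE_PATH)   # restore for subsequent find_* calls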
Also, HDF5 is found more robustly when only parallel HDF5 is installed but not explicitly specified.