1
votes

I have some Fortran code I would like to parallelize with MPI. Apparently, the recommended way to use MPI (MPICH, in my case) from Fortran is through the mpi_f08 module (see the mpi-forum entry on the matter), but I am having trouble making it work, since the corresponding mod file is simply not created (unlike mpi.mod, which works fine but is not up to date with the Fortran standard). This discussion left me under the impression that it's because gfortran can't build the F08 bindings. Below you can see my configuration; both gfortran and MPICH have been installed through apt install on Ubuntu and should be up to date. I'm unsure about a few things:

  • Is there any way to make the Fortran 2008 MPI syntax work with gfortran? From what I have come across, the answer seems to be no, but hopefully someone knows a fix. I'm not too versed in this, so any relevant links or a more entry-level explanation would be greatly appreciated.
  • Could using a different compiler help? The Intel compiler*, maybe? I would rather stick with gfortran if that's reasonable.
  • Maybe consistency with the current standard isn't such a big deal. From your experience, would it be better to just go with the support offered by the mpi.mod module? What problems could I expect then? My application doesn't have much long-term ambition, so falling out of support some time later isn't a big problem if it works properly now.

Edit

It seems to have been a problem of using an outdated version of gfortran. This reduces my question to how to build MPICH with gfortran-10.
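
I assume the usual route is a source build along these lines; the MPICH version and install prefix below are placeholders, and the compiler variables mirror the ones visible in my current build's configure options:

# placeholder version/prefix -- the point is forcing the gfortran-10 toolchain
tar xzf mpich-3.4.tar.gz && cd mpich-3.4
./configure --prefix=$HOME/opt/mpich-gcc10 \
            CC=gcc-10 CXX=g++-10 FC=gfortran-10 F77=gfortran-10 \
            --enable-fortran=all
make -j"$(nproc)" && make install

Assuming that is roughly right, mpif90 -show from the new prefix should then print the underlying gfortran-10 command line.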


* hence the [intel-fortran] tag; feel free to remove it if you think it's redundant

Just for clarity, here is my gfortran and MPICH configuration:

pavel@pavel:~$ gfortran -v
Using built-in specs.
COLLECT_GCC=gfortran
COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-linux-gnu/7/lto-wrapper
OFFLOAD_TARGET_NAMES=nvptx-none
OFFLOAD_TARGET_DEFAULT=1
Target: x86_64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu 7.5.0-3ubuntu1~18.04' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=x86_64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-vtable-verify --enable-libmpx --enable-plugin --enable-default-pie --with-system-zlib --with-target-system-zlib --enable-objc-gc=auto --enable-multiarch --disable-werror --with-arch-32=i686 --with-abi=m64 --with-multilib-list=m32,m64,mx32 --enable-multilib --with-tune=generic --enable-offload-targets=nvptx-none --without-cuda-driver --enable-checking=release --build=x86_64-linux-gnu --host=x86_64-linux-gnu --target=x86_64-linux-gnu
Thread model: posix
gcc version 7.5.0 (Ubuntu 7.5.0-3ubuntu1~18.04) 
pavel@pavel:~$ mpiexec --version
HYDRA build details:
    Version:                                 3.3a2
    Release Date:                            Sun Nov 13 09:12:11 MST 2016
    CC:                              gcc   -Wl,-Bsymbolic-functions -Wl,-z,relro 
    CXX:                             g++   -Wl,-Bsymbolic-functions -Wl,-z,relro 
    F77:                             gfortran  -Wl,-Bsymbolic-functions -Wl,-z,relro 
    F90:                             gfortran  -Wl,-Bsymbolic-functions -Wl,-z,relro 
    Configure options:                       '--disable-option-checking' '--prefix=/usr' '--build=x86_64-linux-gnu' '--includedir=${prefix}/include' '--mandir=${prefix}/share/man' '--infodir=${prefix}/share/info' '--sysconfdir=/etc' '--localstatedir=/var' '--disable-silent-rules' '--libdir=${prefix}/lib/x86_64-linux-gnu' '--libexecdir=${prefix}/lib/x86_64-linux-gnu' '--disable-maintainer-mode' '--disable-dependency-tracking' '--with-libfabric' '--enable-shared' '--enable-fortran=all' '--disable-rpath' '--disable-wrapper-rpath' '--sysconfdir=/etc/mpich' '--libdir=/usr/lib/x86_64-linux-gnu' '--includedir=/usr/include/mpich' '--docdir=/usr/share/doc/mpich' '--with-hwloc-prefix=system' '--enable-checkpointing' '--with-hydra-ckpointlib=blcr' 'CPPFLAGS= -Wdate-time -D_FORTIFY_SOURCE=2 -I/build/mpich-O9at2o/mpich-3.3~a2/src/mpl/include -I/build/mpich-O9at2o/mpich-3.3~a2/src/mpl/include -I/build/mpich-O9at2o/mpich-3.3~a2/src/openpa/src -I/build/mpich-O9at2o/mpich-3.3~a2/src/openpa/src -D_REENTRANT -I/build/mpich-O9at2o/mpich-3.3~a2/src/mpi/romio/include' 'CFLAGS= -g -O2 -fdebug-prefix-map=/build/mpich-O9at2o/mpich-3.3~a2=. -fstack-protector-strong -Wformat -Werror=format-security -O2' 'CXXFLAGS= -g -O2 -fdebug-prefix-map=/build/mpich-O9at2o/mpich-3.3~a2=. -fstack-protector-strong -Wformat -Werror=format-security -O2' 'FFLAGS= -g -O2 -fdebug-prefix-map=/build/mpich-O9at2o/mpich-3.3~a2=. -fstack-protector-strong -O2' 'FCFLAGS= -g -O2 -fdebug-prefix-map=/build/mpich-O9at2o/mpich-3.3~a2=. -fstack-protector-strong -O2' 'build_alias=x86_64-linux-gnu' 'MPICHLIB_CFLAGS=-g -O2 -fdebug-prefix-map=/build/mpich-O9at2o/mpich-3.3~a2=. -fstack-protector-strong -Wformat -Werror=format-security' 'MPICHLIB_CPPFLAGS=-Wdate-time -D_FORTIFY_SOURCE=2' 'MPICHLIB_CXXFLAGS=-g -O2 -fdebug-prefix-map=/build/mpich-O9at2o/mpich-3.3~a2=. -fstack-protector-strong -Wformat -Werror=format-security' 'MPICHLIB_FFLAGS=-g -O2 -fdebug-prefix-map=/build/mpich-O9at2o/mpich-3.3~a2=. -fstack-protector-strong' 'MPICHLIB_FCFLAGS=-g -O2 -fdebug-prefix-map=/build/mpich-O9at2o/mpich-3.3~a2=. -fstack-protector-strong' 'LDFLAGS=-Wl,-Bsymbolic-functions -Wl,-z,relro' 'FC=gfortran' 'F77=gfortran' 'MPILIBNAME=mpich' '--cache-file=/dev/null' '--srcdir=.' 'CC=gcc' 'LIBS=' 'MPLLIBNAME=mpl'
    Process Manager:                         pmi
    Launchers available:                     ssh rsh fork slurm ll lsf sge manual persist
    Topology libraries available:            hwloc
    Resource management kernels available:   user slurm ll lsf sge pbs cobalt
    Checkpointing libraries available:       blcr
    Demux engines available:                 poll select
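
My actual program is longer, but it boils down to a minimal sketch like the one below, just enough to show how I want to call the f08 bindings:

program hello_mpi
    use mpi_f08                            ! the Fortran 2008 bindings I would like to use
    implicit none
    type(MPI_Comm) :: comm
    integer :: rank, nprocs

    call MPI_Init()
    comm = MPI_COMM_WORLD
    call MPI_Comm_rank(comm, rank)         ! with mpi_f08 the ierror argument is optional
    call MPI_Comm_size(comm, nprocs)
    print '(a,i0,a,i0)', 'rank ', rank, ' of ', nprocs
    call MPI_Finalize()
end program hello_mpi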

Trying to compile my code with mpif90 leads to:

something.f90:2:5:

  use mpi_f08
     1
Fatal Error: Can't open module file ‘mpi_f08.mod’ for reading at (1): File does not exist
compilation terminated.

2
Can you update your version of GCC and rebuild MPICH with it? gfortran 10.2 supports a lot more of Fortran 2008 than 7.5 does. – francescalus
I can try. I'll keep you updated. – P. Janecek
@francescalus Sadly, that doesn't seem to help, unless I did something wrong. How can I check which gfortran version MPICH is built with? – P. Janecek

2 Answers

5
votes

Another answer here mentioned TS 29113, which was incorporated into, and superseded by, the Fortran 2018 standard. When a TS is incorporated into a subsequent standard, some of the features described in the TS may change. I don't know specifically what might have changed in this case, but it is safer to refer to the Fortran 2018 standard rather than to the TS.

In order to provide mpi_f08, MPICH requires that the compiler install the ISO_Fortran_binding.h header file, which is described in the Fortran 2018 standard and was previously described in TS 29113. gfortran has provided ISO_Fortran_binding.h since version 9 (see https://gcc.gnu.org/gcc-9/changes.html), so I believe MPICH should be able to install mpi_f08 with gfortran-9 and later. That said, I believe gfortran's support for ISO_Fortran_binding.h required some important bug fixes in subsequent releases, so I recommend using the latest release whenever possible.
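
If you want to check whether the gfortran you are actually using ships the header, a blunt but simple probe on a Debian/Ubuntu-style layout (where the compiler's private files live under /usr/lib/gcc, as in the question's output) is:

# search gcc's install tree; the exact subdirectory depends on the gcc version
find /usr/lib/gcc -name ISO_Fortran_binding.h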

3
votes

MPICH requires the Fortran compiler to support the array descriptor of Technical Specification 29113, and this is only supported in recent versions of gfortran (GNU 10 is fine). Intel compilers have been fine for a while, FWIW.
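
For context, the feature in play is C interoperability of assumed-type, assumed-rank arguments, which is how the mpi_f08 module declares its choice buffers. A minimal sketch of that kind of declaration (not MPICH's actual source) is:

module ts29113_sketch
    implicit none
    interface
        ! Assumed-type, assumed-rank dummy argument: the TS 29113 / Fortran 2018
        ! feature mpi_f08 relies on for choice buffers. On the C side it arrives
        ! as a CFI_cdesc_t* (the array descriptor) declared in ISO_Fortran_binding.h.
        subroutine c_sink(buf) bind(c, name="c_sink")
            type(*), dimension(..), intent(in) :: buf
        end subroutine c_sink
    end interface
end module ts29113_sketch

If the compiler rejects this combination (bind(c) together with dimension(..)), it does not meet the MPICH requirement described above.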

Note that Open MPI is not as picky w.r.t. TS 29113 and does not need support for the array descriptor, so GNU 7.5 can be used to generate its mpi_f08 module.

Bottom line, you have two options w.r.t. the mpi_f08 Fortran module:

  • use a Fortran compiler that meets MPICH's expectations w.r.t. TS 29113 (e.g. GNU 10 or the Intel compilers)
  • move to Open MPI