Compiling NWChem 6.8.1 on Ubuntu 18.04 with OpenMPI 4.0 failed


I tried to build NWChem 6.8.1 with OpenMPI 4.0 and OpenBLAS 0.2.20 on Ubuntu 18.04.
First, I renamed the functions that were renamed in OpenMPI 4:

MPI_Errhandler_set -> MPI_Comm_set_errhandler
MPI_Type_struct -> MPI_Type_create_struct


according to the OpenMPI FAQ.
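
I did the rename tree-wide with something like the commands below (only a sketch; it assumes GNU sed and that the sources sit under $NWCHEM_TOP/src):

# replace the MPI-1 names removed in OpenMPI 4.0 throughout the source tree
# (sketch only; exact-case match, GNU sed assumed)
grep -rl 'MPI_Errhandler_set' $NWCHEM_TOP/src | xargs sed -i 's/MPI_Errhandler_set/MPI_Comm_set_errhandler/g'
grep -rl 'MPI_Type_struct' $NWCHEM_TOP/src | xargs sed -i 's/MPI_Type_struct/MPI_Type_create_struct/g'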

Then I replaced the -i8 flag with -fdefault-integer-8 in the makefiles so I could use gfortran 7.
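
That was also just a search-and-replace, roughly like this (a sketch; it assumes the flag only shows up in the config makefiles):

grep -rl -e '-i8' $NWCHEM_TOP/src/config | xargs sed -i 's/-i8/-fdefault-integer-8/g'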

The environment variables were:
export USE_NOFSCHECK=TRUE
export TCGRSH=/usr/bin/ssh
export NWCHEM_TOP=$(pwd)
export NWCHEM_TARGET=LINUX64
export NWCHEM_MODULES="all python"
export LARGE_FILES=TRUE
export ENABLE_COMPONENT=yes
export CC=mpicc
export FC=mpifort
#export FOPTIMIZE="-O0 -g -march=native -mtune=native -mavx2 -funroll-loops -fprefetch-loop-arrays -fvariable-expansion-in-unroller -ffast-math"
export FOPTIMIZE="-O3 -march=native -mtune=native -mavx2 -funroll-loops -fprefetch-loop-arrays -fvariable-expansion-in-unroller -ffast-math"
#export COPTIMIZE="-O0 -march=native -mtune=native -mavx2 -funroll-loops -ffast-math"
export COPTIMIZE="-O3 -march=native -mtune=native -mavx2 -funroll-loops -ffast-math"
export HAS_BLAS=yes
export BLAS_SIZE=8
export BLASOPT="-L/opt/OpenBLAS-0.2.20-no_threads/lib -lopenblas"
export USE_MPI=y
export USE_MPIF=y
export USE_MPIF4=y
#export ARMCI_NETWORK=MPI-PR
#export ARMCI_DEFAULT_SHMMAX=2048
export MPI_LOC=/opt/openmpi-4.0
export MPI_LIB=/opt/openmpi-4.0/lib
export MPI_INCLUDE=/opt/openmpi-4.0/include
export LIBMPI="-lpthread -L$MPI_LIB -lmpi -lmpi_usempif08 -lmpi_mpifh -ldl -Wl,--export-dynamic -lnsl -lutil"
export MRCC_METHODS=TRUE
export CCSDTQ=y
export CCSDTLR=y
export IPCCSD=y
export EACCSD=y
export PYTHONHOME=/usr
export PYTHONVERSION=2.7
export PYTHONLIBTYPE=so
export PYTHONCONFIGDIR=config-x86_64-linux-gnu
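
With these variables exported, the build itself was the standard sequence (from memory, so treat it as a sketch):

cd $NWCHEM_TOP/src
make nwchem_config
make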

The ARMCI network was detected automatically; see the configure log:

configure: **************************************************************
configure:  Aggregate Remote Memory Copy Interface (ARMCI) configured as follows:
configure: **************************************************************
configure:
configure:                 TARGET=LINUX64
configure:              MSG_COMMS=MPI
configure:             GA_MP_LIBS= -lpthread -lmpi -lmpi_usempif08 -lmpi_mpifh -ldl -lnsl -lutil
configure:          GA_MP_LDFLAGS= -L/opt/openmpi-4.0/lib -L/opt/openmpi-4.0/lib -Wl,--export-dynamic -L/opt/openmpi-4.0/lib
configure:         GA_MP_CPPFLAGS= -I/opt/openmpi-4.0/include -I/opt/openmpi-4.0/include
configure:          ARMCI_NETWORK=MPI_TS
configure:  ARMCI_NETWORK_LDFLAGS=
configure:     ARMCI_NETWORK_LIBS=
configure: ARMCI_NETWORK_CPPFLAGS=
configure:                     CC=mpicc
configure:                 CFLAGS=
configure:             ARMCI_COPT=
configure:                    CPP=mpicc -E
configure:               CPPFLAGS=
configure:                LDFLAGS=
configure:                   LIBS=-lm
configure:                  FLIBS=
configure:                     AR=ar
configure:               AR_FLAGS=cru
configure:                   CCAS=mpicc
configure:             CCAS_FLAGS=
configure:                   DEFS=-DHAVE_CONFIG_H
configure:                  SHELL=/bin/bash
configure:                MPIEXEC=/opt/openmpi-4.0/bin/mpirun -n %NP%
configure:                 NPROCS=4
configure:
configure: **************************************************************
configure:  Communication Runtime for Extreme Scale (comex) configured as follows:
configure: **************************************************************
configure:
configure:               MPI_LIBS= -lpthread -lmpi -lmpi_usempif08 -lmpi_mpifh -ldl -lnsl -lutil
configure:            MPI_LDFLAGS= -L/opt/openmpi-4.0/lib -L/opt/openmpi-4.0/lib -Wl,--export-dynamic -L/opt/openmpi-4.0/lib
configure:           MPI_CPPFLAGS= -I/opt/openmpi-4.0/include -I/opt/openmpi-4.0/include
configure:          COMEX_NETWORK=MPI_TS
configure:  COMEX_NETWORK_LDFLAGS=
configure:     COMEX_NETWORK_LIBS=
configure: COMEX_NETWORK_CPPFLAGS=
configure:                     CC=mpicc
configure:                 CFLAGS=-g -O2
configure:                    CPP=mpicc -E
configure:               CPPFLAGS=
configure:                LDFLAGS=
configure:                   LIBS=-lrt -lm
configure:                  FLIBS=
configure:                     AR=ar
configure:               AR_FLAGS=cru
configure:                   DEFS=-DHAVE_CONFIG_H
configure:                  SHELL=/bin/bash
configure:                MPIEXEC=/opt/openmpi-4.0/bin/mpirun -n %NP%
configure:                 NPROCS=4
configure:
configure:
configure: **************************************************************
configure:  Global Arrays (GA) configured as follows:
configure: **************************************************************
configure:
configure:                 TARGET=LINUX64
configure:              MSG_COMMS=MPI
configure:             GA_MP_LIBS= -lpthread -lmpi -lmpi_usempif08 -lmpi_mpifh -ldl -lnsl -lutil
configure:          GA_MP_LDFLAGS= -L/opt/openmpi-4.0/lib -L/opt/openmpi-4.0/lib -Wl,--export-dynamic -L/opt/openmpi-4.0/lib
configure:         GA_MP_CPPFLAGS= -I/opt/openmpi-4.0/include -I/opt/openmpi-4.0/include
configure:          ARMCI_NETWORK=MPI_TS
configure:  ARMCI_NETWORK_LDFLAGS=
configure:     ARMCI_NETWORK_LIBS=
configure: ARMCI_NETWORK_CPPFLAGS=
configure:                    F77=mpifort
configure:                 FFLAGS=
configure:              FFLAG_INT=-fdefault-integer-8
configure:      FFLAG_NO_LOOP_OPT=-fno-aggressive-loop-optimizations
configure:                GA_FOPT=
configure:                     CC=mpicc
configure:                 CFLAGS=
configure:      CFLAG_NO_LOOP_OPT=-fno-aggressive-loop-optimizations
configure:                GA_COPT=
configure:                    CPP=mpicc -E
configure:               CPPFLAGS=
configure:                LDFLAGS=
configure:                   LIBS=-lm
configure:                  FLIBS= -L/opt/openmpi-4.0/lib -L/usr/lib/gcc/x86_64-linux-gnu/7 -L/usr/lib/gcc/x86_64-linux-gnu/7/../../../x86_64-linux-gnu -L/usr/lib/gcc/x86_64-linux-gnu/7/../../../../lib -L/lib/x86_64-linux-gnu -L/lib/../lib -L/usr/lib/x86_64-linux-gnu -L/usr/lib/../lib -L/usr/lib/gcc/x86_64-linux-gnu/7/../../.. -lgfortran -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lquadmath -lpthread
configure:           BLAS_LDFLAGS= -L/opt/OpenBLAS-0.2.20-no_threads/lib
configure:              BLAS_LIBS= -lopenblas
configure:          BLAS_CPPFLAGS=
configure:                     AR=ar
configure:               AR_FLAGS=cru
configure:                   CCAS=mpicc
configure:             CCAS_FLAGS=
configure:                   DEFS=-DHAVE_CONFIG_H
configure:                  SHELL=/bin/bash
configure:                MPIEXEC=/opt/openmpi-4.0/bin/mpirun -n %NP%
configure:                 NPROCS=4
configure:


NWChem compiled successfully, but when I tested it with a geometry optimization of a water molecule:

start water
title "water"
charge 0
memory 8192 mb
geometry units angstrom
H         -1.40712       -0.04588       -0.21568
O         -0.73023       -1.27992        0.21522
H          0.67334       -1.24707       -0.22724
end
dft
        iterations 50
        print  kinetic_energy
        xc b3lyp
end
basis
        * library 6-31g
end
task dft optimize
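
I ran it roughly like this (the input/output file names are just examples):

mpirun -np 4 $NWCHEM_TOP/bin/LINUX64/nwchem water.nw > water.out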


it crashed with a segmentation fault:

      Screening Tolerance Information
      -------------------------------
          Density screening/tol_rho:  1.00D-10
          AO Gaussian exp screening on grid/accAOfunc:  14
          CD Gaussian exp screening on grid/accCDfunc:  20
          XC Gaussian exp screening on grid/accXCfunc:  20
          Schwarz screening/accCoul:  1.00D-08


      Superposition of Atomic Density Guess
      -------------------------------------


Program received signal SIGSEGV: Segmentation fault - invalid memory reference.

Backtrace for this error:
#0  0x7f9fe91462da in ???
#1  0x7f9fe9145503 in ???
#2  0x7f9fe89daf1f in ???
#3  0x56544037587f in ???
#4  0x565440373b56 in ???
#5  0x56544036fc20 in ???
#6  0x565440370122 in ???
#7  0x56544036e94e in ???
#8  0x56544036ee0c in ???
#9  0x5654402e6f48 in scf_vectors_guess_
        at /home/mgi/distributives/nwchem-6.8.1/src/ddscf/scf_vec_guess.F:179
#10  0x565440035733 in dft_scf_
        at /home/mgi/distributives/nwchem-6.8.1/src/nwdft/scf_dft/dft_scf.F:802
#11  0x56544003209d in dft_main0d_
        at /home/mgi/distributives/nwchem-6.8.1/src/nwdft/scf_dft/dft_main0d.F:641
#12  0x56544001fb9d in nwdft_
        at /home/mgi/distributives/nwchem-6.8.1/src/nwdft/nwdft.F:394
#13  0x56543ffa4b5b in dft_energy_gradient_
        at /home/mgi/distributives/nwchem-6.8.1/src/nwdft/dftgrad/grad_dft.F:11
#14  0x56543fdfcec1 in task_gradient_doit_
        at /home/mgi/distributives/nwchem-6.8.1/src/task/task_gradient.F:360
#15  0x56543fdff32c in task_gradient_
        at /home/mgi/distributives/nwchem-6.8.1/src/task/task_gradient.F:120
#16  0x56543ff1eba0 in driver_
        at /home/mgi/distributives/nwchem-6.8.1/src/driver/opt_drv.F:76
#17  0x56543fe001ab in task_optimize_
        at /home/mgi/distributives/nwchem-6.8.1/src/task/task_optimize.F:146
#18  0x56543fdea7d4 in task_
        at /home/mgi/distributives/nwchem-6.8.1/src/task/task.F:384
#19  0x56543fddff06 in nwchem
        at /home/mgi/distributives/nwchem-6.8.1/src/nwchem.F:313
#20  0x56543fde0550 in main
        at /home/mgi/distributives/nwchem-6.8.1/src/nwchem.F:397
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 0 on node dynamics exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

What did I do wrong? Please help me fix this problem.