Get error: Last System Error Message from Task 0:: No such file or directory


Hi all,

NWChem 6.8 was recently built on the computing cluster at my institution, and I am running a simple test input file from the NWChem manual just to make sure everything works. I am attempting to run the following input on a single core:


scratch_dir /n/home03/gstec/tests
permanent_dir /n/home03/gstec/tests

start h2o_freq
charge 1
geometry units angstroms
 O       0.0  0.0  0.0
 H       0.0  0.0  1.0
 H       0.0  1.0  0.0
end
basis
 H library sto-3g
 O library sto-3g
end
scf
 uhf; doublet
 print low
end
title "H2O+ : STO-3G UHF geometry optimization"
task scf optimize
basis
 H library 6-31g**
 O library 6-31g**
end
title "H2O+ : 6-31g** UMP2 geometry optimization"
task mp2 optimize
mp2; print none; end
scf; print none; end
title "H2O+ : 6-31g** UMP2 frequencies"
task mp2 freq


I have run this exact input on my personal computer and it runs fine (granted, SCF convergence is not reached, but it terminates normally). On the cluster, I submit it through the Slurm queue with the following command:


srun -n 1 nwchem water.nw >& test.log
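
(For context, that srun call is wrapped in an ordinary sbatch script roughly along these lines; the partition and module names below are placeholders for whatever our local setup actually provides:)

#!/bin/bash
#SBATCH -n 1                 # one MPI task
#SBATCH -t 00:30:00          # walltime
#SBATCH -p serial_requeue    # placeholder partition name
#SBATCH --mem=4000           # memory in MB

# placeholder module names; whatever provides the GCC 7.1.0 / OpenMPI 2.0.1 toolchain and NWChem 6.8
module load gcc/7.1.0 openmpi/2.0.1 nwchem/6.8

srun -n 1 nwchem water.nw >& test.log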


With that command, I get the following error at the end of the output:

  

                                 NWChem SCF Module
                                 -----------------


                      H2O+ : STO-3G UHF geometry optimization



  ao basis        = "ao basis"
  functions       =     7
  atoms           =     3
  alpha electrons =     5
  beta  electrons =     4
  charge          =   1.00
  wavefunction    = UHF
  input vectors   = atomic
  output vectors  = /n/home03/gstec/tests/h2o_freq.movecs
  use symmetry    = T
  symmetry adapt  = T

Last System Error Message from Task 0:: No such file or directory
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 4 DUP FROM 0
with errorcode 11.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------


Details of the cluster and conditions under which it was built:
CentOS7
OpenMPI 2.0.1
GCC Version 7.1.0
Intel Math Kernel Library 2017.2.174

Environment variables used for compiling:

export NWCHEM_TOP="$FASRCSW_DEV"/rpmbuild/BUILD/%{name}-%{version}-release/
export USE_MPI=y
export NWCHEM_TARGET=LINUX64  
export USE_PYTHONCONFIG=y  
export PYTHONVERSION=2.7
export ARMCI_NETWORK=OPENIB
export BLASOPT="-L${MKL_HOME}/lib/intel64 -lmkl_intel_ilp64 -lmkl_core -lmkl_sequential -lpthread -lm"
export SCALAPACK="-L${MKL_HOME}/lib/intel64 -lmkl_scalapack_ilp64 -lmkl_intel_ilp64 -lmkl_core -lmkl_sequential -lmkl_blacs_intelmpi_ilp64 -lpthread -lm"
export PYTHONHOME=/usr
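
For completeness: I did not perform the build myself, but as far as I can tell from the spec file it followed the standard NWChem build sequence with the variables above, roughly:

cd $NWCHEM_TOP/src
make nwchem_config NWCHEM_MODULES="all python"   # configure, including Python support
make FC=gfortran                                 # compile with the GCC 7.1.0 Fortran compiler

(The exact invocation in the rpmbuild spec may differ slightly.)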


Any information on this error and how to resolve it would be greatly appreciated. Please let me know if you need any other information from me.