undefined reference to '_gfortran_copy_string'


The good news is that the flaw in the parallel version is not fatal. That is, I've carefully compared the outputs of an example run (the md/ache set) across three cases: the NWChem version bundled with ECCE (1 processor), my compilation (1 processor, no error reports), and my compilation on 4 processors. The results are identical to within round-off error.
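
To automate that comparison, a rough script like the following does the job (file names are placeholders, and it assumes the values of interest appear as the last field on lines matching 'Total'; note that awk won't parse Fortran 'D' exponents, only 'E'):

grep 'Total' serial.out | awk '{print $NF}' > a.tmp
grep 'Total' parallel.out | awk '{print $NF}' > b.tmp
# flag any pair of values that differs by more than round-off
paste a.tmp b.tmp | awk '{d = $1 - $2; if (d < 0) d = -d; if (d > 1e-8) printf "line %d: %s vs %s\n", NR, $1, $2}'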

The symptom is that all four processors are active at first, then three drop out while the remaining one runs at 100% until I stop it with Ctrl-C.
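
A quick way to see where that surviving process is stuck is to attach a debugger to it (the PID below is a placeholder; get the real one from top):

top                # note the PID of the nwchem process pinned at 100% CPU
gdb -p 12345       # attach to that PID
(gdb) bt           # the backtrace shows where the rank is spinning
(gdb) detach
(gdb) quit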

The bad news is that I re-compiled (apparently successfully) for MPI (OpenMPI on my system), but I get MPI messaging errors when I try to run it. For example:

bobw@winter-linux: ~...md/ache_6_mpi [216]> mpirun -np 4 $NWCHEM_TOP/bin/LINUX64/nwchem mache.nw
mpirun -np 4 /usr/local/lib/NWChem/nwchem-6.1-mpi/bin/LINUX64/nwchem mache.nw
argument  1 = mache.nw
0:Segmentation Violation error, status=: 11
1:Segmentation Violation error, status=: 11
(rank:1 hostname:winter-linux pid:28340):ARMCI DASSERT fail. ../../ga-5-1/armci/src/common/signaltrap.c:SigSegvHandler():310 cond:0
2:Segmentation Violation error, status=: 11
(rank:2 hostname:winter-linux pid:28341):ARMCI DASSERT fail. ../../ga-5-1/armci/src/common/signaltrap.c:SigSegvHandler():310 cond:0
3:Segmentation Violation error, status=: 11
(rank:3 hostname:winter-linux pid:28342):ARMCI DASSERT fail. ../../ga-5-1/armci/src/common/signaltrap.c:SigSegvHandler():310 cond:0
Last System Error Message from Task 1:: No such file or directory
Last System Error Message from Task 2:: No such file or directory
Last System Error Message from Task 3:: No such file or directory


MPI_ABORT was invoked on rank 3 in communicator MPI COMMUNICATOR 4 DUP FROM 0
with errorcode 11.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
.
.
.
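
For reference, a typical OpenMPI build environment for NWChem 6.1 looks roughly like the following; the OpenMPI prefix and the exact LIBMPI library list vary by distribution, so treat these as illustrative rather than a record of my actual settings:

export NWCHEM_TOP=/usr/local/lib/NWChem/nwchem-6.1-mpi
export NWCHEM_TARGET=LINUX64
export USE_MPI=y
export MPI_LOC=/usr/lib/openmpi          # adjust to your OpenMPI install prefix
export MPI_INCLUDE=$MPI_LOC/include
export MPI_LIB=$MPI_LOC/lib
export LIBMPI="-lmpi_f90 -lmpi_f77 -lmpi -lpthread"
cd $NWCHEM_TOP/src
make nwchem_config NWCHEM_MODULES=all
make

Two sanity checks that catch one common cause of ARMCI segfaults (a binary linked against one MPI while mpirun comes from another):

mpirun --version
ldd $NWCHEM_TOP/bin/LINUX64/nwchem | grep -i mpi

Removing the stack limit (ulimit -s unlimited) before running is also often suggested for ARMCI segfaults.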