DFT input parse error


Forum Vet
It looks like NWChem is running fine, despite the error message. The key output is "dft optimize failed". This happens either because the DFT energy did not converge within the allowed number of iterations (default is 30), or because the geometry optimization did not converge within the allowed number of steps (default is 20, I think). Look through the last part of your output file: searching for "@" will show you the optimization steps that were taken. If there is no "@", it means the DFT energy itself failed to converge.
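
If convergence is the problem, you can raise both limits in the input file before the task line. A minimal sketch (directive names as given in the NWChem DFT and DRIVER documentation; adjust the values to your system):

dft
  iterations 100    # allow more SCF cycles for the DFT energy (default 30)
end

driver
  maxiter 60        # allow more geometry optimization steps (default 20)
end

task dft optimize

Once it runs, searching the output for "@" lines lets you follow the energy at each optimization step.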

Bert


[QUOTE=Bk.ong Jun 30th 2:17 pm]Here's the latest error after recompiling NWChem with gcc.



                           number of included internuclear angles:        120
==============================================================================



------------------------------------------------------------------------
dft optimize failed 0
------------------------------------------------------------------------
------------------------------------------------------------------------
current input line :
150: TASK DFT OPTIMIZE
------------------------------------------------------------------------
------------------------------------------------------------------------
This type of error is most commonly associatated with calculations not reaching convergence criteria
------------------------------------------------------------------------
For more information see the NWChem manual at http://nwchemgit.github.io/index.php/NWChem_Documentation


For further details see manual section:
0:0:dft optimize failed:: 0
(rank:0 hostname:wn-ib-004.biruni.upm.my pid:29788):ARMCI DASSERT fail. armci.c:ARMCI_Error():260 cond:0


Last System Error Message from Task 0:: Inappropriate ioctl for device


MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 0.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.




mpiexec has exited due to process rank 0 with PID 29788 on
node wn-ib-004.biruni.upm.my exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpiexec (as reported here).