write-failed error on large systems


Clicked A Few Times
Hi,

 I am trying to run an M06-2X/6-31G* DFT calculation on a system with 180 atoms. Since the system is large, I assigned 50 GB of memory to the job, but it still crashes with a write-failed error. Can someone please help me with this?
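
For reference, my input is set up roughly along these lines (a sketch; the 180-atom geometry and any other directives are omitted):

memory 50 gb

geometry
 # 180-atom coordinates here
end

basis
 * library 6-31G*
end

dft
 xc m06-2x
end

task dft energy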

Here is the relevant portion of the error output:

eaf_write: rc ne bytes -1999 bytes 524288
eaf_write: rc ne bytes -1999 bytes 524288
0:int2e_packed_buf_write: write failed:Received an Error in Communication
1:int2e_packed_buf_write: write failed:Received an Error in Communication

Thanks
Samar

Forum Vet
It is quite likely that you ran out of disk space.
You might want to switch to direct SCF by adding the direct keyword to the dft input section:

dft
 direct
end

https://github.com/nwchemgit/nwchem/wiki/Density-Functional-Theory-for-Molecules#direct-se...
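
With direct, the two-electron integrals are recomputed on the fly instead of being cached on disk, so the eaf_write failures from a full scratch filesystem should go away. Combined with the functional from your input, the dft block would look something like this:

dft
 xc m06-2x
 direct
end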

Clicked A Few Times
Thank you so much for the response. I decreased the memory in the NWChem script from 50 GB to 2 GB while keeping the total memory for the job across all nodes unchanged (120 GB), and that worked.
Could you please tell us a little more about how memory allocation works? Is the memory value assigned in the script per node, per core, or across all cores?
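
For reference, assuming the value in question is the memory directive at the top of the input deck, the change amounted to a single line, roughly:

# previously: memory 50 gb
memory 2 gb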
