Is there a systematic way of finding out how much memory is needed?


Click here for full thread
Forum Vet
Not clear; it seems to be system-related. I would try to reduce the memory footprint. The output suggests you do not need that much memory in the first place.

Doing the numbers from the info, it looks like you are running on 2 processor cores, with each core on a different node connected by InfiniBand. How many cores and how much memory do you have per node? You may be able to run this on a single node.
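Doing those numbers: GA/MA sizes in the log quoted below are reported in doubles (8 bytes each), so a quick conversion shows how the available and estimated peak GA space compare in MB. This is just an arithmetic sketch using the figures from the log:

```python
BYTES_PER_DOUBLE = 8  # GA/MA report sizes as counts of 64-bit doubles

def doubles_to_mb(n_doubles):
    """Convert a count of doubles to mebibytes."""
    return n_doubles * BYTES_PER_DOUBLE / 1024**2

print(f"Available GA space: {doubles_to_mb(524244319):.0f} MB")  # ~4000 MB
print(f"Estimated peak GA:  {doubles_to_mb(182779852):.0f} MB")  # ~1394 MB
```

So the estimated peak GA usage is only about a third of what is available, which is why the allocation itself looks fine.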

Bert


Quote: Yesint, Nov 18th, 8:36 am
I understand the theory, but in practice I still can't get it working. Currently I have:

memory total 4000 mb

It runs for a few hours and then fails. The end of the log is the following:


            Memory Information
            ------------------
Available GA space size is          524244319 doubles
Available MA space size is           65513497 doubles
Length of a trial vector is              9864
Algorithm : Incore multiple tensor contraction
Estimated peak GA usage is          182779852 doubles
Estimated peak MA usage is               6600 doubles

3 smallest eigenvalue differences (eV)

 No. Spin  Occ  Vir  Irrep   E(Vir)    E(Occ)   E(Diff)
   1    1   72   73  a       -0.071    -0.208     3.744
   2    1   71   73  a       -0.071    -0.239     4.578
   3    1   70   73  a       -0.071    -0.245     4.747



 Entering Davidson iterations
Restricted singlet excited states

 Iter   NTrls   NConv    DeltaV     DeltaE      Time   
---- ------ ------ --------- --------- ---------
0: error ival=-1
(rank:0 hostname:mesocomte68 pid:30430):ARMCI DASSERT fail. ../../ga-5-1/armci/src/devices/openib/openib.c:armci_server_rdma_strided_to_contig():3239 cond:(rc==0)


As far as I can see from the Memory Information section, I have plenty of free memory, but it still fails. Could you please tell me what's wrong? I also wonder what armci_server_rdma_strided_to_contig() is...
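For context on the `memory total 4000 mb` input line above: NWChem divides the total between stack, heap, and global (GA) memory, and the directive can also set the components explicitly. A sketch of the two forms of the directive, assuming standard NWChem input syntax (the explicit split values here are illustrative, not a recommendation):

```
# simple form: NWChem divides the total among stack, heap, and global
memory total 4000 mb

# explicit form: set each component yourself (illustrative numbers)
memory stack 1000 mb heap 200 mb global 2800 mb
```

Reducing the total (or the global component) is one way to try Bert's suggestion of a smaller memory footprint.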