8:40:17 AM PDT - Tue, Jul 3rd 2012
These are compiler problems that are beyond our control.
1. You will have to hack the makefile. Take a look at the GNUmakefile in the directory where this file lives (src/tce/ccsdtq_lambda). You will see that all the .o files are listed in OBJ_OPTIMIZE. Simply take the offending file out of that list and add it to another variable, OBJ, i.e. OBJ = ccsdtq_lambda2_18_2.o, and it will now be compiled at a lower level of optimization (see the first sketch after this list). Don't do this with all the files, as it will hurt the performance of the code.
2. It's a compiler problem. This file is a single but complex routine (it's the complexity of the do-loops, not the size of the file, that creates the optimization problem) and there is no way to split it. The only way out is to reduce the optimization level or switch to a different compiler.
3. Same as 2. You could try setting FOPTIONS += -O1 (or -O0) at the end of the GNUmakefile (see the second sketch below).
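A minimal sketch of the edit from answer 1, assuming the usual two-variable layout of NWChem TCE makefiles (the real OBJ_OPTIMIZE list in src/tce/ccsdtq_lambda is much longer; the "..." stands for the files already listed there):

    OBJ_OPTIMIZE = ...                  # the other .o files, with ccsdtq_lambda2_18_2.o taken out
    OBJ = ccsdtq_lambda2_18_2.o         # built with the plain FOPTIONS flags instead of FOPTIMIZE

As I understand the NWChem build system, objects in OBJ_OPTIMIZE are compiled with the aggressive FOPTIMIZE flags while objects in OBJ get only FOPTIONS, which is why the moved file ends up at a lower optimization level.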
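And for answer 3, a sketch of the directory-wide fallback, appended as the last line of the same GNUmakefile, after its include lines (the include paths shown here are the usual NWChem ones and may differ between releases):

    include ../../config/makefile.h
    include ../../config/makelib.h

    # lower the optimization for every file in this directory;
    # most compilers honor the last -O flag they are given
    FOPTIONS += -O1

This trades performance across the whole directory for a working build, so the per-file change in answer 1 is preferable when only one file misbehaves.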
Bert
Quote: Marcinz, Jul 3rd 9:37 am
Hello,
Including these modules in the compilation of either NWChem 6.1 or 6.0 produces this error:
Fatal compilation error: Out of memory asking for 8192.
compilation aborted for ccsdtq_lambda2_18_2.F (code 1)
for big .F files like the one mentioned here. This happens when using the ifort 11.0.x and 11.1.x versions of the compiler with -O3. Using -O2 solves the problem for 11.1.x, but not for 11.0.x. Using 12.1.x solves the problem completely, even with -O3. ulimit -d is unlimited.
My questions would be:
1) How could I lower the optimization flag for just those particular files which are too big?
2) How can I keep the -O3 flag for the 11.0.x compiler and still fix the problem?
3) Can I use -O2 for 11.0.x and still fix the problem?
I hope someone has some more info on this. Thanks! Regards.