Weird memory requirement




chelman
Newbie
Posts: 13
Joined: Fri Mar 11, 2011 4:58 pm
License Nr.: 5-793
Location: Buenos Aires, Argentina

Weird memory requirement

#1 Post by chelman » Fri Nov 08, 2013 8:46 pm

There is a line in the OUTCAR file which says:
"total amount of memory used by VASP on root node xxx kB"
where xxx is a number.
If I run in parallel on 2 quad-core machines (8 cores for VASP) with NPAR = 8, then xxx = 1341434.
But if I run the SAME system in parallel on 4 quad-core machines (16 cores for VASP) with NPAR = 16, then xxx = 2085985.
NPAR is the ONLY difference between these two runs!
What I don't understand is this: if I have more cores (nodes) to run on, why is the memory requirement bigger? It seems like more cores need more memory. Shouldn't it be the other way around?
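
In case it helps, here is a minimal Python sketch (my own helper, not part of VASP) that pulls that memory line out of each OUTCAR so the two runs can be compared side by side; the regex is deliberately loose since the exact wording of the line varies a bit between VASP versions:

# Extract the "total amount of memory used by VASP ..." line from one
# or more OUTCAR files and print the reported kB value on the root node.
import re
import sys

PATTERN = re.compile(r"total amount of memory used by VASP.*?(\d+)")

def root_node_memory_kb(outcar_path):
    """Return the root-node memory (in kB) reported in an OUTCAR, or None."""
    with open(outcar_path) as f:
        for line in f:
            match = PATTERN.search(line)
            if match:
                return int(match.group(1))
    return None

if __name__ == "__main__":
    for path in sys.argv[1:]:
        print(f"{path}: {root_node_memory_kb(path)} kB on root node")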

Thanks!!

ledssiul
Newbie
Posts: 8
Joined: Sun May 16, 2010 9:10 pm

Weird memory requirement

#2 Post by ledssiul » Mon Nov 11, 2013 3:35 am

I think this behavior is normal, and it is already explained in the VASP guide. Take a look at the meaning of the NPAR and LPLANE variables:

http://cms.mpi.univie.ac.at/vasp/guide/node138.html

You will find that the optimum setting of these two variables depends a lot on the type of machine you are running on.
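
To make the guide's rule of thumb concrete: it suggests starting with NPAR close to the square root of the number of cores. A tiny illustrative sketch of that (my own, only a first guess; the best value still depends on the machine):

# Pick the divisor of the core count closest to sqrt(ncores), which is
# the usual starting point for NPAR suggested by the VASP guide.
import math

def suggest_npar(ncores):
    """Return the divisor of ncores closest to sqrt(ncores)."""
    divisors = [d for d in range(1, ncores + 1) if ncores % d == 0]
    return min(divisors, key=lambda d: abs(d - math.sqrt(ncores)))

for n in (8, 16, 64):
    print(f"{n:3d} cores -> NPAR = {suggest_npar(n)}")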

Hope it helps,

Regards,

Luis

chelman
Newbie
Posts: 13
Joined: Fri Mar 11, 2011 4:58 pm
License Nr.: 5-793
Location: Buenos Aires, Argentina

Weird memory requirement

#3 Post by chelman » Thu Nov 14, 2013 10:31 pm

Thanks for your help, but I am still stuck.
Now I am asking the community (especially the administrators):
Is there any way to predict (estimate) the memory requirement per node before running a job?
Is there any calculation we can do to estimate the memory?

Perhaps this question is too general; for the particular case I'm working on, it can be restated as:

I have already converged a spin-polarized calculation and I want to perform a spin-orbit one. Knowing the memory footprint of the spin-polarized calculation, can I predict the size of the spin-orbit one?
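
To make the question concrete, here is the back-of-the-envelope sketch I would start from (my own guess, not an official VASP formula): treat the plane-wave orbital array, stored in complex double precision (16 bytes per coefficient), as the dominant cost, and scale it for the spin-orbit case, where the orbitals become two-component spinors and NBANDS is typically doubled. Does this kind of scaling argument make sense?

# Rough estimate of orbital storage only (ignores FFT work arrays,
# projectors, charge density, etc.); all numbers below are hypothetical.
def orbital_memory_gb(nkpts, nbands, npw, spin_channels=1, spinor=1):
    """Estimate plane-wave orbital storage in GB.

    nkpts         -- irreducible k-points (NKPTS in OUTCAR)
    nbands        -- NBANDS
    npw           -- plane-wave coefficients per orbital (from OUTCAR)
    spin_channels -- 2 for a collinear ISPIN=2 run
    spinor        -- 2 for a noncollinear / spin-orbit run
    """
    coefficients = nkpts * nbands * npw * spin_channels * spinor
    return coefficients * 16 / 1024**3

# Collinear, spin-polarized reference run (hypothetical numbers):
collinear = orbital_memory_gb(nkpts=60, nbands=200, npw=20000, spin_channels=2)
# Spin-orbit run: spinor orbitals and roughly doubled NBANDS; the reduced
# symmetry of an SOC run may also increase nkpts.
soc = orbital_memory_gb(nkpts=60, nbands=400, npw=20000, spinor=2)
print(f"collinear ~{collinear:.2f} GB, spin-orbit ~{soc:.2f} GB")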
