Decreasing KPOINTS and/or ENCUT in a NEB run

Vasp transition state theory tools


roschrod
Posts: 3
Joined: Fri Mar 18, 2016 12:57 am

Decreasing KPOINTS and/or ENCUT in a NEB run

Post by roschrod »

Good morning to everyone.
I'm not a very experienced VASP user, so I'm asking for a bit of help.

Basically, I'm dealing with a supported metallic cluster (slab calculations with approximately 100 atoms). After benchmarking, I've chosen a 4x4x1 k-point mesh and an ENCUT value of 500 eV. Everything runs fine for the energy minimization; however, when performing NEB calculations (7 images, 16 processors per image), the number of processors and/or the memory available does not seem sufficient to handle the calculation. The program exits abruptly without even entering the first SCF cycle. I'm fairly sure it is memory-related, since the same calculation runs perfectly on external resources with more powerful machines.

Now to my point. Both to solve this problem and for a general speed-up, is it generally accepted to decrease the number of k-points and/or ENCUT for the NEB run only, while keeping as endpoints the structures optimized at a higher level of accuracy? If so, I guess I have to reoptimize the endpoints with the new set of parameters, right? A final consideration: since I'm dealing with an extremely localized process (bond breaking on a cluster), is it correct to argue that decreasing the k-points, even toward the extreme limit of the Gamma point, won't affect the energy barrier very much?

Thanks in advance.
graeme
Site Admin
Posts: 1999
Joined: Tue Apr 26, 2005 4:25 am

Re: Decreasing KPOINTS and/or ENCUT in a NEB run

Post by graeme »

My personal view is that many people do DFT calculations that are far too precise, in the sense that they aim to converge the total DFT energy to too tight a tolerance. Remember, the only things that matter are energy differences (barriers and binding energies). Also remember that there are relatively large errors inherent in the approximations of DFT itself.

Check the convergence of your k-point sampling and energy cutoff with respect to some relative energy (e.g. a binding energy). I would be surprised if you could not get away with soft potentials (the _s versions), an energy cutoff below 300 eV, and a 2x2x1 k-point sampling. Then, do the NEB calculation with these minimal settings.
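For concreteness, the reduced settings would look something like the sketch below (standard VASP input format; the two files are shown together here, and the particular values are the trial settings suggested above, not converged recommendations -- benchmark them against a relative energy first):

```
KPOINTS -- reduced mesh for NEB exploration
0              (automatic generation)
Gamma          (Gamma-centered grid, appropriate for a slab NxNx1 mesh)
2 2 1          (trial reduced sampling; compare against 4x4x1)
0 0 0

INCAR fragment -- reduced plane-wave cutoff
ENCUT = 300
```

The soft potentials are chosen when building the POTCAR, by taking the _s variant of an element's potential where one is provided in the potential library.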

Two additional points: (i) going from a 500 eV cutoff to 300 eV and a 4x4x1 to 2x2x1 sampling should reduce the cost of your calculation by about an order of magnitude and (ii) it is trivial to increase the precision once you have found the important minima and saddle points; finding these critical points with as inexpensive settings as possible is the way to go.
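The order-of-magnitude estimate in point (i) can be checked with back-of-the-envelope scaling: the cost of an SCF step grows roughly linearly with the number of k-points and roughly as ENCUT^(3/2) (the size of the plane-wave basis). Ignoring symmetry reduction of the k-point set, this gives

```latex
\frac{\mathrm{cost}(4\times4\times1,\ 500\ \mathrm{eV})}{\mathrm{cost}(2\times2\times1,\ 300\ \mathrm{eV})}
\approx \frac{16}{4}\times\left(\frac{500}{300}\right)^{3/2}
\approx 4 \times 2.15 \approx 9
```

i.e. close to a factor of ten, consistent with the estimate above.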

Re: Decreasing KPOINTS and/or ENCUT in a NEB run

Post by roschrod »

Thank you.

But regarding your last point: I didn't mean increasing the precision once the saddle point and minima were found, but rather the opposite, that is, decreasing the k-points in a NEB calculation whose endpoints were obtained with a more accurate set of parameters.

Re: Decreasing KPOINTS and/or ENCUT in a NEB run

Post by graeme »

Yes, I understand. I'm simply advocating determining the endpoints and saddles with the most modest computational settings possible. This lets you explore the potential energy surface as quickly as possible, so that you can understand the important physics/chemistry. Then, once you know which reactions are most important, it is trivial to increase the computational settings and reconverge your minima and saddles at higher precision.

Re: Decreasing KPOINTS and/or ENCUT in a NEB run

Post by roschrod »

I understand your point of view. Thank you.