Bug in MD run of VASP 5.3
Posted: Tue Jan 28, 2014 7:43 am
Hey everyone, I am using VASP 5.3 for some MD calculations, but I am running into an error. I compiled VASP 5.3 with ifort and run it with Intel MPI (impi). A regular run works fine, but the error below appears as soon as I use the Advanced MD techniques module (MDALGO).
Does anyone know how to fix this? I can provide my Makefile if necessary.
Here is my INCAR:
SYSTEM = DIAMOND
STARTPARAMETER FOR THIS RUN:
NWRITE = 1; LPETIM=F WRITE-FLAG & TIMER
ISTART = 0 JOB : 0-NEW 1-CONT 2-SAMECUT
ELECTRONIC RELAXATION 1
ENCUT = 300 EV
NELM = 200
EDIFF = 1E-05 STOPPING-CRITERION FOR ELM
NELMIN = 4
BMIX = 2.00
ISPIN = 1
IONIC RELAXATION
NBLOCK = 1; KBLOCK = 100 INNER BLOCK; OUTER BLOCK
IBRION = 0 IONIC RELAX: 0-MD 1-QUASI-NEW 2-CG
ISIF = 3 CALCULATE STRESS WITH CONSTANT UNIT CELL VOLUME
ISYM = 0
LCORR = T HARRIS-CORRECTION TO FORCES
TEBEG = 300
TEEND = 300
SMASS = 0.1 NOSE MASS
POTIM = 2 TIME STEP IN FS
IWAVPR = 12
EDIFFG = 0.1E-4
ELECTRONIC RELAXATION 2
IALGO = 48 ALGORITHM
LDIAG = T SUB-SPACE DIAGONALISATION
LREAL = F REAL-SPACE PROJECTION
WEIMIN = 0
PREC = NORMAL
NBANDS = 256
LWAVE = .FALSE.
LCHARG = .FALSE.
APACO = 10.0 ! DISTANCE FOR P.C.
NSW = 2000
MDALGO=3
LBLUEOUT=.TRUE.
LANGEVIN_GAMMA_L=5
The error looks like the following:
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xbc1ed40 src=0xbc1ed40 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xaf9ad50, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xaf5ad50, rcounts=0xacdc400, displs=0xacdc450, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xaf9ad50 src=0xaf9ad50 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xc4add50, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xc44dd50, rcounts=0xc1cf440, displs=0xc1cf490, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xc4add50 src=0xc4add50 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xc1fae30, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xc17ae30, rcounts=0xbf0f560, displs=0xbf0f5b0, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xc1fae30 src=0xc1fae30 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xaea3e30, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xae03e30, rcounts=0xab98560, displs=0xab985b0, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xaea3e30 src=0xaea3e30 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xc38ccf0, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xc2cccf0, rcounts=0xc04e440, displs=0xc04e490, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xc38ccf0 src=0xc38ccf0 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xb712d50, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xb632d50, rcounts=0xb3b4400, displs=0xb3b4450, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xb712d50 src=0xb712d50 len=131072
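For what it's worth, my reading of the stack trace: VASP is calling MPI_Allgatherv with a send buffer that points into each rank's own segment of the receive buffer (in every block above, sbuf sits inside rbuf at the rank's displacement, and MPIR_Localcopy aborts because dst == src). The MPI standard only allows that if MPI_IN_PLACE is passed as the send buffer, and recent Intel MPI versions enforce this check. Below is a minimal C sketch of the two call forms. This is my own illustration, not VASP code: the names and sizes (chunk, rcounts, displs) are made up, and I use plain MPI_DOUBLE where VASP's call uses MPI_DOUBLE_COMPLEX.

/* Minimal sketch (not VASP code) of the aliasing that Intel MPI's
 * check rejects, and the MPI_IN_PLACE form that avoids it. */
#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    const int chunk = 8192;  /* hypothetical: elements contributed per rank */
    double *rbuf    = malloc((size_t)nprocs * chunk * sizeof(double));
    int    *rcounts = malloc(nprocs * sizeof(int));
    int    *displs  = malloc(nprocs * sizeof(int));
    for (int i = 0; i < nprocs; ++i) {
        rcounts[i] = chunk;       /* every rank contributes the same amount */
        displs[i]  = i * chunk;   /* contiguous segments in rbuf */
    }
    for (int j = 0; j < chunk; ++j)
        rbuf[(size_t)rank * chunk + j] = (double)rank;  /* this rank's data */

    /* Problematic form: the send buffer is the rank's own segment of
     * the receive buffer, exactly the dst == src aliasing the error
     * stack reports.  (Commented out so the program runs cleanly.)
     *
     *   MPI_Allgatherv(rbuf + (size_t)rank * chunk, chunk, MPI_DOUBLE,
     *                  rbuf, rcounts, displs, MPI_DOUBLE, MPI_COMM_WORLD);
     */

    /* Standard-conforming form: declare the aliasing explicitly.
     * With MPI_IN_PLACE, the send count and type are ignored. */
    MPI_Allgatherv(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL,
                   rbuf, rcounts, displs, MPI_DOUBLE, MPI_COMM_WORLD);

    free(displs); free(rcounts); free(rbuf);
    MPI_Finalize();
    return 0;
}

If it helps anyone else: I have seen setting the environment variable I_MPI_COMPATIBILITY=4 suggested as a workaround for this class of error, since it makes Intel MPI fall back to its older, pre-MPI-2.2 behaviour for aliased buffers in collectives, but I have not verified that this is the right fix here.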