LAMMPS
Large-scale Atomic/Molecular Massively Parallel Simulator
LAMMPS, the Large-scale Atomic/Molecular Massively Parallel Simulator, is a classical molecular dynamics code for materials modeling. It can simulate solid-state materials such as metals and semiconductors, soft matter such as biological materials and polymers, and coarse-grained or mesoscopic systems. LAMMPS performs parallel particle simulations at several scales, including the atomic, meso, and continuum scales. There are two ways to run LAMMPS: serial and parallel. The parallel version communicates via message passing and, on the RCC HPC cluster, can be run with either the OpenMPI or the MVAPICH2 MPI library.
Running LAMMPS Examples on RCC Systems#
LAMMPS example files are available only on the login nodes. If you would like to run an example, visit the official LAMMPS GitHub repository and copy the in.x and data.x files to your home directory (copy and paste the raw files), where x is the name of the example you want to run.
Example - Serial Job#
We will use the body example here and copy its files to the home directory first:
- Download in.body from the LAMMPS GitHub repository (copy and paste the raw file)
- Download data.body from the LAMMPS GitHub repository (copy and paste the raw file)
- Ensure that the in.body and data.body files are in the correct directory (a command-line alternative using wget is sketched below)
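As an alternative to copying and pasting in a browser, the files can be fetched directly on a login node with wget. The branch name and file paths below are assumptions based on the current repository layout and may change over time, so verify them on GitHub first:

```bash
# Fetch the raw example files from the LAMMPS GitHub repository.
# The branch ("develop") and paths are assumptions; verify them on GitHub.
wget https://raw.githubusercontent.com/lammps/lammps/develop/examples/body/in.body
wget https://raw.githubusercontent.com/lammps/lammps/develop/examples/body/data.body
```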
The following SLURM submission script will run this example using one CPU core.
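A minimal sketch of such a script is given below. The module names (gnu, lammps) and the binary name (lmp_serial) are assumptions and may differ on RCC systems; check `module avail lammps` for the exact names:

```bash
#!/bin/bash
#SBATCH --job-name="lammps_body"
#SBATCH -n 1                # a single CPU core
#SBATCH -t 00:30:00         # walltime; adjust for your problem size

# Hypothetical module names; run `module avail lammps` for the real ones.
module load gnu lammps

# Run the serial LAMMPS binary, reading the input script from stdin.
lmp_serial < in.body
```

Submit the job with sbatch from the directory containing in.body and data.body.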
Example - Parallel Job#
We will use the same body example here.
The following submission script will use 4 MPI processes to run this example.
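A sketch of the parallel script, under the same assumptions about module and binary names as the serial example above:

```bash
#!/bin/bash
#SBATCH --job-name="lammps_body_mpi"
#SBATCH -N 1                # one node
#SBATCH -n 4                # four MPI processes
#SBATCH -t 00:30:00         # walltime; adjust for your problem size

# Hypothetical module names; run `module avail` for the real ones.
module load gnu openmpi lammps

# Launch four MPI ranks, each reading the input script from stdin.
mpirun -np 4 lmp_mpi < in.body
```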
Note that if you want to use the SELF option of the jump command in a parallel job, you must change how the input file is fed to the LAMMPS binary: reading the input from stdin with the < redirection does not work with jump SELF across multiple MPI processes, so the input file must be passed with the -in command-line switch instead. Therefore, the last line of the above script should be:
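```bash
# Pass the input file with the -in switch instead of stdin redirection;
# this is required when the input script uses `jump SELF` in parallel.
# (lmp_mpi is the assumed binary name from the sketch above.)
mpirun -np 4 lmp_mpi -in in.body
```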
LAMMPS Modules#
LAMMPS supports a number of optional modules (packages) that extend or enhance its capabilities. RCC has built many of these into our three versions of LAMMPS (GNU, Intel, and PGI); the table below shows which modules are enabled in each build.
Module | GNU | Intel | PGI |
---|---|---|---|
ASPHERE | Yes | Yes | Yes |
BODY | Yes | Yes | Yes |
CLASS2 | Yes | Yes | Yes |
COLLOID | Yes | Yes | Yes |
COMPRESS | No | No | No |
CORESHELL | Yes | Yes | Yes |
DIPOLE | Yes | Yes | Yes |
GPU | No | No | No |
GRANULAR | Yes | Yes | Yes |
KIM | No | No | No |
KOKKOS | No | No | No |
KSPACE | Yes | Yes | Yes |
LATTE | No | No | No |
MANYBODY | Yes | Yes | Yes |
MC | Yes | Yes | Yes |
MEAM | Yes | Yes | Yes |
MISC | Yes | Yes | Yes |
MOLECULE | Yes | Yes | Yes |
MPIIO | No | No | No |
MSCG | No | No | No |
OPT | Yes | Yes | Yes |
PERI | Yes | Yes | Yes |
POEMS | No | No | No |
PYTHON | Yes | Yes | Yes |
QEQ | Yes | Yes | Yes |
REAX | Yes | Yes | Yes |
REPLICA | Yes | Yes | Yes |
RIGID | Yes | Yes | Yes |
SHOCK | Yes | Yes | Yes |
SNAP | Yes | Yes | Yes |
SRD | Yes | Yes | Yes |
VORONOI | Yes | Yes | Yes |
USER-ATC | No | No | No |
USER-AWPMD | No | No | No |
USER-BOCS | Yes | Yes | Yes |
USER-CGDNA | Yes | Yes | Yes |
USER-CGSDK | Yes | Yes | Yes |
USER-COLVARS | No | No | No |
USER-DIFFRACTION | Yes | Yes | Yes |
USER-DPD | Yes | Yes | Yes |
USER-DRUDE | Yes | Yes | Yes |
USER-EFF | Yes | Yes | Yes |
USER-FEP | Yes | Yes | Yes |
USER-H5MD | No | No | No |
USER-INTEL | No | Yes | No |
USER-LB | No | No | No |
USER-MANIFOLD | Yes | Yes | Yes |
USER-MEAMC | Yes | Yes | Yes |
USER-MESO | Yes | Yes | Yes |
USER-MGPT | Yes | Yes | No |
USER-MISC | Yes | Yes | Yes |
USER-MOFFF | Yes | Yes | Yes |
USER-MOLFILE | No | No | No |
USER-NETCDF | No | No | No |
USER-OMP | Yes | Yes | Yes |
USER-QMMM | No | No | No |
USER-QTB | Yes | Yes | Yes |
USER-QUIP | No | No | No |
USER-REAXC | Yes | Yes | Yes |
USER-SMD | No | No | No |
USER-SMTBQ | Yes | Yes | Yes |
USER-SPH | Yes | Yes | Yes |
USER-TALLY | Yes | Yes | Yes |
USER-UEF | Yes | Yes | Yes |
USER-VTK | Yes | Yes | Yes |
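To confirm which packages a given LAMMPS build actually includes, you can ask the binary itself: recent LAMMPS versions list the installed packages in their help output. A sketch, again assuming the module and binary names used above:

```bash
module load gnu lammps              # hypothetical module names
lmp_serial -h | grep -A 40 "Installed packages"
```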