1. Easy install#

There are various easy methods to install DeePMD-kit. Choose the one you prefer. If you want to build it yourself, jump to the next two sections.

After the easy installation, DeePMD-kit (dp) and LAMMPS (lmp) will be available to execute. You can try dp -h and lmp -h to see the help messages. mpirun is also available in case you want to train models or run LAMMPS in parallel.
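For example, a quick post-installation check might look like this (output varies with the installed version):

dp -h
lmp -h
mpirun --version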

Note

The off-line packages and conda packages require the GNU C Library 2.17 or above. The GPU version requires a compatible NVIDIA driver to be installed in advance. It is possible to force conda to override the detection during installation, but these requirements still apply at runtime. You can refer to the DeepModeling conda FAQ for more information.

Note

Python 3.9 or above is required for the Python interface.

1.1. Install off-line packages#

Both CPU and GPU versions of the offline packages are available on the Releases page.

Some packages are split into two files due to the size limit of GitHub. One may merge them into one after downloading:

cat deepmd-kit-2.2.9-cuda118-Linux-x86_64.sh.0 deepmd-kit-2.2.9-cuda118-Linux-x86_64.sh.1 > deepmd-kit-2.2.9-cuda118-Linux-x86_64.sh
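The merged file is a shell installer; running it and following its prompts installs DeePMD-kit into a prefix of your choice, such as the /path/to/deepmd-kit used below (a sketch; the file name matches the example above):

sh deepmd-kit-2.2.9-cuda118-Linux-x86_64.sh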

One may enable the environment using

conda activate /path/to/deepmd-kit
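After activation, the bundled executables should resolve to the installation prefix, which can be verified with (a quick check):

which dp
which lmp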

1.2. Install with conda#

DeePMD-kit is available with conda. Install Anaconda, Miniconda, or miniforge first. You can refer to the DeepModeling conda FAQ for how to set up a conda environment.

1.2.1. conda-forge channel#

DeePMD-kit is available on the conda-forge channel:

conda create -n deepmd deepmd-kit lammps horovod -c conda-forge
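Then activate the environment before using the tools (the name deepmd matches the -n option above):

conda activate deepmd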

The supported platforms include Linux x86-64, macOS x86-64, and macOS arm64. Read the conda-forge FAQ to learn how to install CUDA-enabled packages.
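For example, on a machine where no GPU is visible at install time (such as a login node), a CUDA-enabled build can typically be selected by overriding the detected CUDA version, following the mechanism described in the conda-forge FAQ (a sketch; the CUDA version is an example):

CONDA_OVERRIDE_CUDA="12.0" conda create -n deepmd deepmd-kit lammps horovod -c conda-forge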

1.2.2. Official channel (deprecated)#

Danger

Deprecated since version 3.0.0: The official channel has been deprecated since 3.0.0, due to the challenging work of building dependencies for multiple backends. Old packages will still be available at https://conda.deepmodeling.com. Maintainers will build packages in the conda-forge organization together with other conda-forge members.

1.3. Install with docker#

Docker images for DeePMD-kit are available from the GitHub Container Registry (ghcr.io/deepmodeling/deepmd-kit).

To pull the CPU version:

docker pull ghcr.io/deepmodeling/deepmd-kit:2.2.8_cpu

To pull the GPU version:

docker pull ghcr.io/deepmodeling/deepmd-kit:2.2.8_cuda12.0_gpu
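The pulled image can then be used to run the bundled tools directly, for example (a sketch, assuming dp is on the PATH inside the image; the GPU image additionally needs --gpus all and the NVIDIA Container Toolkit):

docker run --rm ghcr.io/deepmodeling/deepmd-kit:2.2.8_cpu dp -h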

1.4. Install Python interface with pip#

Create a new environment first, and then execute the commands below for the backend and hardware you want to use.
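For example, an isolated virtual environment can be created and activated like this (a minimal sketch; the environment name deepmd-venv is arbitrary):

python3 -m venv deepmd-venv
source deepmd-venv/bin/activate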

To install the TensorFlow backend with CUDA 12 support:

pip install deepmd-kit[gpu,cu12]

The cu12 extra is required only when the CUDA Toolkit and cuDNN are not already installed. To install the package built against CUDA 11, use

pip install deepmd-kit-cu11[gpu,cu11]

Or install the CPU version without CUDA support:

pip install deepmd-kit[cpu]

The LAMMPS module and the i-PI driver are only provided on Linux and macOS for the TensorFlow backend. To install LAMMPS and/or i-PI, add lmp and/or ipi to extras:

pip install deepmd-kit[gpu,cu12,lmp,ipi]

MPICH is required for parallel running.
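MPICH can usually be obtained from the system package manager; for example, on Debian/Ubuntu-based systems (an assumption about your platform; use the equivalent package elsewhere):

sudo apt-get install mpich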

To install the PyTorch backend:

pip install deepmd-kit[torch]

To use a PyTorch build for CUDA 11.8, install PyTorch from the cu118 index first and then install the CUDA 11 package:

pip install torch --index-url https://download.pytorch.org/whl/cu118
pip install deepmd-kit-cu11

For a CPU-only installation with the PyTorch backend, install the CPU build of PyTorch first:

pip install torch --index-url https://download.pytorch.org/whl/cpu
pip install deepmd-kit

To install the JAX backend with CUDA 12 support:

pip install deepmd-kit[jax] jax[cuda12]

Or install the CPU-only JAX version:

pip install deepmd-kit[jax]

To generate a SavedModel and use the LAMMPS module and the i-PI driver, you also need to install TensorFlow; see the TensorFlow instructions above for more information.

The supported platforms include Linux x86-64 and aarch64 with GNU C Library 2.28 or above, macOS x86-64 and arm64, and Windows x86-64.

Warning

If your platform is not supported, or you want to build against the installed backends, or you want to enable ROCM support, please build from source.