MLIP (Machine Learning Interatomic Potential) plugins for Gaussian 16 External interface.
Four model families are currently supported:
- UMA (fairchem), default model: `uma-s-1p1`
- ORB (orb-models), default model: `orb_v3_conservative_omol`
- MACE (mace), default model: `MACE-OMOL-0`
- AIMNet2 (aimnetcentral), default model: `aimnet2`
All backends provide energy, gradient, and analytical Hessian for Gaussian 16.
An optional implicit-solvent correction (xTB) is also available via --solvent.
The model server starts automatically and stays resident, so repeated calls during optimization are fast.
Requires Python 3.9 or later.
If you use ORCA, see also: https://github.com/t-0hmura/orca-mlips
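Under the hood, the exchange with Gaussian happens through the `External` file protocol (`$g16root/g16/doc/extern.txt`): Gaussian writes an input file with the geometry and a derivative-order flag, and the plugin writes back energy, dipole, and derivatives in Fortran `D20.12` fields. The read/write side can be sketched as below; `fake_energy_and_gradient` is a hypothetical stand-in for the resident MLIP server call, and the exact file layout should be checked against `extern.txt`:

```python
# Sketch of the Gaussian External file exchange (layout per
# $g16root/g16/doc/extern.txt); not this package's actual code.

def read_ext_input(path):
    """Parse the input file Gaussian writes for the external program."""
    with open(path) as f:
        natoms, igrd, charge, mult = map(int, f.readline().split())
        z, xyz = [], []
        for _ in range(natoms):
            tok = f.readline().split()
            z.append(int(tok[0]))                     # atomic number; 0 = MM point charge
            xyz.append([float(t) for t in tok[1:4]])  # coordinates in Bohr
    return natoms, igrd, charge, mult, z, xyz

def fake_energy_and_gradient(z, xyz):
    # Placeholder returning a dummy energy (Hartree) and a zero gradient.
    return -76.0, [[0.0, 0.0, 0.0] for _ in z]

def d20(x):
    """Fortran-style D20.12 field expected by Gaussian."""
    return f"{x:20.12E}".replace("E", "D")

def write_ext_output(path, energy, dipole, grad):
    """Write energy + dipole, then one gradient row per atom."""
    with open(path, "w") as f:
        f.write("".join(d20(v) for v in (energy, *dipole)) + "\n")
        for row in grad:
            f.write("".join(d20(v) for v in row) + "\n")
```

Gaussian invokes the external command with six arguments (`layer InputFile OutputFile MsgFile FChkFile MatElFile`); for `igrd=2` the output file additionally carries polarizability, dipole derivatives, and the lower-triangular Hessian.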
- Install PyTorch suitable for your CUDA environment:

  ```
  pip install torch==2.8.0 --index-url https://download.pytorch.org/whl/cu129
  ```

- Install the package with the UMA profile. If you need ORB/MACE/AIMNet2, use `g16-mlips[orb]` / `g16-mlips[mace]` / `g16-mlips[aimnet2]`:

  ```
  pip install "g16-mlips[uma]"
  ```

- Log in to Hugging Face for UMA model access (not required for ORB/MACE/AIMNet2). The UMA model is hosted on the Hugging Face Hub, so you need to log in once (see https://github.com/facebookresearch/fairchem):

  ```
  huggingface-cli login
  ```
- Use it in a Gaussian input file (`nomicro` is required). If you use ORB/MACE/AIMNet2, use `external="orb"` / `external="mace"` / `external="aimnet2"`. For detailed Gaussian `External` usage, see https://gaussian.com/external/:
```
%nprocshared=8
%mem=32GB
%chk=water_ext.chk
#p external="uma" opt(nomicro)

Water external UMA example

0 1
O   0.000000   0.000000   0.000000
H   0.758602   0.000000   0.504284
H  -0.758602   0.000000   0.504284
```
Other backends:

```
#p external="orb" opt(nomicro)
#p external="mace" opt(nomicro)
#p external="aimnet2" opt(nomicro)
```
Important: For Gaussian `External` geometry optimization, always include `nomicro` in `opt(...)`. Without it, Gaussian uses micro-iterations that rely on an internal gradient routine, which is incompatible with the external interface.

ONIOM is supported with both `ActiveAtoms` and `AllAtoms`. With `AllAtoms`, Gaussian passes `IAn=0` MM point-charge rows to the plugin. By default these `IAn=0` rows are excluded from MLIP evaluation and returned as zero force/Hessian blocks.
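The `IAn=0` handling amounts to a mask-and-pad step: run the MLIP on real atoms only, then place zero blocks back in the MM rows. A hypothetical illustration (the plugin's actual internals may differ; `mlip_gradient` is a stand-in callable):

```python
import numpy as np

def evaluate_with_mm_rows(z, xyz, mlip_gradient):
    """Run the MLIP on real atoms only; return zero blocks for IAn=0 rows."""
    z = np.asarray(z)
    xyz = np.asarray(xyz, dtype=float)
    qm = z != 0                              # IAn=0 rows are MM point charges
    energy, grad_qm = mlip_gradient(z[qm], xyz[qm])
    grad = np.zeros_like(xyz)                # MM rows stay as zero force blocks
    grad[qm] = grad_qm
    return energy, grad

# Dummy backend for illustration: constant energy, unit force per real atom.
dummy = lambda z, x: (-1.0, np.ones((len(z), 3)))
e, g = evaluate_with_mm_rows([8, 1, 1, 0, 0], np.zeros((5, 3)), dummy)
```

The returned gradient keeps the full row count Gaussian expects, so the two `IAn=0` rows come back as zeros.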
For ONIOM point-charge embedding correction via xTB:
```
#p oniom(external=("uma --embedcharge",AllAtoms):amber=softfirst) opt(nomicro)
```

`--embedcharge` adds an xTB point-charge embedding correction using the ONIOM MM charges. When `igrd=2`, it also returns the MM point-charge force/Hessian terms arising from the embedding correction.
Optimization and IRC can run without an initial Hessian; Gaussian builds one internally from estimated force constants. Providing an MLIP analytical Hessian via `freq` + `readfc` improves convergence, especially for TS searches. Gaussian `freq` (with `external=...`) is the only job type that requests the plugin's analytical Hessian directly.
Frequency calculation:

```
%nprocshared=8
%mem=32GB
%chk=cla_ext.chk
#p external="uma" freq

CLA freq UMA

0 1
...
```

Gaussian sends `igrd=2` and stores the result in the `.chk` file.
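What `freq` does with that Hessian can be illustrated by the standard mass-weight-and-diagonalize step. This is a simplified sketch, not this package's code: there is no translation/rotation projection, and the conversion constants are the usual CODATA values assumed here:

```python
import numpy as np

def harmonic_frequencies(hessian_au, masses_amu):
    """Wavenumbers (cm^-1) from a Cartesian Hessian in Hartree/Bohr^2.

    Negative mass-weighted eigenvalues come back as negative numbers,
    i.e. imaginary modes.  No translation/rotation projection is applied.
    """
    amu_to_me = 1822.888486                    # amu -> electron masses
    m = np.repeat(np.asarray(masses_amu, float) * amu_to_me, 3)
    mw = hessian_au / np.sqrt(np.outer(m, m))  # mass-weighted Hessian
    eigvals = np.linalg.eigvalsh(mw)
    omega = np.sign(eigvals) * np.sqrt(np.abs(eigvals))
    return omega * 219474.63                   # Hartree in cm^-1

# Toy check: one H atom in an isotropic well, k = 0.1 Hartree/Bohr^2.
freqs = harmonic_frequencies(np.eye(3) * 0.1, [1.008])
```

An imaginary mode from a TS Hessian would show up here as a single negative entry, which is what `freq` reports as an imaginary frequency.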
You can use an implicit-solvent correction via xTB. To use it, install xTB and pass the `--solvent` option to `external`. Install xTB in your conda environment (or build it from source):

```
conda install xtb
```

Use `--solvent <name>` in `external="..."` (examples: `water`, `thf`):

```
#p external="uma --solvent water" opt(nomicro)
#p external="uma --solvent thf" freq
```
For details, see SOLVENT_EFFECTS.md.
This implementation follows the solvent-correction approach described in: Zhang, C., Leforestier, B., Besnard, C., & Mazet, C. (2025). Pd-catalyzed regiodivergent arylation of cyclic allylboronates. Chemical Science, 16, 22656-22665. https://doi.org/10.1039/d5sc07577g
If citing this correction in a paper, you can use the following:
Implicit solvent effects were accounted for by integrating the ALPB [or CPCM-X] solvation model from the xtb package as an additional correction to UMA-generated energies, gradients, and Hessians.
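The additive scheme amounts to adding the xTB solvation shift to each MLIP quantity: E = E_MLIP + [E_xTB(solvent) − E_xTB(gas)], and likewise term-by-term for gradients (and Hessians). A minimal sketch, where the three callables are hypothetical stand-ins for the MLIP backend and two xTB single points:

```python
def solvent_corrected(mlip, xtb_gas, xtb_solv, z, xyz):
    """Additive correction: MLIP value plus the xTB solvation shift.
    The same shift applies element-wise to the Hessian (omitted here)."""
    e_ml, g_ml = mlip(z, xyz)
    e_gas, g_gas = xtb_gas(z, xyz)
    e_sol, g_sol = xtb_solv(z, xyz)
    shift = e_sol - e_gas                      # solvation energy shift
    grad = [[a + (c - b) for a, b, c in zip(gm, gg, gs)]
            for gm, gg, gs in zip(g_ml, g_gas, g_sol)]
    return e_ml + shift, grad

# Dummy single-point callables for illustration (energies in Hartree):
ml  = lambda z, x: (-76.0, [[0.0, 0.0, 0.0]])
gas = lambda z, x: (-5.0,  [[0.10, 0.0, 0.0]])
sol = lambda z, x: (-5.2,  [[0.05, 0.0, 0.0]])
e, g = solvent_corrected(ml, gas, sol, [8], [[0.0, 0.0, 0.0]])
```

Because the correction is a difference of two xTB runs, systematic xTB errors largely cancel and only the solvation response is added to the MLIP surface.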
Note: `--solvent-model cpcmx` (CPCM-X) requires xTB built from source with `-DWITH_CPCMX=ON`. The conda-forge `xtb` package does not include CPCM-X support. See SOLVENT_EFFECTS.md for build instructions.
To use the MLIP analytical Hessian in `opt`/`irc`, read the Hessian from an existing checkpoint using Gaussian's `%oldchk` + `readfc`:

```
%nprocshared=8
%mem=32GB
%chk=cla_ext.chk
%oldchk=cla_ext.chk
#p external="uma" opt(readfc,nomicro)

CLA opt UMA

0 1
...
```

`readfc` reads the force constants from `%oldchk`. This applies to both `opt` and `irc` runs.
Note that `freq` is the only job type that requests the analytical Hessian (`igrd=2`) from the plugin; `opt` and `irc` themselves never request it directly.
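For reference, an `igrd=2` Hessian is conventionally exchanged as its packed lower triangle (3N(3N+1)/2 values). Packing and unpacking can be sketched as follows; this is an illustration of the convention, not this package's code:

```python
import numpy as np

def pack_lower_triangle(h):
    """Flatten a symmetric (3N, 3N) matrix to its 3N(3N+1)/2 lower
    triangle, row by row, as used for force-constant exchange."""
    n = h.shape[0]
    return np.concatenate([h[i, : i + 1] for i in range(n)])

def unpack_lower_triangle(tri, n):
    """Rebuild the full symmetric matrix from the packed lower triangle."""
    h = np.zeros((n, n))
    k = 0
    for i in range(n):
        h[i, : i + 1] = tri[k : k + i + 1]
        k += i + 1
    return h + h.T - np.diag(np.diag(h))   # symmetrize without doubling diagonal
```

For a water molecule (N = 3, so a 9x9 Hessian), the packed form carries 45 values.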
```
pip install "g16-mlips[uma]"               # UMA (default)
pip install "g16-mlips[orb]"               # ORB
pip install "g16-mlips[mace]"              # MACE
pip install "g16-mlips[orb,mace]"          # ORB + MACE
pip install "g16-mlips[aimnet2]"           # AIMNet2
pip install "g16-mlips[orb,mace,aimnet2]"  # ORB + MACE + AIMNet2
pip install g16-mlips                      # core only
```

Note: UMA and MACE have a dependency conflict (`e3nn`). Use separate environments.
Local install:

```
git clone https://github.com/t-0hmura/g16-mlips.git
cd g16-mlips
pip install ".[uma]"
```

Model download notes:

- UMA: hosted on the Hugging Face Hub. Run `huggingface-cli login` once.
- ORB / MACE / AIMNet2: downloaded automatically on first use.
- UMA / FAIR-Chem: https://github.com/facebookresearch/fairchem
- ORB / orb-models: https://github.com/orbital-materials/orb-models
- MACE: https://github.com/ACEsuit/mace
- AIMNet2: https://github.com/isayevlab/aimnetcentral
See OPTIONS.md for backend-specific tuning parameters.
For solvent correction options, see SOLVENT_EFFECTS.md.
Command aliases:

- Short: `uma`, `orb`, `mace`, `aimnet2`
- Prefixed: `g16-mlips-uma`, `g16-mlips-orb`, `g16-mlips-mace`, `g16-mlips-aimnet2`

Troubleshooting:

- `external="uma"` runs the wrong plugin: use `external="g16-mlips-uma"` to avoid alias conflicts.
- `external="aimnet2"` runs the wrong plugin: use `external="g16-mlips-aimnet2"` to avoid alias conflicts.
- `uma` command not found: activate the conda environment where the package is installed.
- UMA model download fails (401/403): run `huggingface-cli login`. Some models require access approval on Hugging Face.
- Works interactively but fails in PBS jobs: use the absolute path from `which uma` in the Gaussian input.
If you use this package, please cite:
```
@software{ohmura2026g16mlips,
  author  = {Ohmura, Takuto},
  title   = {g16-mlips},
  year    = {2026},
  month   = {2},
  version = {1.1.0},
  url     = {https://github.com/t-0hmura/g16-mlips},
  license = {MIT},
  doi     = {10.5281/zenodo.18717988}
}
```

- Gaussian External interface (official): https://gaussian.com/external/
- Gaussian External documentation: `$g16root/g16/doc/extern.txt`, `$g16root/g16/doc/extgau`