MACE (Multi-Atomic Cluster Expansion) is a state-of-the-art equivariant interatomic potential that systematically captures many-body interactions (2-body through $n$-body) using symmetric contractions of equivariant features. It combines the theoretical rigor of the Atomic Cluster Expansion (ACE) framework with the flexibility of learned message passing, and as of 2023–2025 it achieves one of the best accuracy-to-cost ratios among neural network potentials.
What Is MACE?
- Definition: MACE (Batatia et al., 2022) builds atomic representations by constructing equivariant features from products of one-particle basis functions (spherical harmonics $\times$ radial functions), symmetrically contracted over neighboring atoms to form many-body correlation features. Each message passing layer computes: (1) one-particle messages using neighbor positions and features; (2) symmetric tensor products that capture 2-body, 3-body, ..., $\nu$-body correlations in a single operation; (3) equivariant linear mixing and nonlinear gating. The correlation order $\nu$ controls the expressiveness: higher $\nu$ captures more complex many-body angular correlations.
- Atomic Cluster Expansion (ACE) Connection: The theoretical foundation is ACE (Drautz, 2019), which proves that any smooth function of local atomic environments can be systematically expanded in terms of many-body correlation functions (cluster basis functions). MACE implements this expansion using learnable neural network components, providing a complete basis for representing interatomic interactions.
- Equivariant Features: MACE uses irreducible representations of O(3) — scalars ($l=0$), vectors ($l=1$), quadrupoles ($l=2$), octupoles ($l=3$) — to represent the angular character of atomic environments. Tensor products between features of different orders capture angular correlations: a product of two $l=1$ features produces $l=0$ (dot product), $l=1$ (cross product), and $l=2$ (quadrupole) components.
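The $l=1 \otimes l=1$ decomposition described above can be checked numerically: the outer product of two vectors splits exactly into a trace part (scalar, $l=0$), an antisymmetric part (equivalent to the cross product, $l=1$), and a symmetric traceless part ($l=2$). A minimal NumPy sketch, for illustration only (this is not MACE code):

```python
import numpy as np

# Two l=1 features (ordinary 3-vectors), e.g. from different channels.
u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 2.0])

outer = np.outer(u, v)  # full rank-2 tensor, 9 components

# l=0 part: the trace (proportional to the dot product u.v)
l0 = np.trace(outer) / 3.0 * np.eye(3)

# l=1 part: the antisymmetric piece (carries the cross product u x v)
l1 = 0.5 * (outer - outer.T)

# l=2 part: symmetric and traceless (quadrupole-like, 5 independent components)
l2 = 0.5 * (outer + outer.T) - np.trace(outer) / 3.0 * np.eye(3)

# The three irreducible parts reconstruct the full tensor exactly,
assert np.allclose(l0 + l1 + l2, outer)
# and the antisymmetric part encodes the cross product componentwise.
assert np.isclose(2 * l1[1, 2], np.cross(u, v)[0])
```

The same bookkeeping, generalized to arbitrary $l$ via Clebsch–Gordan coefficients, is what equivariant tensor-product layers implement.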
Why MACE Matters
- Accuracy Leadership: As of 2024, MACE achieves among the lowest errors on standard molecular force-field benchmarks (rMD17, 3BPA, AcAc), outperforming both message-passing models (NequIP, PaiNN, DimeNet++) and strictly local models (Allegro, linear ACE). The systematic many-body expansion provides a principled path to higher accuracy by increasing the correlation order.
- Foundation Model Potential: MACE-MP-0, trained on Materials Project data spanning roughly 150,000 inorganic materials, serves as a universal interatomic potential: it can simulate combinations of elements from across the periodic table without per-system training. This "foundation model" approach parallels the success of large language models: train once on diverse data, then apply to a wide range of chemistries.
- Systematic Improvability: Unlike generic GNN architectures, where the path to improved accuracy is unclear, MACE provides a systematic hierarchy: increasing the correlation order $\nu$, the maximum angular momentum $l_{\max}$, or the number of message passing layers provably increases the expressive power. Practitioners can explicitly trade computation for accuracy along this well-defined hierarchy.
- Efficiency: MACE achieves its accuracy with fewer parameters and lower computational cost than comparably accurate alternatives. The symmetric contraction operation is computationally efficient (optimized einsum operations on GPU), and a single MACE message passing layer captures many-body correlations that would require multiple layers in a standard equivariant GNN.
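The efficiency claim rests on the ACE "density trick": the atomic basis $A$ is a single sum over neighbors (linear cost), and $\nu$-fold products of $A$ then contain $(\nu+1)$-body correlations without any explicit loop over neighbor tuples. A schematic NumPy sketch under simplifying assumptions (a hypothetical Gaussian radial basis and no angular part; real MACE uses learnable radial networks and spherical harmonics):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy local environment: distances from a central atom to 6 neighbors.
r = rng.uniform(1.0, 3.0, size=6)

# One-particle basis evaluated per neighbor (illustrative Gaussian basis).
centers = np.array([1.0, 1.5, 2.0, 2.5])
phi = np.exp(-((r[:, None] - centers[None, :]) ** 2))  # shape (6 neighbors, 4)

# A-basis: one sum over neighbors -- cost is linear in neighbor count.
A = phi.sum(axis=0)                                    # shape (4,)

# nu-fold product basis: tensor powers of A.
# nu=1 -> 2-body, nu=2 -> 3-body, nu=3 -> 4-body correlations.
B2 = A                                  # 2-body features
B3 = np.einsum("i,j->ij", A, A)         # 3-body features
B4 = np.einsum("i,j,k->ijk", A, A, A)   # 4-body features
```

In MACE these products are symmetrically contracted with learned weights (optimized einsum operations on GPU), which is why one layer can carry correlations that a standard message-passing GNN would need several layers to build up.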
MACE vs. Other Neural Potentials
| Model | Body Order | Equivariance | Key Strength |
|-------|-----------|-------------|-------------|
| SchNet | 2-body (distances only) | Invariant | Simplicity, speed |
| DimeNet | 3-body (distances + angles) | Invariant | Angular resolution |
| PaiNN | 2-body + $l=1$ vectors | $l \leq 1$ equivariant | Efficiency, forces |
| NequIP | Many-body via MP layers | Full equivariant | Accuracy on small systems |
| MACE | Explicit $\nu$-body correlations | Full equivariant | Best accuracy/cost ratio |
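The "Invariant" vs. "equivariant" distinction in the table is easy to verify numerically: an invariant output (a dot product, $l=0$) is unchanged when the inputs are rotated, while an equivariant output (a cross product, $l=1$) rotates along with them. A small sketch using SciPy rotations:

```python
import numpy as np
from scipy.spatial.transform import Rotation

u = np.array([1.0, -2.0, 0.5])
v = np.array([0.3, 1.2, -2.1])

# An arbitrary proper rotation matrix (det(R) = +1).
R = Rotation.from_euler("zyx", [0.3, -1.1, 0.7]).as_matrix()

# Invariant (l=0) output: a dot product is unchanged by rotation.
assert np.isclose(np.dot(R @ u, R @ v), np.dot(u, v))

# Equivariant (l=1) output: a cross product rotates with its inputs
# (true for proper rotations; under reflections it picks up a sign).
assert np.allclose(np.cross(R @ u, R @ v), R @ np.cross(u, v))
```

Fully equivariant models such as NequIP and MACE maintain this transformation behavior for every internal feature up to $l_{\max}$, not just for the final outputs.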
MACE is the systematic molecular force engine of this family: by combining equivariant message passing with cluster expansion mathematics, it captures the relevant many-body interactions in atomic systems through a theoretically complete expansion, defining the current state of the art for neural network interatomic potentials.