- [Breaking] training restart behavior altered: file-wise consistency checks are now performed between the original config and the config passed to `nequip-train` on restart (instead of comparing the config dicts); see the sketch after this list.
- [Breaking] config format for callbacks changed (see `configs/full.yaml` for an example)
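To make the distinction concrete, here is a minimal sketch of a file-wise check versus the previous dict-wise comparison. The function names and YAML handling are illustrative assumptions, not nequip's actual implementation:

```python
# Hypothetical sketch: a file-wise check compares the config files themselves,
# whereas a dict-wise check compares the parsed YAML contents and therefore
# ignores formatting, key order, and comments.
from pathlib import Path

import yaml


def configs_match_filewise(original: Path, restart: Path) -> bool:
    """True only if the two config files are byte-for-byte identical."""
    return original.read_bytes() == restart.read_bytes()


def configs_match_dictwise(original: Path, restart: Path) -> bool:
    """True if the parsed YAML dictionaries are equal."""
    with open(original) as f1, open(restart) as f2:
        return yaml.safe_load(f1) == yaml.safe_load(f2)
```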
README.md (3 additions, 1 deletion)
@@ -141,7 +141,9 @@ Details on writing and using plugins can be found in the [Allegro tutorial](http
 ## References & citing

-The theory behind NequIP is described in our preprint (1). NequIP's backend builds on e3nn, a general framework for building E(3)-equivariant neural networks (2). If you use this repository in your work, please consider citing NequIP (1) and e3nn (3):
+The theory behind NequIP is described in our [article](https://www.nature.com/articles/s41467-022-29939-5) (1).
+NequIP's backend builds on [`e3nn`](https://e3nn.org), a general framework for building E(3)-equivariant
+neural networks (2). If you use this repository in your work, please consider citing `NequIP` (1) and `e3nn` (3):
 If you use ``NequIP`` in your research, please cite our `article <https://doi.org/10.1038/s41467-022-29939-5>`_:
+
+.. code-block:: bibtex
+
+  @article{batzner_e3-equivariant_2022,
+    title = {E(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate Interatomic Potentials},
+    author = {Batzner, Simon and Musaelian, Albert and Sun, Lixin and Geiger, Mario and Mailoa, Jonathan P. and Kornbluth, Mordechai and Molinari, Nicola and Smidt, Tess E. and Kozinsky, Boris},
+    year = {2022},
+    month = may,
+    journal = {Nature Communications},
+    volume = {13},
+    number = {1},
+    pages = {2453},
+    issn = {2041-1723},
+    doi = {10.1038/s41467-022-29939-5},
+  }
+
+The theory behind NequIP is described in our `article <https://doi.org/10.1038/s41467-022-29939-5>`_ above.
+NequIP's backend builds on `e3nn <https://e3nn.org>`_, a general framework for building E(3)-equivariant
+neural networks (1). If you use this repository in your work, please consider citing ``NequIP`` and ``e3nn`` (2):
f"!! PyTorch version {torch_version} found. Upstream issues in PyTorch versions 1.13.* and 2.* have been seen to cause unusual performance degradations on some CUDA systems that become worse over time; see https://github.com/mir-group/nequip/discussions/311. The best tested PyTorch version to use with CUDA devices is 1.11; while using other versions if you observe this problem, an unexpected lack of this problem, or other strange behavior, please post in the linked GitHub discussion."
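For context, a warning like this can be emitted from a simple comparison against the installed `torch` version at startup; the following is an illustrative sketch using `packaging` for the version comparison (the threshold and warning mechanism are assumptions, not nequip's actual code):

```python
# Illustrative sketch: warn when the installed PyTorch falls in the version
# ranges (1.13.*, 2.*) mentioned in the message above.
import warnings

import torch
from packaging import version

torch_version = torch.__version__
if version.parse(torch_version) >= version.parse("1.13"):
    warnings.warn(
        f"!! PyTorch version {torch_version} found. Upstream issues in PyTorch "
        "versions 1.13.* and 2.* have been seen to cause unusual performance "
        "degradations on some CUDA systems; see "
        "https://github.com/mir-group/nequip/discussions/311."
    )
```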
     help="Warn instead of error when the config contains unused keys",
     action="store_true",
 )
+parser.add_argument(
+    "--override",
+    help="Override top-level configuration keys from the `--train-dir`/`--model`'s config YAML file. This should be a valid YAML string. Unless you know why you need to, do not use this option.",
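The `--override` value is a plain YAML string; a hedged sketch of how such a value could be parsed and applied on top of a loaded config (illustrative only, not the actual nequip implementation) might look like:

```python
# Illustrative sketch: parse an `--override` YAML string and merge it into a
# previously loaded config dict (top-level keys only, per the help text).
import argparse

import yaml

parser = argparse.ArgumentParser()
parser.add_argument("--override", default=None)
args = parser.parse_args(["--override", "device: cpu"])

config = {"device": "cuda", "r_max": 4.0}  # stand-in for the config loaded from --train-dir
if args.override is not None:
    overrides = yaml.safe_load(args.override)
    assert isinstance(overrides, dict), "--override must be a YAML mapping"
    config.update(overrides)

print(config)  # {'device': 'cpu', 'r_max': 4.0}
```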